Sahwa@reddthat.com to Fuck AI@lemmy.world · 2 months ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine
www.tomshardware.com
UndergroundGoblin@lemmy.dbzer0.com · 2 months ago
From an ecological standpoint, though, it's also a total disaster.