return2ozma@lemmy.world to Technology@lemmy.world (English) · 24 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
cross-posted to: [email protected]
NoiseColor@lemmy.world · 24 days ago
I'd still want to double check 😀.