return2ozma@lemmy.world to Technology@lemmy.world · English · 1 month ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
35 comments · cross-posted to: [email protected]
tidderuuf@lemmy.world · English · 1 month ago
Damn, that makes a lot of sense. Thx!