ugjka@lemmy.world to Technology@lemmy.world · English · 7 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
297 comments
AdmiralRob@lemmy.zip · English · 7 months ago
Technically, it didn't print part of the instructions; it printed all of them.