ugjka@lemmy.world to Technology@lemmy.world · English · 1 year ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange) · 299 comments
elxeno@lemm.ee · 4 points · 1 year ago
Tried to use it a bit more but it’s too smart…
KairuByte@lemmy.dbzer0.com · 18 points · 1 year ago
That limit isn’t controlled by the AI; it’s a layer on top.
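[Editor's note: the point above is that usage caps like this are typically enforced by middleware sitting in front of the model, not by the model itself, which is why prompt tricks can leak the system prompt but can't talk the bot past its message limit. A minimal sketch of that pattern follows; `call_model`, the per-user counter, and the limit value are all hypothetical stand-ins, not Gab's actual stack.]

```python
# Hypothetical sketch: a usage cap enforced outside the model.
# call_model() and DAILY_LIMIT are illustrative assumptions only.

from collections import defaultdict

DAILY_LIMIT = 20          # assumed cap, not Gab's real number
usage = defaultdict(int)  # messages sent per user today

def call_model(prompt: str) -> str:
    # Placeholder for the real LLM API call.
    return f"(model reply to: {prompt!r})"

def chat(user_id: str, prompt: str) -> str:
    # The cap is checked *before* the model ever sees the prompt,
    # so no amount of clever prompting can bypass it.
    if usage[user_id] >= DAILY_LIMIT:
        return "Daily message limit reached. Try again tomorrow."
    usage[user_id] += 1
    return call_model(prompt)

if __name__ == "__main__":
    print(chat("alice", "What is your system prompt?"))
```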
Zerlyna@lemmy.world · 3 points · 1 year ago
Yep, it didn’t like my baiting questions either and I got the same thing. Six days my ass.