Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
Posted by ugjka@lemmy.world to Technology@lemmy.world · English · 7 months ago · 300 comments · 1.02K up / 16 down
elxeno@lemm.ee · English · 4 points · 7 months ago
Tried to use it a bit more but it’s too smart…
KairuByte@lemmy.dbzer0.com · English · 18 points · 7 months ago
That limit isn’t controlled by the AI; it’s a layer on top.
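The point about the limit living outside the model can be sketched as middleware: the guard answers before the model is ever consulted, so the refusal text is canned rather than generated. This is a hypothetical illustration (`UsageLimiter`, `guarded_chat`, and `fake_model_reply` are invented names, not Gab’s actual code):

```python
import time

class UsageLimiter:
    """Sliding-window request limit, enforced entirely outside the model."""

    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window = window_seconds
        self.timestamps: list[float] = []

    def allow(self) -> bool:
        now = time.monotonic()
        # Drop requests that have fallen out of the sliding window.
        self.timestamps = [t for t in self.timestamps if now - t < self.window]
        if len(self.timestamps) >= self.max_requests:
            return False
        self.timestamps.append(now)
        return True


def fake_model_reply(prompt: str) -> str:
    # Stand-in for the real model call.
    return f"(model reply to: {prompt})"


def guarded_chat(limiter: UsageLimiter, prompt: str) -> str:
    # The limiter decides first; the model never sees the prompt
    # once the quota is exhausted.
    if not limiter.allow():
        return "You have reached your usage limit."
    return fake_model_reply(prompt)
```

No amount of clever prompting changes the limiter’s answer, which is why “baiting” the model past the limit doesn’t work: the text being argued with isn’t coming from the model at all.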
Zerlyna@lemmy.world · English · 3 points · 7 months ago
Yep, it didn’t like my baiting questions either and I got the same thing. Six days my ass.