• FauxLiving@lemmy.world
    3 days ago

    You can run local models that will do this without being gaslit.

    Manipulating chatbots into bypassing their refusal conditioning is pretty simple; you can find copy-paste blocks of text that will work on most public models.

    You’re likely to get your account banned, though, as there are other, non-LLM systems scanning your chat logs for banned terms specifically to catch these kinds of jailbreaks.