Importantly, this took deepfake undressing from a tiny niche to a mass phenomenon: it is no longer a rare or exceptional practice, and harassment of women with this method is now pervasive.

  • Riskable@programming.dev · 28 days ago

    The real problem here is that Xitter isn’t supposed to be a porn site (even though it’s hosted loads of porn since before Musk bought it). They basically deeply integrated a porn generator into their very publicly-accessible “short text posts” website. Anyone can ask it to generate porn inside of any post and it’ll happily do so.

    It’s like showing up at Walmart and seeing everyone naked (and many fucking), all over the store. That’s not why you’re there (though: Why TF are you still using that shithole of a site‽).

    The solution is simple: Everyone everywhere needs to classify Xitter as a porn site. It’ll get blocked by businesses and schools and the world will be a better place.

      • Riskable@programming.dev · 27 days ago

        Well, the CSAM stuff is unforgivable, but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I’m sure they’re working on it (it’s actually a hard computer-science problem: the tool is supposed to generate whatever the user asks for, and there will always be an infinite number of ways to trick it, since LLMs aren’t actually intelligent).

        Porn itself is not illegal.

        • a_non_monotonic_function@lemmy.world · 27 days ago

          He has 100% control over the ability to alter or pull this product. If he’s leaving it up while it’s generating illegal pornography, that is on him.

          And no s***, I’m concerned about the illegal stuff.

    • mjr@infosec.pub · 28 days ago

      (though: Why TF are you still using that shithole of a site‽).

      Maybe some places don’t have any supplier other than Walmart? Similarly, some places have governments that still use the porno social network as their only channel for some services.

  • No1@aussie.zone · 28 days ago

    I misread the title and thought it meant thousands of undressed images of Musk per hour.

    The horror!

  • PierceTheBubble@lemmy.ml · 28 days ago

    Politicians are already hellbent on “age”-verifying social media, but Elon seems to believe there’s no urgency in this regard… Please regulate social media harder, daddy! Please, we’ve had the resources to comply with these perverse regulations for a while now. I didn’t hijack this platform just for the lefties to be able to speak their minds on alternative platforms…

  • DreamMachine@lemmy.world · 28 days ago

    Scrolled through @grok “undress” and “bikini” requests for a bit; most of it is girls jumping on the trend asking it to change their own photos, plus humor.