Teen boys use AI to make fake nudes of classmates, sparking police probe
Parents told the high school "believed" the deepfake nudes were deleted.

  • Mac@mander.xyz · 11 months ago

    There is absolutely no way anyone could have possibly seen this coming.

      • surewhynotlem@lemmy.world · 11 months ago

        You don't. Scissors, Polaroids, and Playboy have been around for decades. If you wanted to see your classmate's face on a nude and photocopy it, you could.

        Now it's just easier and more believable. But it's not any more stoppable.

            • idunnololz@lemmy.world · 11 months ago

              There might be a misunderstanding. I take the original post to be saying that it was obvious problems like this would occur with the introduction of AI-generated images, but it also implies there's an easy or obvious solution. There isn't one, so what's the point of pointing it out?

              • BlueÆther@no.lastname.nz · 11 months ago

                I don't read it as saying that there may be a simple solution. And I don't know how to attack the problem, other than maybe the possible threat of prosecution for distributing material that could be classed as CSAM.

                • idunnololz@lemmy.world · 11 months ago

                  Maybe an analogy would help clear this whole thread up. Let's say you wake up tomorrow and see headlines about scientists discovering a meteor that will hit Earth in the next 48 hours. A couple of days later, you read that a meteor hit Earth, causing X deaths and Y billions of dollars in damages. Then you go to the comment section and read, "There is absolutely no way anyone could have possibly seen this coming." So you're thinking to yourself: does this comment seem a bit weird, or am I just dumb for missing something? So you ask, "could this have been prevented somehow?" (subtext: you don't really see anything obvious), but then you get confirmation that it could not have been prevented, so now you're just like, "wait, then wtf was the original comment saying?"

                  And that is how I feel right now lmao.

      • saltesc@lemmy.world · 11 months ago

        Go back to using natural intelligence and try render with brain. Images can't be shared.

        • Melt@lemm.ee · 11 months ago

          Brain sends data to hand and hand renders it with pen and paper, what now?

        • Kusimulkku@lemm.ee · 11 months ago

          This reminds me of the funny picture of a black person being angry that white people can think of slurs and there's nothing that can be done about it.

  • LogicalDrivel@sopuli.xyz · 11 months ago

    "…and their principal, Asfendis, has vowed to raise awareness on campus of how to use new technologies responsibly."

    Surely all the teenage boys will understand, and only use the technology for wholesome purposes.

    • averagedrunk@lemmy.ml · 11 months ago

      D.A.R.E. raised my awareness of drugs. I only used them for wholesome purposes.

    • Otter@lemmy.ca · 11 months ago

      They might know it's bad but not fully understand the potential harms. I made another comment on it.

  • Otter@lemmy.ca · 11 months ago

    The other comment about how this has been happening for a long time (with low tech methods) is true, and it's also true that we can't stop this completely. We can still respond to it:

    An immediate and easy focus would be on what they do with the images. Sharing them around is still harassment / bullying and it should be dealt with in the same way as it currently is.

    There's also an education aspect to it. In the past, those images (magazines, photocopies, Photoshop) were limited in who saw them. The kids now are likely using free online tools that aren't private or secure, and those images could stick around forever. So it could be good to highlight that:

    • Your friends and classmates may see them, and it may harm their lives. The images will likely stick around. Facial recognition algorithms are also improving, so it's a legitimate concern that an image stored on a random site somewhere will be tied back to them.
    • The images can be traced back to the creator, and the creator can face repercussions for it (for those without empathy, this might be the better selling point).
    • Adalast@lemmy.world · 11 months ago

      To your first point, much to the benefit of humanity, and counter to popular belief, the internet is NOT forever. Between link rot, data purges, corporate buyouts, transmission compression losses, and general human stupidity, large swaths of the internet have vanished. Hell, just Macromedia selling out to Adobe ended up causing the loss of most of the popular internet games and videos for anyone in their mid-to-late 30s at this point (you will be missed, Flash). The odds of these specific AI-generated child porn pictures surviving even in some dark corner of the bright web are slim to none. And if they end up surviving on the dark web, well, anyone who sees them will likely have a LOT of explaining to do.

      As for the comments about the websites keeping the images: that is doubtful, beyond holding them in an account-bound locker for the user to retrieve. The sites don't care, and too many images get generated every day for them to treat them as more than reinforcement training.

      Speaking of reinforcement training, they may have been able to use Photoshop's new generative fill to do this, but to actually generate fresh images of a specific peer they would have had to train a LoRA or Hypernetwork on photos of the girl so that SD could actually resolve it. They weren't doing that on an AI site, especially not a free one. They were probably using ComfyUI or Automatic1111 (I use both myself). Those are free, open-source, locally executed programs that let you use the aforementioned tools when generating. That means the images were restricted to their local machine, then transferred to a cell phone and distributed to friends.

      https://www.theatlantic.com/technology/archive/2021/06/the-internet-is-a-collective-hallucination/619320/

    • Emma_Gold_Man@lemmy.dbzer0.com · 11 months ago

      Your point 1 seems to forget something important: kids are often cruel, and bullying is frequently the point. So long-term consequences for their classmates can be an incentive more than a deterrent.

    • cy_narrator@discuss.tchncs.de · 11 months ago

      I think we should pressure the EU to require that any online AI photo-generation website also use AI to make sure what was asked for is not illegal.

  • DasherPack@lemmy.world · 11 months ago

    In Spain it happened recently with some 12-year-olds… it created a country-wide debate and, as always, did not lead to any regulation. Hopefully the EU will do something.

  • AutoTL;DR@lemmings.world · 11 months ago

    This is the best summary I could come up with:


    This October, boys at Westfield High School in New Jersey started acting "weird," the Wall Street Journal reported.

    It took four days before the school found out that the boys had been using AI image generators to create and share fake nude photos of female classmates.

    Biden asked the secretary of Commerce, the secretary of Homeland Security, and the heads of other appropriate agencies to provide recommendations regarding "testing and safeguards against" producing "child sexual abuse material" and "non-consensual intimate imagery of real individuals (including intimate digital depictions of the body or body parts of an identifiable individual), for generative AI."

    "New York State currently lacks the adequate criminal statutes to protect victims of ‘deepfake’ pornography, both adults and children," Donnelly said.

    Until laws are strengthened, Bramnick has asked the Union County prosecutor to find out what happened at Westfield High School, and state police are still investigating.

    Until the matter is settled in the New Jersey town, the girls plan to keep advocating for victims, and their principal, Asfendis, has vowed to raise awareness on campus of how to use new technologies responsibly.


    The original article contains 950 words, the summary contains 184 words. Saved 81%. I'm a bot and I'm open source!

  • 6daemonbag@lemmy.dbzer0.com · 11 months ago

    My niece had this same issue a few years ago, but with Photoshop. It absolutely ruined her. She changed schools multiple times (public and private), but social media exists, so all the kids knew. She ended up getting homeschooled for the last 5 years of school, along with a fuckload of therapy. She came out the other side okay, but she has massive trust issues and anxiety.

  • r3df0x ✡️✝☪️@7.62x54r.ru · 11 months ago

    If you’re making porn of real underage people, I have no problem with you being put on the pedo registry.

    If no serious harm was done, I’m fine with convicting them and then doing full expungement after 5-10 years.

    • Jolteon@lemmy.zip · 11 months ago

      I’d argue that someone making porn of someone their own age is not pedophilia.

    • wildginger@lemmy.myserv.one · 11 months ago

      And you're proof that the pedo registry shouldn't exist as it is.

      Teenagers being sexually interested in their peers is not pedophilia, and you want to guarantee ruining a decade of their lives, with the "promise" of an expungement that would never actually happen, thanks to the permanent nature of the internet.

      This misuse of AI is a crime and should be punished and deterred, obviously. But labeling children about to enter the world as pedophiles basically for the rest of their lives?

      You're kind of a monster.

        • wildginger@lemmy.myserv.one · 11 months ago

          They are children. Being horny about classmates.

          Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.

          • r3df0x ✡️✝☪️@7.62x54r.ru · 11 months ago

            Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

      • r3df0x ✡️✝☪️@7.62x54r.ru · 11 months ago

        What about the fact that the girls who are victims of something like this will have to contend with the pictures being online if someone posts them there? What if people who don't know that the pictures depict minors re-post them to other sites, making them very difficult to remove? That can cause very serious employability problems. It doesn't matter how open-minded people are; they don't want porn coming up if someone googles one of their employees.