Teen boys use AI to make fake nudes of classmates, sparking police probe
Parents told the high school "believed" the deepfake nudes were deleted.

  • wildginger@lemmy.myserv.one · 11 months ago

    And you're proof that the pedo registry shouldn't exist as is.

    Teenagers being sexually interested in their peers is not pedophilia, and you want to guarantee ruining a decade of their lives, with the "promise" of an expungement that would never actually happen, thanks to the permanent nature of the internet.

    This misuse of AI is a crime and should be punished and deterred, obviously. But labeling children who are about to enter the adult world as pedophiles for basically the rest of their lives?

    You're kind of a monster.

      • wildginger@lemmy.myserv.one · 11 months ago

        They are children. Being horny about classmates.

        Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.

        • r3df0x ✡️✝☪️@7.62x54r.ru · 11 months ago

          Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

    • r3df0x ✡️✝☪️@7.62x54r.ru · 11 months ago

      What about the fact that the girls who are victims of something like this will have to contend with the pictures being online if someone posts them there? What if people who don't know that the pictures depict minors re-post them to other sites, making them very difficult to remove? That can cause very serious employability problems. It doesn't matter how open-minded people are; they don't want porn coming up when someone googles one of their employees.