• otp@sh.itjust.works

    The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.

    If we allow people to use this tech on adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors. It isn’t even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it’s still something that very obviously shouldn’t be happening.

    * We don’t need to get into semantics. I’m just saying it’s not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.

    Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.

    The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them “legitimately” would want to use it… or to just ban them outright.

    • micka190@lemmy.world

      such as when the person making them is also a minor

      I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.

      • BrianTheeBiscuiteer@lemmy.world

        And that’s still a bit messed up. It’s a felony for a teen to have nude pictures of themselves; they’ll be registered sex offenders for life and probably ineligible for most professions. Seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a “friend” to pedophiles.

          • micka190@lemmy.world

            The issue is that the picture then exists, and it’s hard to prove it was actually destroyed.

            For example, when I was in high school, a bunch of girls would send nudes to guys. But that was 10 years ago. Those pictures still exist. Those dudes aren’t minors anymore. Their Messenger chats probably still exist somewhere. Nothing’s really preventing them from looking at those pictures again.

            I get why it’s illegal. And, honestly, I find it kind of weird that there are people trying to justify why it shouldn’t be illegal. You’re still allowed to have sex at that age. Just don’t take pictures/videos of it.

            • BrianTheeBiscuiteer@lemmy.world

              That makes complete sense, except that stuff just does not register with teens. If a couple of months in juvenile hall and 100 hours of community service aren’t enough of a deterrent for a teenager, then 5 years in jail and the lifelong label of “sex offender” won’t deter them either. I recall seeing a picture of a classmate topless (under 18), and over 20 years later it finally dawned on me that it was child pornography.

              If we prosecuted every offender to the full extent of the law, something like half of every high school class would be in jail. That’s not to say something should be legal as long as enough people are breaking the law, but if millions of kids are violating some of the strictest laws in the country, we’re probably not getting the full picture.

    • vzq@lemmy.blahaj.zone

      That’s a lot of words to defend fake child porn made out of photos and videos of actual children.

      • NOT_RICK@lemmy.world

        Reading comprehension not a strong suit? Sounds to me like they’re arguing for protections for both adults AND minors.

      • Zorque@kbin.social

        That’s about the right amount of words to completely ignore the sentiment of a statement so you can make a vapid, holier-than-thou reply based on purported moral superiority.

  • themeatbridge@lemmy.world

    No reason not to ban them entirely.

    The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.

    • 520@kbin.social

      Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files?

      The problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.

      • assassin_aragorn@lemmy.world

        Perhaps an unpopular opinion, but I’d be fine with that. I have yet to see a benefit or possible benefit that outweighs the costs.

  • NutWrench@lemmy.world

    “We’re gonna ban Internet stuff” is something said by people who have no idea how the Internet works.

  • AutoTL;DR@lemmings.world (bot)

    This is the best summary I could come up with:


    Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

    Since early last year, at least two dozen states have introduced bills to combat A.I.-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization.

    The spread of nudification apps is enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls’ mental health, reputations and physical safety.

    A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who had created nude A.I. images.

    Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.

    After learning of the incident at Issaquah High from his daughter, Senator Mullet reached out to Representative Orwall, an advocate for sexual assault survivors and a former social worker.


    The original article contains 1,288 words, the summary contains 198 words. Saved 85%. I’m a bot and I’m open source!

  • Daft_ish@lemmy.world

    This is probably not the best context, but I find it crazy how fast the government will get involved when it comes to lude content, yet children are getting murdered in school shootings and gun control is just a bridge too far.

    • pro_grammer@programming.dev (OP)

      I think they act faster on those matters because, aside from it being a very serious problem, it also fits their conservative agenda.

      It’s very easy to say: “LOOK, WE ARE DOING THIS TO PROTECT YOUR CHILDREN FROM PEDOPHILES!!!”

      But they can’t just go and say “let’s enforce gun safety in schools”, because a conservative voter reading “gun safety” already plays badly for them.

      They know they are sacrificing the well-being of children by not acting on school shootings, but for them it’s just the price of a few lives to stay in power.

    • Psythik@lemmy.world

      Are quaaludes even still available in 2024?

      Or did you mean to say “lewd”?

  • WhyDoYouPersist@lemmy.world

    For some reason I thought it was mainly to protect Taylor Swift, with teen girls being the afterthought.