Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school student who was a victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.

  • Blaidd@lemm.ee · 7 months ago

    Creating fake child porn of real people using tools like Photoshop is already illegal in the US, so I don’t see why new laws are required.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

    FOSTA is still in effect and still causing harm to sex workers while actually shielding human traffickers from investigation (leaving victims stuck as captive labor / sex slaves for longer). And it’s still regarded by our federal legislators as a win, since they don’t know any better and can still spin it as one.

    I don’t believe our legislators can actually write a bill that won’t be used by the federal Department of Justice merely to funnel kids into prison for the sake of filling cells with warm bodies.

    We’ve already seen the DoJ’s unnuanced approach to teen sexting, which convicts teens engaging in normal romantic intercourse as professional producers of CSAM.

    It’s just more fuel for the US prison-industrial complex. It is going to heavily affect impoverished kids caught in the crossfire, while kids in richer families will get the Brock Turner treatment.

    This bill is wholly for political points and has nothing to do with serving the public or addressing disruption due to new technology.

    Until we reform or even abolish the law enforcement state, anything we criminalize will be repurposed to target the poor and minorities and lock them up in unconscionable conditions.

  • Meowoem@sh.itjust.works · 7 months ago

    “Think of the children” being used to push an agenda that helps the very wealthy? Well I’ll be, what a totally new and not at all predictable move.

    Ban all AI that isn’t owned by rich people, make open source impossible, restrict everything that might allow regular people to compete with the corporations - only then will your children be safe!

    • TwilightVulpine@lemmy.world · 7 months ago

      I’m as suspicious of “think of the children” stuff as anyone here, but I don’t see how we are fighting for the rights of the people by defending non-consensual deepfake porn impersonation, of children or anyone.

      If someone makes deepfake porn of my little cousin or Emma Watson, there’s no scenario where this isn’t a shitty thing to do to a person, and I don’t see how the masses are being oppressed by this being banned. What, do we need to deepfake Joe Biden getting it on to protest against the government?

      Not only does the harassment of being subjected to something like this seem horrible, it’s reasonable to say that people ought to have rights over their own likeness, no? It’s not even a matter of journalistic interest, because it’s something completely made up.

  • trackcharlie@lemmynsfw.com · 7 months ago

    There are already laws against creating false content about people, so adding more laws isn’t going to make the previous laws any more or less valid; it’s only going to waste time and money.

    Of course it’s being pushed by a “teen”, since this teen clearly doesn’t have any understanding of the issues at hand, the technology involved, or the laws that already exist to help with this issue.

    It was up to the adults around this teen to help her navigate the issue; instead, the incompetent pieces of worthless shit chose to push a new bill against AI rather than use the current legal framework that exists to actually help this girl.

    Anything to abuse a child or teen’s situation for their political gain. Worthless trash.

    • LWD@lemm.ee · 7 months ago

      Why did you put “teen” in scare quotes?

      It was up to the adults around this teen to help her navigate the issue

      Why do you assume they did not? What do you think they should do instead?

      Anything to abuse a child or teen’s situation for their political gain.

      Don’t you think it’s much stranger when people dismiss the abuse of children? For example, in this case: what do you think should have been done?

      • trackcharlie@lemmynsfw.com · 7 months ago

        1. Pretending abortion is in the same realm as AI tool abuse is ridiculous and completely dishonest.

        2. Scare quotes? Are you for real? Since you don’t understand: quotes denote emphasis or specificity, not emotion. “Teen”, the emphasis here, was used because teens are barely educated in legal matters and it’s their responsibility to seek assistance, not push legislation.

        3. If the adults in that area had actually responded properly, there wouldn’t be an article about a new bill against AI; instead there’d be an article about defamation and debasement of a minor (likely by another minor, but that doesn’t mean the other minor’s parents are infallible in this situation: you are either parents or you’re not; if you are, you’re responsible for your brat’s actions; if you’re not, then the state will take the child and destroy them mentally, and likely physically, for your failures as a parent).

        4. Not a single thing I said dismissed this, and I would go even further and say anyone pursuing the AI angle is the one dismissing it, especially given the laws that already exist that can be used to assist and protect the teen in question.

        5. We’ve had Adobe Premiere for more than two decades before generative AI, and all of the issues surrounding deepfakes then cover every issue with generative AI now. Wasting people’s time and money on this does not help improve the situation; instead it makes a mockery of the victim for the purposes of pushing an agenda.

        • LWD@lemm.ee · 7 months ago

          the state will take the child and destroy them mentally, and likely physically

          What the heck are you talking about here?

          Of course it’s being pushed by a “teen” since this teen clearly doesn’t have any understanding of the issues at hand…

          …you’re responsible for your brats actions…

          it makes a mockery of the victim

          I’m glad you care about the dignity of the victim.

    • Laticauda@lemmy.ca · 7 months ago

      It’s being pushed by someone who was a victim of deepfake AI porn, so I think they understand the issues at hand just fine. You don’t have to agree with her, but don’t be a patronizing asshole about it.

      • LWD@lemm.ee · 7 months ago

          Saying “this teen clearly doesn’t have an understanding of the issues at hand” is.

          trackcharlie@lemmynsfw.com would call you “illiterate or a dishonest asshole” for missing that part of one of his walls of text, but honestly I think it’s understandable.

        • GBU_28@lemm.ee · 7 months ago

            This is an actual law proceeding, with lawyers and adults involved. The teen is just the face of it.

      • trackcharlie@lemmynsfw.com · 7 months ago

        Thanks for not actually reading my comment and making it clear to everyone who did that you’re either illiterate or a dishonest asshole.

  • Monkey With A Shell@lemmy.socdojo.com · 7 months ago

    Just wait until them tech savvy folks in Congress try to understand the difference between ‘deepfakes’ in the sense of pasting a new face on existing footage and whole cloth generative AI creating the entire scene, and then someone tells them that the latter is derived from multiple existing media sources. Gonna be some smoke pouring out of their ears like in the cartoons trying to slice up all the specifics.

  • Infiltrated_ad8271@kbin.social · 7 months ago

    If (as it seems) the point is not impersonation but damage to the person’s honor/image, where exactly is the line?

    If realism is the determining factor, what about a hyperrealistic human-made work? And if how realistic it must be is left to human interpretation, could a sketch be included?

  • guyrocket@kbin.social · 7 months ago

    How different are photoshopped fakes from AI fakes? Are we going to try to ban that too?

  • henfredemars@infosec.pub · 7 months ago

    This is just the tip of the iceberg of the threat AI poses to human social structures. We have yet to appreciate the gravity of what these new technologies enable. The danger is real, and it’s naive to think that AI-generated porn laws will keep us safe.

    Firstly, the cat’s out of the bag. We can ban the technology or its misuse all we like, but can we really practically stop people from computing mathematical functions? Legal or not, generative AI can and will be used to generate content that hurts people. We need better planning for identifying, authenticating, and responding when this misuse happens.

    Secondly, we already have a huge, huge problem with fake news and disinformation. What is such a law, covering only the special case of AI porn, going to do about our broader inability to address harmful content?

    It’s a shame, but it strikes me as more feel-good than actually doing something effective.

  • fine_sandy_bottom@discuss.tchncs.de · 7 months ago

    I really wonder whether this is the right move.

    This girl, and many others, are victims, and I don’t want to diminish that, but for better or worse I just don’t see how legislation can resolve this.

    Surely deepfakes will be just different enough from the subject to create reasonable doubt that they depict the subject.

    I wonder whether, as deep fakes become commonplace, people might be more willing to just ignore it like any other form of trolling.

    • flipht@kbin.social · 7 months ago

      I think you’re right if the goal is to stop them altogether.

      But what we can do is stop people from sending them around and saying that it’s true/actually the person.

      Once they’ve turned it from an art project into a weapon, it should have similar consequences to “revenge porn.”

      • HubertManne@kbin.social · 7 months ago

        I would think this would be covered by libel, slander, and defamation-type laws. The crime is basically lying about a person’s actions and character.

    • galoisghost@aussie.zone · 7 months ago

      It’s not trolling, it’s bullying. You need to think beyond this being about “porn”. This is a reputational attack that makes the victim more likely to be further victimised via date rape, stalking, or murder. These things already happen based on rumours; deepfake images/videos will only make it worse. The other problem is that it’s almost impossible to erase once it’s on the internet, so the victim will likely never be free of the trauma or danger as the images/videos resurface.

      • fine_sandy_bottom@discuss.tchncs.de · 7 months ago

        Trolling / bullying is just semantics, which I don’t think will help us very much.

        I think the heightened risk of other crimes is… dubious. Is that conjecture?

        Your position seems to be framed in the reality of several years ago, where if you saw a compromising video of someone it was likely real, while in 2024 the opposite is true.

        We’re headed towards a reality where someone can say “assistant, show me a deepfake of a fictitious person who looks a bit like that waitress at the cafe getting double teamed by two black guys”. I don’t claim to know all the ethical considerations, but I do think that changing social norms are part of the picture.

        I don’t have any authority to assert when anyone else should feel victimised. All I know is that in my own personal case, a few years ago I would’ve felt absolutely humiliated if someone saw a compromising video of me, but with the advent of deep fakes I just wouldn’t care very much. If someone claimed to have seen it I would ask them why they were watching it, and why in the world they would want to tell me about their proclivities.

    • Overzeetop@kbin.social · 7 months ago

      I think it doesn’t go far enough. Straight up, no one should be permitted to create or transmit the likeness of anyone [prior to, say, 20 years following their death] without their explicit, written permission. Make the fine $1,000,000 or 10% of the offender’s net worth, whichever is greater; same penalty and corporate revocation for any corporation involved. Everyone involved, from the prompt writer to the work-for-hire people, should be liable for the full penalty. I can’t think of a valid, non-entertainment (parody/humor) reason for non-consensual impersonation - and using it for humor or parody is a slippery slope to propaganda weaponization. There is no baby in this tub of bathwater.

      • TimeSquirrel@kbin.social · 7 months ago

        Yeah, just like the FBI warnings on VHS tapes about massive fines and jail time stopped us from copying them in the 80s and 90s…

    • AlteredStateBlob@kbin.social · 7 months ago

      My dude, there are people out there thinking they’re in a relationship with Johnny fucking Depp because some Nigerian scammer sent them five badly photoshopped pictures. Step out of your bubble, maybe. This shit isn’t easy to spot for the vaaaaaast majority of people, and why should it fall to the victim to somehow clear their name or hope that idiots realize it’s fake?

      Especially with and around teenagers who can barely think further than their next meal?

      Good lord.

      • fine_sandy_bottom@discuss.tchncs.de · 7 months ago

        WDYM “step out of your bubble”?

        It’s not a question of being able to detect whether or not a video is fake. When deepfakes become so prevalent that everyone’s grandma understands that they’re prevalent, it won’t matter whether you can identify the video as fake.

  • hydration9806@lemmy.ml · 7 months ago

    I feel we are in need of a societal shift here, just like another commenter said about the printing press. When that first came out, the pushback came from the worry that words would be attributed to someone who never said them (reverse plagiarism). The societal adjustment was a universal doubt that anyone said a given thing without proof.

    For generative AI, when it becomes widespread, images will be generatable of literally everyone, not just minors but every person with photos online. It will be a societal shift; images will be assumed to be AI-generated, making any guilt or shame about a nude photo existing obsolete.

    Just a matter of time so may as well start now!

    • henfredemars@infosec.pub · 7 months ago

      It’s like how DRM only hurts people who purchase content legally.

      It’s been very illegal to pirate games for decades, and pirated content is still quite common in the wild. What’s making it (defamation) more illegal going to achieve in practice?