New U.S. laws designed to protect minors are pulling millions of adult Americans into mandatory age-verification gates for online content, drawing backlash from users and criticism from privacy advocates who say a free and open internet is at stake. Roughly half of U.S. states have enacted or are advancing laws requiring platforms — including adult content sites, online gaming services, and social media apps — to block underage users, forcing companies to screen everyone who approaches these digital gates.

  • awmwrites@lemmy.cafe
    11 days ago

    Yeah, that headline is incorrect. The laws are there specifically to surveil adults, “for child safety” is a smokescreen justification. This isn’t a “we tried to do a good thing, but there’s this unfortunate side effect,” the surveillance was the goal.

      • Bronzebeard@lemmy.zip
        11 days ago

        Sure, no reason it couldn’t be, other than it isn’t.

        If they wanted to actually protect children, they’d be arresting the pedophiles, not forcing everyone to identify themselves on the Internet.

        • undrwater@lemmy.world
          11 days ago

          Remember that “they” (at least here in the US) are as varied in opinion as are we.

          The ones who have a sincere desire to protect children want them to have limited exposure to content online.

          My personal thought is children should generally not be engaging others online, but it should be a social push (“don’t talk to strangers online” “don’t allow your children to be unsupervised online” ).

          As for arresting child abusers, we seem to be in the habit of putting them in high office.

          • paraphrand@lemmy.world
            11 days ago

            Children shouldn’t be on Reddit or Lemmy. And if anyone disagrees, they better not be complaining about idiots on those platforms.

            My statements are about participation, not about how the enforcement should be handled.

            • undrwater@lemmy.world
              11 days ago

              I agree, children shouldn’t be on tiktok or any of these kinds of platforms.

              And yes, it should be social pressure rather than legal pressure.

          • architect@thelemmy.club
            11 days ago

            Parents who care are doing that. The number of parents who don’t give a fuck and will just put their ID in for their kids to watch porn would shock you. Shit, the number of them that would sell their kids into sex slavery is higher than you probably think.

            Fuck this law, they don’t give a fuck about kids.

            • undrwater@lemmy.world
              11 days ago

              When I was younger, there used to be “public service announcements” on TV that provided education for the kids and adults watching.

          • Bronzebeard@lemmy.zip
            10 days ago

            Those people should probably start parenting their own children instead of begging for the government to do it for them, then.

            And yes, it’s usually the same people.

  • homes@piefed.world
    11 days ago

    This has nothing at all to do with “child safety”. It’s all about data mining. And controlling what everyone can see or say online.

  • eli@lemmy.world
    11 days ago

    I can’t wait until it’s leaked that these child accounts are being used to target minors for whatever: ads, sexual exploitation, etc.

  • Kraiden@piefed.social
    11 days ago

    I’m going to copy and paste a comment I made elsewhere:

    The problem with age verification is VERY much in the implementation. It IS possible to do age verification without having to identify yourself to Meta/PornHub/Whoever. It IS possible to maintain privacy, AND restrict things like porn and social media to those who are of age. Look at how the Estonian system works, it’s brilliant. The problem isn’t age verification, it’s the blatant data grab that is currently trying to destroy your online anonymity…

    • lmmarsano@group.lt
      6 days ago

      Still unnecessary & less effective than less invasive alternatives that already exist & that the government could promote. To quote another comment:

      Governments have commissioned enough studies to know that education, training, and parental controls filtering content at the receiving end are more effective & less infringing of civil rights than laws imposing restrictions & penalties on website operators to comply with online age verification. Laws could instead allocate resources to promote the former in a major way, set up independent evaluations reporting the effectiveness of child protection technologies to the public, and promote standards & the development of better standards in the industry. Laws of the latter kind simply aren’t needed & also suffer technical defects.

      The most fatal technical defect is they lack enforceability on websites outside their jurisdiction. They’re limited to HTTP (or successor). They practically rule out dynamic content (chat, fora) for minors unless that content is dynamically prescreened. Parental control filters lack all these defects, and they don’t adversely impact privacy, fundamental rights, and law enforcement.

      Governments know better & choose worse, because it’s not about promoting the public good, it’s about imposing control.

      • Kraiden@piefed.social
        6 days ago

        As I’ve said elsewhere, yes, in a perfect world it would be on the parents to enforce this, but that doesn’t mean we should do nothing on the social media side. It’s also the parents’ responsibility to prevent underage drinking and smoking, yet we still restrict those at the point of sale.

        I’m for age restrictions on social media, and yes there are arguments against it, but I’m not really interested in having that conversation.

        less invasive alternatives

        This is exactly what I take issue with. It’s a false dilemma. The assertion that you can’t have age verification without the invasion of privacy and destroying online anonymity in the process IS FALSE. You CAN have both. THIS is the grift in my opinion.

    • IratePirate@feddit.org
      11 days ago

      The problem is: you’re assuming they’re arguing in good faith when they say it’s about pRoTeCtInG tHe ChIlDrEn. It’s not. It’s a pretext for the data grab and mass surveillance of everyone. They will gladly take your argument, claim age verification is compatible with privacy and anonymity, and then introduce age verification systems that do implement mass surveillance. Don’t give them an inch.

      • Kraiden@piefed.social
        11 days ago

        No I’m not! I’m in 100% agreement with you that this has nothing to do with protecting children! Age verification, if done properly, is a good idea, that I’m completely for. But you’re right, this isn’t that. This is a smokescreen.

        I just want to be sure that people understand that they ARE using a good idea as their cover here. It CAN be compatible with privacy and anonymity, and it is a good idea to stop young children engaging with the cess pit that is modern social media.

        At some point, I sincerely hope that the current regime will end and be replaced by something more sane. At that point, I don’t want people to immediately think “age verification = bad”

    • Kraiden@piefed.social
      11 days ago

      And because someone will probably ask, this is my understanding of how it would work for age verification (I am not an expert):

      There are three parties in this scenario: the Estonian state, Meta, and a 3rd party (which is currently a real 3rd party, but work is being done to allow this to be a digital wallet on your device, under your control).

      The state issues your 3rd party a magic cryptographic cert containing all your personal data, like your date of birth.

      Meta issues an age challenge: not “what’s your dob?” but rather “are you old enough to use this service?”

      The 3rd party shows you exactly what Meta is requesting and gives you the option to approve or deny the request.

      If you approve, the 3rd party generates a new cert that says JUST “yes, I’m of age” and nothing else.

      Because it’s been generated from the state’s magic cert, it can be verified with the state’s public key.

      Meta doesn’t get more info than it needs, the state can’t see that you’ve logged into Meta, but you’ve successfully proved you’re old enough to use the service.

      The current weak point is that the 3rd party can absolutely see all of it, but there’s no reason the 3rd party has to be an external service. It could absolutely be an app on your device.

      You still need to prove yourself to the state, but you’d have to do that to get an ID card in the first place. It’s WAAAAY better than trusting all the different porn sites and social media services individually not to leak or misuse your data.
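[Editor's sketch] The flow above can be modeled in a few lines of code. This is a toy, not the real Estonian system: it uses textbook RSA with tiny hardcoded primes (never do this in practice), and it simplifies "the 3rd party generates a new cert" into the wallet presenting a pre-issued, state-signed per-attribute token. All names (`state_sign`, `tokens`, the attribute strings) are hypothetical. Real deployments use proper certificates and derived or zero-knowledge credentials.

```python
# Toy sketch of selective-disclosure age attestation.
# TOY crypto only: textbook RSA with tiny primes, for illustration.
import hashlib

# --- The state's toy RSA key pair: public (N, E), private D ---
N, E, D = 3233, 17, 2753  # N = 61 * 53; never use keys this small

def digest(msg: bytes) -> int:
    # Reduce a SHA-256 hash into the toy modulus.
    return int(hashlib.sha256(msg).hexdigest(), 16) % N

def state_sign(msg: bytes) -> int:
    # Only the state, holding the private exponent D, can sign.
    return pow(digest(msg), D, N)

def verify(msg: bytes, sig: int) -> bool:
    # Anyone can check a signature using the public key (N, E).
    return pow(sig, E, N) == digest(msg)

# 1. Enrollment: the state issues the wallet one signed token per
#    attribute, including a bare "over_18=yes" statement (no dob in it).
tokens = {attr: state_sign(attr.encode())
          for attr in ("dob=1990-01-01", "over_18=yes")}

# 2. Challenge: Meta asks "are you old enough?"; the wallet shows the
#    user the request and, on approval, reveals ONLY the age token.
claim, proof = "over_18=yes", tokens["over_18=yes"]

# 3. Meta verifies against the state's public key. It never sees the
#    dob token, and the state never learns where the token was used.
assert verify(claim.encode(), proof)
print("age verified without revealing date of birth")
```

The key property the comment describes falls out of the structure: the verifier (Meta) only ever receives the minimal claim plus a signature it can check offline against the state's public key, so the state is never in the loop at login time.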

  • Pricklesthemagicfish@reddthat.com
    11 days ago

    The people who don’t care that kids are killed almost daily at schools they have no choice but to attend. The people who watch movies and buy products from businesses literally destroying the environment. The people who vote for people who rape and eat children to run the government. I totally believe you have the country’s kids’ best interests in mind /s

    • pdxfed@lemmy.world
      11 days ago

      The people who wave away bombing a school and killing 175 because “it’s war” (a war we unilaterally started). These are definitely the people you want to trust on child safety and well-being.

      • architect@thelemmy.club
        11 days ago

        Nah man, it’s not war, didn’t you hear, bro? It’s not war, bro, the Department of Defense — sorry, Department of War — says it’s not, brah!

  • doesit@sh.itjust.works
    11 days ago

    In the UK, you have to have your face filmed at different angles with your webcam, just to be able to access Reddit. Spooky…

  • StarryPhoenix97@lemmy.world
    11 days ago

    I’m already pissy about them constantly trying to grab my cc data every time I make an online purchase. Windows asks, Google asks, eBay asks, and then my browser asks.

    The answer is always no.

    As for ID verification: my Discord account is a decade old. If you can’t do math well enough to work out that I wasn’t 8 when I made it, then that’s a you problem.