• fartsparkles@lemmy.world

    If this passes, piracy websites can rebrand as AI training material websites and we can all run a crappy model locally to train on pirated material.

  • Rekorse@sh.itjust.works

Getting really tired of these fucking CEOs calling their failing businesses “threats to national security” so big daddy government will come and float them again. Doubly ironic that it’s coming from a company who’s actually destroying the fucking planet while it achieves fuck-all.

  • rumba@lemmy.zip

Okay, I can work with this. Hey Altman, you can train on anything that’s public domain; now go take that fuck-ton of billions and fight the copyright laws to make the public domain make sense again.

      • rumba@lemmy.zip

Counter-counterpoint: I don’t know, I think making an exception for tech companies probably gives at least a minor advantage to consumers.

You can still go to Copilot and ask it for some pretty fucking off-the-wall Python and Bash; it’ll save you a good 20 minutes of writing something, and it’ll already be documented and generally best practice.

Sure, the tech companies are the ones walking away with billions of dollars, and it presumably hurts the content creators and copyright holders.

The problem is, feeding AI is not significantly different from feeding Google back in the day. Remember when you could see cached versions of web pages? And hell, their book-scanning initiative is super fucking useful to this day.

Look at how we teach and train artists, and then how those artists do their work: all digital art and most painting these days has reference art all over the place. AI taking random noise and slowly making it look more like the reference art isn’t wholly different from what people are doing.

We’re training AI on every book that people can get their hands on, but that’s how we train people too.

I’d say that training an AI is not that different from training people, and people don’t owe a chunk of the money to every copyrighted work they’ve looked at in their lives when they write a book or paint something in the style of Van Gogh. They’re even allowed to generate content for private companies or for sale.

What is different is that the AI is very good at this and has machine levels of retention and ability, and companies are poised to get rich off of the computational work. So I’m actually perfectly down with AIs being trained on copyrighted materials as long as they can’t recite them directly and in whole, but I feel the models created using these techniques should also be in the public domain.

  • A_norny_mousse@feddit.org

Fuck Sam Altman, the fartsniffer who convinced himself & a few other dumb people that his company really has the leverage to make such demands.

“Oh, but democracy!” - saying that in the US of 2025 is a whole 'nother kind of dumb.
Anyhow, you don’t give a single fuck about democracy; you’re just scared because a Chinese company offers what you offer for a fraction of the price/resources.

You’re scared for your government money and basically begging for one more handout “to save democracy”.

    Yes, I’ve been listening to Ed Zitron.

    • supersquirrel@sopuli.xyz

Gosh, Ed Zitron is such an anodyne voice to hear; I felt like I was losing my mind until I listened to some of his stuff.

      • dylanmorgan@slrpnk.net

        Yeah, he has the ability to articulate what I was already thinking about LLMs and bring in hard data to back up his thesis that it’s all bullshit. Dangerous and expensive bullshit, but bullshit nonetheless.

        It’s really sad that his willingness to say the tech industry is full of shit is such an unusual attribute in the tech journalism world.

    • Rekorse@sh.itjust.works

It seems like their message was written specifically for the biases the current administration holds; calling China “the PRC” is an obvious example. So it was written by idiots, for idiots, apparently.

      • Pennomi@lemmy.world

        It’s only theft if they support laws preventing their competitors from doing it too. Which is kind of what OpenAI did, and now they’re walking that idea back because they’re losing again.

      • masterspace@lemmy.ca

        No it’s not.

        It can be problematic behaviour, you can make it illegal if you want, but at a fundamental level, making a copy of something is not the same thing as stealing something.

    • turnip@sh.itjust.works

Surprisingly, Sam Altman hasn’t complained; he just said there’s competition and it will be harder for OpenAI to compete with open source. I think their small lead is essentially gone, and their plan is now to suckle Microsoft’s teat.