• Avid Amoeba@lemmy.ca · 4 months ago

    For newer GPUs from the Turing, Ampere, Ada Lovelace, or Hopper architectures, NVIDIA recommends switching to the open-source GPU kernel modules.

    So 20-series onwards.
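
    If you want to double-check which flavor you’re actually running, here’s a minimal sketch (it assumes nvidia-smi and modinfo are on the PATH; as far as I know the open kernel modules report a “Dual MIT/GPL” license string, while the proprietary one reports “NVIDIA”):

        # Minimal check for which NVIDIA kernel module flavor is installed.
        # Assumes nvidia-smi and modinfo are available; the license strings
        # compared below are what the open vs. proprietary modules report.
        import subprocess

        def gpu_name() -> str:
            # Ask the driver which GPU is present.
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            )
            return out.stdout.strip()

        def nvidia_module_license() -> str:
            # License string of the installed nvidia kernel module.
            out = subprocess.run(
                ["modinfo", "-F", "license", "nvidia"],
                capture_output=True, text=True, check=True,
            )
            return out.stdout.strip()

        if __name__ == "__main__":
            print(f"GPU: {gpu_name()}")
            lic = nvidia_module_license()
            kind = "open" if "MIT/GPL" in lic else "proprietary"
            print(f"Kernel module license: {lic} ({kind})")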

      • Irremarkable@fedia.io · 4 months ago

        Maybe it’s just because I’m older and more jaded, but that really feels like the last truly good era for GPUs.

        Those 10 series cards had a ton of staying power, and the 480/580 were such damn good value cards.

        • Telorand@reddthat.com · 4 months ago

          It’s more that back then was a better time for price-to-performance value. The 3000 and 4000 series cards were basically linear upgrades: more performance for proportionally more money.

          It’s an indicator that there haven’t been major innovations in the GPU space, besides perhaps the AI and ray tracing stuff, if you want to count those as upgrades.

        • GolfNovemberUniform@lemmy.ml · 4 months ago

          The RTX 3050 (which got a new 6 GB version less than a year ago) is similar to the 1070 Ti in terms of performance, and 1080s are of course even better. Definitely a ton of staying power, even in 2024.

          • Norah - She/They@lemmy.blahaj.zone · 4 months ago

            I bought a secondhand 1080 a couple of years ago, when the crypto bubble finally burst, and it’s still serving my needs just fine. It handled Baldur’s Gate 3 on release last year, which was the last “new” game I played on it. Seems like it’ll still be good for a few years to come, so yeah.

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 4 months ago

          That was mostly because the 20 series was so bad: expensive, not lightyears faster to justify the price, and ray tracing wasn’t used in any games (until recently).

          The 30 series was supposed to be more of a return to form, then COVID + mining ruined things.

          • Dreyns@lemmy.ml · 4 months ago

            I got a 2060 Super and I must say I’m very happy. I do 3D stuff, so the ray tracing was plenty useful, and despite it getting a bit old it fares pretty well in most games. The price was okay at the time (€500, still a bit high, since it was during the bitcoin mining madness =-=")

      • Zoot@reddthat.com · 4 months ago

        Yep! My pre-built 1660 Super I got years ago is still chugging along amazingly as a streaming device for my Steam Deck.

      • Skull giver@popplesburger.hilciferous.nl · 4 months ago

        It’s not really planned obsolescence; they changed the way their drivers work with the 16xx/20xx series. Up until the 10xx series, they ran a lot of algorithms and processing in software. Then they switched to doing most of that on the GPU itself, in the form of firmware. The 10 series GPUs can’t do that.

        Like most hardware vendors, Nvidia doesn’t want to (and probably isn’t allowed to) publish all of their special-sauce source code. They can open source the driver and load a binary blob, like most hardware does, but only on the newer cards.

        The older cards will have to keep the special sauce in software on the CPU, so those devices will need to stick to the proprietary driver.
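
        For anyone curious what that split looks like on disk: on Turing and newer, the open kernel module hands that work to a large signed firmware image (what NVIDIA calls the GSP firmware) shipped alongside the driver. Here’s a minimal sketch that just lists those blobs; the /lib/firmware/nvidia/<version>/gsp_*.bin layout is an assumption based on how recent Linux driver packages are arranged:

            # Minimal sketch: list the GSP firmware blobs an installed NVIDIA
            # driver ships. The directory layout below is an assumption based
            # on recent driver packages (one subdirectory per driver version).
            from pathlib import Path

            def find_gsp_firmware(root: str = "/lib/firmware/nvidia") -> list[Path]:
                base = Path(root)
                if not base.is_dir():
                    return []
                # e.g. /lib/firmware/nvidia/550.54.14/gsp_ga10x.bin
                return sorted(base.glob("*/gsp_*.bin"))

            if __name__ == "__main__":
                blobs = find_gsp_firmware()
                if blobs:
                    for blob in blobs:
                        size_mib = blob.stat().st_size // (1024 * 1024)
                        print(f"{blob} ({size_mib} MiB)")
                else:
                    print("No GSP firmware found (no driver installed, or proprietary-only setup).")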

        • Norah - She/They@lemmy.blahaj.zone · 4 months ago

          (and probably isn’t allowed to)

          I doubt very much it’s about whether they’re allowed to or not. They’re the ones at the top of the hardware supply chain, designing their own chips and having them fabricated. It’s them telling other companies, like Gigabyte and EVGA, what they are and aren’t allowed to do.

          • Skull giver@popplesburger.hilciferous.nl · 4 months ago

            Nvidia also buys and licenses code from other companies. These days they’re at the top of the chain, but they used to be a lot smaller. Maybe they rewrote their drivers to remove the external code, but I wouldn’t be surprised if some of it is still in there.

            AMD tried to open the source code for a display technique (VRR, I think? Not sure what it was) but was prevented from doing so by the standards authority, presumably because they used licensed reference code. I don’t think this applies to the older 10xx series of cards, but these factors are difficult to work around.

              • Ptsf@lemmy.world · 4 months ago

                HDMI 2.1 and the HDMI consortium prevented them from releasing code. It wasn’t even proprietary, just based on a licensed implementation, from what I understood.