• RxBrad@lemmings.world

    So, the $1500-2000+ GPU tier will get even more expensive, with no competition for Nvidia. But then, people buying those have never cared about price. They’ll pay any price to say they have the biggest & fastest.

    I’m more concerned about $500 & under GPUs that I’d actually consider.

    • Infinity187@lemm.ee

      I don’t buy them for bragging rights; I buy them because I want to future-proof my builds. Idgaf about anyone knowing anything about my builds or costs. I simply want performance and quality. Don’t generalize.

        • CapraObscura@sh.itjust.works

          Who are completely fucking irrelevant to this discussion, but go on and list out a few more sub-niche instances that are exceptions to an obvious reference to whales buying whatever the most expensive thing is. Totally worth your time.

          • Scratch@sh.itjust.works

            Hello, I am a gamer first, Blender modeller and animator, like 4th. I needed to upgrade my GPU to actually work in Cycles. So, as it was around my birthday, I bought a top-end Nvidia card.

            I can buy a GPU for more than one reason.

            • CapraObscura@sh.itjust.works

              Good for you. Don’t care. Learn to fucking read, like the part where the OP was clearly talking about goddamn whales.

              If you don’t know the difference and you get butthurt about this, you’re probably a fucking whale.

              Have fun posting your 4 second faster render times to CGTalk. I’m sure they’re all going to suck your dick for it.

              • Scratch@sh.itjust.works

                I know you’re just trying to troll and be obnoxious, but legitimately my render times dropped by an obscene amount.

                20 mins per frame down to about 30 seconds over a 250-frame animation is very significant for me.
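For scale, the math on that speedup (a rough back-of-envelope sketch using the numbers in the comment above, not a benchmark):

```python
# Back-of-envelope check of the render-time claim above:
# 20 minutes/frame vs ~30 seconds/frame over a 250-frame animation.
frames = 250
old_total_h = frames * 20 * 60 / 3600  # total hours at 20 min per frame
new_total_h = frames * 30 / 3600       # total hours at ~30 s per frame
print(f"before: {old_total_h:.1f} h")                 # -> before: 83.3 h
print(f"after:  {new_total_h:.1f} h")                 # -> after:  2.1 h
print(f"speedup: {old_total_h / new_total_h:.0f}x")   # -> speedup: 40x
```

In other words, roughly three and a half days of rendering compressed into a couple of hours.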

                • CapraObscura@sh.itjust.works

                  That’s nice.

                  What does it have to do with whales continually buying new shit they don’t need just to impress people? Nothing? Yeah, that’s kind of the point.

                  And with that kind of performance increase you’re not buying year over year, are you? Maybe every 3-5 years, possibly longer?

                  There’s literally nothing about any of these posts that pertains to you, and yet here you are.

  • Altima NEO@lemmy.zip

    I mean, it’s not like AMD has been able to compete for several generations. I wouldn’t expect them to now.

    Though I wish they would, because Nvidia is getting away with some bullshit pricing thanks to not having any competition.

  • Im28xwa@lemdro.id

    AMD seems incapable of competing with Nvidia at the high end: they can’t make FSR as good as DLSS, and they’re still far behind in RT.

    • mushroom@sh.itjust.works

      And AI/ML workloads. Nvidia gets lots of shit and is more expensive, but you get a better ecosystem with their cards.

      • Verat@sh.itjust.works

        A good portion of this, though, is the CUDA stranglehold Nvidia has. Good luck getting a neural net accelerated on OpenCL or Vulkan Compute.
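A purely schematic sketch of why that stranglehold is hard to break (the backend names and priority order here are illustrative, not any real framework's API): ML stacks treat CUDA-class backends as first-class, and OpenCL/Vulkan Compute typically isn't even in the candidate list.

```python
# Illustrative only: real frameworks differ in detail, but the shape is
# common. CUDA (and ROCm, which mimics the CUDA API via HIP) gets a
# first-class slot; an OpenCL/Vulkan-only machine falls through to CPU.
def pick_backend(available):
    """Return the first supported backend in the usual priority order."""
    for backend in ("cuda", "rocm", "cpu"):
        if backend in available:
            return backend
    return "cpu"

print(pick_backend({"cuda", "cpu"}))    # Nvidia box -> cuda
print(pick_backend({"rocm", "cpu"}))    # AMD box with ROCm -> rocm
print(pick_backend({"opencl", "cpu"}))  # OpenCL only -> cpu
```

The asymmetry is the point: getting AMD hardware supported meant cloning the CUDA programming model (HIP), not getting frameworks to adopt an open compute API.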

        • Scratch@sh.itjust.works

          AMD do seem to be taking steps in the right direction here, but we’re still a while away from a more balanced landscape.

    • sugar_in_your_tea@sh.itjust.works

      I think it’s more that they’re unwilling. AMD goes after low-hanging fruit and targets the mass market. In essence, they’re willing to let NVIDIA invest in all of the new tech, and then they implement whatever gets popular.

      So unless they decide to truly prioritize their GPU business, they’ll be happy to target the quiet majority who care mostly about price to performance while focusing on innovating on the CPU side of the business where they make their real money.

      I’m sure they could compete on the GPU side if they threw money at the problem, but they don’t see a need to when it’s decently popular and they’re seeing a lot more growth and profit on the CPU side.

        • sugar_in_your_tea@sh.itjust.works

          And that’s how it has been for a long time at AMD.

          Look at CPUs, they were in a comfortable second place as the economy option for many years, and when they tried something new, it blew up in their face (Bulldozer).

          Ryzen was all about the chiplet design first, and architecture improvements second. They didn’t go for the most innovative core design or smallest process (they didn’t even have a fab); they went for the economical option (chiplets have better yields). They were able to catch up with Intel with IPC gains, but Ryzen was pretty uninteresting aside from that.

          Even today, Zen 4 is just an iteration on the chiplet design, and they’re beating Intel because Intel struggled with lithography issues and is also trying novel things that haven’t resulted in a clear win vs AMD. So AMD is happy to attack yields (chiplets) and innovate by extension (add-on cache) instead of trying something radical with core design.

          Their GPUs are going the same way. NVIDIA is trying hard with RT cores, whereas AMD mostly reused regular shader cores initially. NVIDIA is building a huge model for DLSS; AMD just applies a simple, one-size-fits-most filter on top. NVIDIA goes for the best experience at the high end; AMD just goes for a pretty good experience for most.

          I don’t see that changing; that has been AMD’s main playbook since Intel overtook them after the x64 transition.