Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • ReallyActuallyFrankenstein@lemmynsfw.com · +50/−1 · 7 months ago

    Yep, it’s the RAM, but also just a mismatched value proposition.

    I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

    But when you move the x070 series out of the mid-tier price bracket ($250-450, let’s say), you’d better meet a more premium standard. Instead, they’re throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn’t help that it’s at a time when people generally just have less disposable income.

  • LOLjoeWTF@lemmy.world · +36 · 7 months ago

    My Nvidia 1070 with 8GB of VRAM is still playing all of my games. Not everything gets Ultra, and my monitor isn’t 4K. Forever I am the “value buyer”. It’s hard to put money into something that’s only marginally better, though. I thought 16GB would be a no-brainer.

    • MeatsOfRage@lemmy.world · +35 · 7 months ago

      Exactly, people get too caught up in the Digital Foundry-ification of ultra max settings running at a perfect ~120 unlocked frames. Relax my dudes and remember the best games of your life were Perfect Dark with your friends running at 9 FPS.

      1080p is fine, medium settings are fine. If the game is good you won’t sweat the details.

      • Ragdoll X@lemmy.world · +10 · 7 months ago

        As someone who really doesn’t care much for game graphics I feel that a comment I wrote a few months ago also fits here:

        I’ve never really cared much about graphics in video games, and a game can still be great with even the simplest of graphics - see the Faith series, for example. Interesting story and still has some good scares despite the 8-bit graphics.

        To me many of these games with retro aesthetics (either because they’re actually retro or the dev decided to go with a retro style) don’t really feel dated, but rather nostalgic and charming in their own special way.

        And many other people also don’t seem to care much about graphics. Minecraft and Roblox are hugely popular despite having very simplistic graphics, and every now and then a new gameplay video about some horror game with a retro aesthetic pops up in my recommendations; so far I’ve never seen anyone complain about the graphics, only compliments about them being interesting, nostalgic and charming.

        Also I have a potato PC, and it can’t run these modern 8K FPS games anyway, so having these games with simpler graphics that I can actually run is nice. But maybe that’s just me.

      • umbrella@lemmy.ml · +10/−1 · edited · 7 months ago

        30fps is fine too in most games…

        friend of mine makes do with a gtx960@720p and is perfectly fine with it. the fun games run, even new ones.

        maybe an upgrade to digital foundry perfect 120fps would be worth it if it weren’t so damn expensive nowadays outside the US.

        • swayevenly@lemm.ee · +1 · 7 months ago

          Not to shill for them, but Alex makes it a point to run tests and include optimized settings for non-flagship hardware in every review he does. I’m not sure where your Digital Foundry nomenclature is coming from.

          And no, 30fps is not fine…

      • ABCDE@lemmy.world · +8/−2 · 7 months ago

        “remember the best games of your life were Perfect Dark with your friends running at 9 FPS”

        The frame rate was shat on at the time, and with good reason; it was unplayable for me. The best times were 4-16 player local Halo multiplayer.

  • FiskFisk33@startrek.website · +15 · 7 months ago

    GPUs haven’t been reasonably priced since the 1000 series.

    And now there’s no coin mining promising some money back.

    • Sibbo@sopuli.xyz · +3 · 7 months ago

      You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.

  • Shirasho@lemmings.world · +14 · 7 months ago

    I don’t know about everyone else, but I still play at 1080p. It looks fine to me, and I care more about frames than fidelity. More VRAM isn’t going to help me there, so it’s not a factor when I look at video cards. Ignoring the fact that I just bought a 4070, I wouldn’t skip over a 4070 Super just because it has 12GB of RAM.

    This is a card that targets 1440p. It can pull its weight at 4K, but I’m not sure that justifies slamming it for not having the memory for 4K.
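
    Back-of-the-envelope, the render-target math alone shows the jump (a rough Python sketch; the bytes-per-pixel figure and buffer count are illustrative assumptions, and textures usually dominate real VRAM use anyway):

        # Rough render-target memory at different resolutions.
        # Assumes ~8 full-resolution buffers (color, depth, G-buffer,
        # post-processing) at ~16 bytes per pixel each -- illustrative only.

        RESOLUTIONS = {
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4K":    (3840, 2160),
        }

        BYTES_PER_PIXEL = 16  # assumed average across render targets
        NUM_BUFFERS = 8       # assumed number of full-res targets

        for name, (w, h) in RESOLUTIONS.items():
            mib = w * h * BYTES_PER_PIXEL * NUM_BUFFERS / 2**20
            print(f"{name}: {mib:,.0f} MiB of render targets")

    Even on those made-up numbers, 4K needs 2.25x the render-target memory of 1440p, and texture budgets scale on top of that.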

    • Deceptichum@kbin.social · +4 · 7 months ago

      I’m fine playing at 30fps; I don’t really notice much of a difference. For me, VRAM is the biggest influence on a purchase because of the local AI capabilities it opens up.
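
      For a rough sense of why, here’s a sketch of how VRAM caps which local models fit (ballpark only; the bits-per-weight options and the 1.2x overhead factor are assumptions, and real usage adds KV cache on top):

          # Rough VRAM estimate for a local LLM: parameters x bytes per
          # parameter, plus an assumed 1.2x overhead factor. Ballpark only.

          def vram_gib(params_billion: float, bits_per_weight: int,
                       overhead: float = 1.2) -> float:
              bytes_total = params_billion * 1e9 * bits_per_weight / 8 * overhead
              return bytes_total / 2**30

          for params in (7, 13, 34):
              for bits in (16, 8, 4):
                  print(f"{params}B @ {bits}-bit: ~{vram_gib(params, bits):.1f} GiB")

      On that math a 12GB card fits a 4-bit 13B model with little to spare, which is why the memory matters more than the shader count for this use case.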

      • iAmTheTot@kbin.social · +6/−1 · 7 months ago

        If someone says they don’t notice a difference between 60 FPS and 120+ FPS, I think: okay, it’s diminishing returns, 60 is pretty good. But if someone says they don’t notice a difference between 30 and 60… you need to get your eyes checked, mate.
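
        The diminishing returns are easier to see in frame times than frame rates; a quick sketch:

            # Each doubling of frame rate saves half as many milliseconds
            # as the previous one -- the diminishing return in numbers.

            fps_steps = [30, 60, 120, 240]

            for lo, hi in zip(fps_steps, fps_steps[1:]):
                saved = 1000 / lo - 1000 / hi
                print(f"{lo} -> {hi} FPS: {1000/lo:.1f} ms -> {1000/hi:.1f} ms "
                      f"per frame (saves {saved:.1f} ms)")

        Going 30 -> 60 shaves 16.7 ms off every frame; 60 -> 120 only shaves another 8.3 ms, so each doubling is half as noticeable as the last.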

        • Deceptichum@kbin.social · +4 · 7 months ago

          I notice a difference; it’s just not enough to make it a big deal for me. It’s like going from 1080p to 1440p: you can see it, but being on 1080p isn’t really an issue.

    • atocci@kbin.social · +2 · 7 months ago

      My monitor is only 1440p, so it’s just what I need. I ordered the Founders Edition card from Best Buy on a whim after I stumbled across it at launch by coincidence. I’d been mulling over the idea of getting a prebuilt PC to replace my laptop for a few weeks at that point, and was on the lookout for sales on ones with a 4070. Guess I’ll be building my own instead now.

    • ABCDE@lemmy.world · +2/−1 · 7 months ago

      I think the only reason you’d really need that kind of grunt is on a 4K TV anyway, and even then you can use DLSS or whatever the other one is to upscale.
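
      For a sense of how much work upscaling saves at 4K, here’s a sketch using the commonly cited per-axis scale factors for the usual quality modes (treat the exact values as approximate):

          # Internal render resolution for 4K output under typical
          # upscaler quality modes (per-axis scale factors, approximate).

          OUTPUT = (3840, 2160)
          MODES = {
              "Quality":     0.667,
              "Balanced":    0.580,
              "Performance": 0.500,
          }

          out_px = OUTPUT[0] * OUTPUT[1]
          for mode, s in MODES.items():
              w, h = round(OUTPUT[0] * s), round(OUTPUT[1] * s)
              print(f"{mode}: renders {w}x{h} "
                    f"({w * h / out_px:.0%} of native 4K pixels)")

      Quality mode renders well under half the pixels of native 4K, which is where most of the performance headroom comes from.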

  • caseyweederman@lemmy.ca · +4 · 7 months ago

    Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

  • trackcharlie@lemmynsfw.com · +3 · 7 months ago

    Less than 20GB of VRAM in 2024?

    The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

  • wooki@lemmynsfw.com · +2 · 7 months ago

    If they don’t drop the price by at least 50%, it’s goodbye Nvidia.

    So no more Nvidia. Hello Intel.

    • lemmyvore@feddit.nl · +1 · 7 months ago

      I don’t think they care. In fact, I think they’re going to exit the consumer market eventually; it’s just peanuts to them, and the only reason they’re still catering to it is to use it as field testing (and you’re paying them for the privilege, which is quite ironic).

  • Kazumara@feddit.de · +2/−1 · 7 months ago

    $600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.