• LoafedBurrito@lemmy.world
    2 months ago

    Ruining the PC market for consumers on purpose so people will think it’s cheaper to rent computers than to own.

    In the future, you will lease your computer and not own it, just as you are told to do by the billionaires who steal your pay.

    • SourGumGum@lemmy.world
      2 months ago

      In the future you will connect to a corporate-owned terminal and use an online hosted OS, where your files are kept in their cloud ecosystem.

  • rogsson@piefed.social
    2 months ago

    When the yet-to-be-built data centers never get built because the AI slop bubble pops, we will be able to build houses out of RAM sticks for the poor.

    • veni_vedi_veni@lemmy.world
      2 months ago

      The problem with data center hardware is that it's often bespoke and nowadays can't be reused in a consumer context. Think of those headless GPUs; they're probably making these RAM modules with a different interface.

      They will just be e-waste instead of having the possibility of being surplus.

      • hark@lemmy.world
        2 months ago

        Thanks! I was blown away by the quality of the voice work in a game back then. Combined with the story, it was a real treat.

        • trongod_requiem0432@lemmy.world
          2 months ago

          I only watched a playthrough, to be honest, because the game wasn't available for my PC and the gameplay seemed kinda outdated, but fuck yeah! The story and voice work really rocked! I loved it as well. Don't get me started on the OST. Quite the movie material, in my opinion.

  • Zarajevo@feddit.org
    2 months ago

    US oligarchs want to have all computation done in their warehouses so they have the power to change any computation at any time.

    • MadBits@europe.pub
      2 months ago

      When I first saw “GeForce Now”, that's exactly what I imagined: building a market for cloud computation. “Just own the display, we will rent you the brain for it.” Currently they are choking the market with orders for RAM that does not exist, paid with money that does not currently exist, and the result is that prices go up artificially, which will eventually drive users into exactly the scenario where they rent computing power “for cheap” to play the latest game for 2-3 hours.

  • MochiGoesMeow@lemmy.zip
    2 months ago

    This is crazy because I bought 64 GB of RAM for like 130 in July of 2025.

    Now it's 530 for the exact same brand.
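    For scale, a quick back-of-the-envelope on those two figures (prices exactly as quoted above, currency units omitted as in the comment):

```python
# Same 64 GB kit: quoted at 130 in July 2025, 530 now.
old_price = 130
new_price = 530

multiple = new_price / old_price          # how many times the old price
percent_increase = (multiple - 1) * 100   # increase relative to the old price

print(f"{multiple:.2f}x the old price ({percent_increase:.0f}% increase)")
# → 4.08x the old price (308% increase)
```

    So "doubled" would almost be good news at this point; that kit quadrupled.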

    • Felis_Rex@lemmy.zip
      2 months ago

      I’m kicking myself so hard right now. I went in and upgraded my PC when word of tariffs was coming back in 2024, but decided to chill on RAM because the prices were still low and I felt I had enough. I have 16 GB of DDR5 and my rig feels like it’s chugging with newer software.

      I should’ve bitten the bullet and future-proofed with a 64 GB setup. My ass decided to buy a house instead 😮‍💨

    • innermachine@lemmy.world
      2 months ago

      Dude, 2 years ago I built my first PC in a minute, and last year built one for wifey (I was running an RX 580, DDR3 RAM, and an FX-6300 that I was milking as long as I could; I built my rig and the wife used that one till she had me build a better one with DDR4 RAM, a Ryzen 5, a PCIe 4.0 mobo, etc.). I didn't think it was a great time to build because of prices being all over, but now I'm just glad I didn't wait any longer!

      • MadBits@europe.pub
        2 months ago

        I really hope so, but I can't help but think that they are going to drag it out for as long as possible, because no matter how bad the situation is for the common folk, they are still going to make a profit off of it.

      • jaykrown@lemmy.world
        2 months ago

        To be clear, that doesn’t mean AI is going away. It just means no one is actually going to pay for AI models anymore because open-weight free models will be extremely cheap and powerful.

        • iglou@programming.dev
          2 months ago

          It also means that AI will disappear from places where it brings nothing and in many cases makes the product actually worse.

    • mlg@lemmy.world
      2 months ago

      I did my desktop but skipped my server.

      Even decade+ old used surplus server DDR4 didn’t escape the apocalypse.

    • hdsrob@lemmy.world
      2 months ago

      Same … I hadn’t upgraded since 2012, and had some extra cash, so rebuilt in August. Feeling pretty lucky to have done it then, and really glad I went ahead and put 64GB RAM in it.

    • njordomir@lemmy.world
      2 months ago

      Me too; I added more than I could use because today's gaming rig is tomorrow's server. Now I'm debating if I should sell a few sticks, but who knows when, if ever, I'll be able to replace them.

      • 9point6@lemmy.world
        2 months ago

        I meant more that now they're not being made, because Micron recently killed the Crucial brand to focus supply on data center customers.

        • Prove_your_argument@piefed.social
          2 months ago

          I’m well aware, but everybody knows the HBM demand will dry up eventually, and that the consumer market will eventually be worth trying to profit from again.

          They just want to manipulate the consumer market to maximize margins. If they can keep memory prices at 200-300% for a while, they can raise what they charge and push margins to stratospheric heights never before seen in the consumer market. All manufacturers jump on stuff like this when they can get away with it.

          Memory vendors still order from Micron directly for their own branded chips. Those margins will increase for all parties. AI data center demand is like Christmas for the entire industry. No pricing is transparent and every vendor is profiteering.

  • 1984@lemmy.today
    2 months ago

    I'm on Linux and it requires just as much memory as it did in 2018. No problem here.

  • Professorozone@lemmy.world
    2 months ago

    Yeah, I bought 32 GB of RAM about three months ago for my new computer build, and last I checked it had doubled in price. Thinking about selling it for a profit. Can a computer run without RAM?

  • palordrolap@fedia.io
    2 months ago

    The DDR4 sticks I got 18 months ago now cost 300-400% of what they were, so it's not just DDR5.

    … and I just realised the title doesn’t actually mean “DDR5 prices”, but that was an easy misinterpretation on my part, so I guess I’ll post this anyway.

  • melfie@lemy.lol
    2 months ago

    2026 is going to suck for hardware, but 2027 might be better if this nonsense blows over. For one thing, AMD's RDNA 5 was announced for 2027 and is supposed to be more comparable to Nvidia for compute workloads, including real RT cores. AMD's recent SoCs have been pretty impressive, so I'm looking forward to AMD SoCs that are competitive with Nvidia discrete GPUs beyond just rasterization, without artificially constrained VRAM and with lower power requirements.

    • Barracuda@lemmy.zip
      2 months ago

      Who produces the chips that make AMD products? They are the bottleneck. If those fabs are already overloaded, a new product won’t help in any way.