• BartyDeCanter@lemmy.sdf.org
      14 days ago

      Oh fuck yeah, especially for laptops. I was pleasantly surprised when all of the hardware on my new-to-me 13-year-old laptop just worked out of the box with Debian 13. I was expecting to have to fix something.

    • pedz@lemmy.ca
      14 days ago

      Depends how old. I have a Phenom system with an iGPU and an audio chip that went unsupported for a few years. Then, after a few cycles of updates, it became supported again.

      Same with the GPU of an old laptop with an Optimus system. At some point nothing would work correctly, but then new nouveau (huh) modules came out and this old hardware suddenly worked much better than before.

      Apparently I have a lot of hardware that goes through a phase of being unsupported in Linux for a while before working better than ever.

  • I am running a Ryzen 5 3600 (not a 3600X), and a GTX 1660 Super.

    It still gets the job done. And when it doesn’t, it’s in games that use DLSS as a crutch instead of being properly optimized. And fuck those games.

    • wltr@discuss.tchncs.de
      13 days ago

      A friend advised me to sell mine when GPU prices were crazy; I could have gotten around $200 for it, or a bit more. (Years prior to that, I bought it used for $150 from another friend, who had tried to mine some Ethereum with it.) I was lazy, and perhaps the friend was right, since I could buy something better for that price now. But I’m really satisfied with the card and feel no need to upgrade any time soon. It runs everything I want.

  • yesman@lemmy.world
    13 days ago

    The truth is that a 10+ year old computer built with an eye to gaming is adequate and satisfactory for 85% of what you’d use a computer for, so long as you’ve done basic (and cheap) upgrades like an SSD/M.2 drive. Don’t get mad at me, but I picked up a RAM kit over the summer because it was so affordable. (DDR4)

    But I’m not going to troll the hardware enthusiasts who act as early adopters for bleeding-edge tech that I can afford used or surplus five years later. And they’re exactly the people getting railed the hardest by the component shortages. So it seems like a time for computer-people solidarity.

  • muusemuuse@sh.itjust.works
    13 days ago

    Built a new home server last year for literally everything.

    Ryzen 5800XT.

    It’s fine. Does the thing. Fuck off.

  • morto@piefed.social
    14 days ago

    This is so true! lol

    At least in my social circle, the ones who use older hardware are either the ones who just do very basic tasks with them, or the ones with advanced tech skills. The average users tend to be so consumerist, expecting that better hardware will compensate for their lack of skills…

    • ZkhqrD5o@lemmy.world
      13 days ago

      Yeah, but that’s kind of Windows’s fault too: the system progressively gets slower with every update, and the only remedy is a complete clean reinstall or better hardware. Planned obsolescence at its finest.

  • BartyDeCanter@lemmy.sdf.org
    14 days ago

    At one point I thought like the middle guy, but I didn’t have the money for it, so my computers were made from whatever parts I could scrounge up: dumpster diving, school auctions, and so on. I was building or upgrading my computer every six months or so with what I found.

    Once I had the money to buy basically whatever computer I wanted, I would build a high-end machine and then not bother to upgrade it until I had a friend who needed one. I’d pass my old one on to them and the cycle repeats.

    My laptops are still random auction finds. My current one came as part of a pair for $40. I popped a new SSD and battery into one and have ignored the other. I should see if Haiku supports it well enough to be worth daily driving.

  • khánh@lemmy.zip
    13 days ago

    I love my GTX 1050 Ti. Still works fine, albeit with a broken fan. It’s okay though; as long as it still turns on, I’ll find a way.

  • porous_grey_matter@lemmy.ml
    13 days ago

    I like to think of myself as a fairly tech-competent person, but I do tend to buy pretty high-end hardware when I replace a computer every 8+ years or so. It will likely last me longer, so I won’t have to change computers again as soon.

    • MonkderVierte@lemmy.zip
      13 days ago

      On the other hand, high-end parts are often binned and factory-overclocked: more likely to bug out, and less efficient. Also more expensive.

  • linux_penguin@lemmy.blahaj.zone
    9 days ago

    My PC turns 8 in August. If it weren’t for the power supply dying in an unfortunate autoclicker accident, all of its original parts would still be in there. Switching to Linux made it feel almost new again after all this time.

  • how_we_burned@lemmy.zip
    14 days ago

    I’m using a 7700K from like 2017. I got me a 3070 and 32GB of RAM, and I can still play at 4K @ 60fps in games like Horizon Zero Dawn, Dead Island 2, Star Wars Outlaws (great game), and Cyberpunk (yes, fine, with DLSS).

    And older stuff like FO4, Battletech, HL2, Black Mesa, and Fallen Order.

    Would I like something more powerful? Sure. Natively rendered Cyberpunk would be cool. But it won’t really change my day to day. For that I’d need to spend way more on a fancy monitor, which seems kinda moot when I’m already at 4K @ 60Hz.