So, if the AI bubble pops, it’ll be a great time to build a PC
As if there won’t be some other hype to immediately take its place. Just like GPU prices never collapsed after the Bitcoin craze, because the demand rolled straight into AI hype.
The next hype lined up is quantum
Fortunately they won’t need most normal PC hardware for this. At least not CPUs and GPUs, AFAIK.
“Quantum emulation” all the hype, 1/1000th the efficiency, 1000X the excuses to sell hardware
I’m seeing a pattern here (capitalism).
AI messed up GPU prices even before AI was really a thing. When everyone was caught up in the Bitcoin hype, Nvidia was already focused completely on AI instead of banking on the crypto hype, neglecting consumer GPUs. And we still feel that today.
When the AI bubble pops we’ll all be too broke to buy any PC anything.
The Buffett Indicator (America’s total stock market valuation vs. GDP) is at 200%. It was around 130% in 1929, 2000 and 2007. Guess what? Chicken butts. (That’s what we’ll all be eating.)
There are a lot of “refurbished” drives from when the Chia bubble popped (a useless shitcoin that wasted HDD space with garbage data as its proof of cryptographic work).
THIS is what I’m looking forward to. I’m guessing it’ll start sometime next year, so shortly after Christmas '26 will be the optimal time - at least that’s my long-term plan.
Wish I could confidently say that it is going to pop soon, but I am not sure the current rebound is the bull trap. Maybe the correction was just a small blip in a huge bubble…
Be a great time to set up RAID storage systems (or whatever, I’m not that techie) mmmmmmm I cannot waaaaait to have something resembling a backup.
Obligatory reminder that “raid is not a backup [solution]”; it’s an uptime and recovery system.
My primary server uses raid along with snapshots, full local backups, and off-site backups for critical data to two different cloud providers on different continents.
My second server backs up images to the primary. My VPS also backs up to the primary. Both get the raid and snapshot treatment, plus local backups, but not cloud. Gaming servers, boinc, and Home Assistant aren’t ‘critical’ :p
I’ll pray for this outcome. I need something that can actually run Unreal (requirements) Engine 5 games.
Nope. China vs. Taiwan is on the horizon. They really want that island back. It won’t be good for world peace, and it’ll be really bad for EUV lithography (TSMC).
China vs. Europe and China vs. US are topics too. Hopefully it stays limited to economic pressure.
There’s always the second hand market…
Oh for fuck’s sake… I wanted to expand my NAS… Crypto was (and still is) responsible for the shit state of the GPU market, and now it’s the next scam.
I just bought an 8TB drive a couple weeks ago with zero issues.
Fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you, fuck you
Remember a year or so ago when they all spun down production so they could charge more money for drives? I do.
What else are these data centers going to hoard?
Jobs, GPUs, water, hard drives
Electricity
Ironic that Microsoft doesn’t have enough electricity to power their hoarded GPUs.

Land.
AI crap. Infesting everything. Search of all kinds, photo management, telephone menus, who knows what else. And it does none of it well.
I don’t have issues with local AIs, for things like searching your local immich instance, or controlling your local Home Assistant devices. That photo of a bird you took 3’ish years ago? Yeah, you can find it in like three seconds with a local AI search. Want to turn the lights on with a voice request? AI is one of the easiest ways for a layman to handle the language processing side of things. All of that is a drop in the ocean.
But corporations have been trying to cram it into everything, even when it’s not a good fit for what they want to do. And so far, their solution to making it fit hasn’t been to rethink their usage and consider whether or not it will actually improve a product. Instead, their approach has simply been to build more and bigger data centers, to throw increasing amounts of processing power at the problem.
The technology itself isn’t inherently harmful on the small scale. But it has followed the same pattern as climate change. Individual consumers are blamed for climate change, and are consistently urged to change their consumption habits… When it’s actually a handful of corporations producing the vast majority of greenhouse emissions. Even if every single person drastically changed their emission habits, it would barely make a dent in the overall production. It was all because of massive astroturfed PR campaigns to shift the blame away from those companies and onto individuals. And we’ve seen that same thing happen with AI, where individual users have been blamed for using AI, instead of the massive corporations.
I think the backend requirements of AI are its big downfall, and while your sentiment has validity for many public-facing deployments of it, there are some things it is actually succeeding at. I speak from experience, having used it for several specific use cases where it excels, but you and others probably don’t have the time, nor care, that this is true. And again, the marketing idiots outweigh the deliberate approach that engineers and others might want, much less what the economy might need.
cases which it excels at
Uhuh
Like how my colleagues often come to me saying they fixed it with Cursor and when I check the UI where the bug was, the page doesn’t even load at all now.
I just had someone tell me they did something with AI and when I checked they didn’t even get right the very basic thing of coding around the right controller names. The fucking names were wrong. They didn’t even check the feature, they just shipped, called it fixed, and told me Cursor figured it out really quickly.
I’m tired of this. I’m REALLY tired of this, man…
I posted this as a perspective on AI that is not given by AI, nor by someone who believes it will stay this way, but nor am I promoting it. I believe it’s more nuanced than just being crap, although it is taking over many things in life. I have used it, and I know how to use it for good: keeping it private and local, and having it help teach reasoning as well as do the things we need done (like dishes, bills, and other bullshit). I’m fully aware it’s a bubble ($14 billion to $1.4 trillion for OpenAI alone), I dislike it, and I hate the energy waste. You all just seem to want to keep up the ignorant web user stereotype.
Have fun downvoting something you don’t really understand.
Literally no one is saying “we need to kill all pursuits of AI because it’s bad at organizing my photos”.
RAID5, don’t fail me now!
As a PSA: depending on your needs (because I see a lot of simple homelabs RAIDing), consider a second server, or even a second-location simple snapshot backup, versus same-machine duplication.
Like, which would you want to set up first, depending on your specific risk profile? The second-server snapshot is more like a proper backup; the same-machine duplication is more of a low-downtime strategy (that most smol homelabs can do without - “just wait a day, mom”).
E.g., as the most basic example: instead of local duplication you could have a small PC (even some old Pi), a big HDD, and rsync once a day - something like the sketch just below.
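If you want to script that, here’s a minimal sketch of the idea; the hostname, paths and snapshot layout are made-up placeholders, and it assumes passwordless SSH to the backup box and rsync installed on both ends:

```python
#!/usr/bin/env python3
"""Minimal daily push-backup sketch: rsync local data to a second box.

Hostname, paths and layout are placeholders -- adjust for your own setup.
Assumes passwordless SSH to the backup machine and rsync on both ends.
"""
import datetime
import subprocess

SOURCE = "/srv/data/"               # trailing slash: copy the contents, not the dir itself
DEST_HOST = "backup-pi"             # hypothetical old Pi with a big HDD attached
DEST_PATH = "/mnt/bigdisk/backups"  # snapshot root on that machine

def run_backup() -> None:
    today = datetime.date.today().isoformat()
    # --link-dest hard-links files that haven't changed since the previous
    # snapshot, so each daily "full" copy only costs the space of what changed.
    # (On the very first run there is no 'latest' yet; rsync just warns and
    # does a full copy.)
    subprocess.run(
        [
            "rsync", "-a", "--delete",
            f"--link-dest={DEST_PATH}/latest",
            SOURCE,
            f"{DEST_HOST}:{DEST_PATH}/{today}",
        ],
        check=True,
    )
    # Repoint 'latest' at the fresh snapshot for tomorrow's --link-dest.
    subprocess.run(
        ["ssh", DEST_HOST, f"ln -sfn {DEST_PATH}/{today} {DEST_PATH}/latest"],
        check=True,
    )

if __name__ == "__main__":
    run_backup()  # run from cron or a systemd timer once a day
```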
(Of course, with duplicated servers, finances permitting, you can also provide backup services, which helps with that downtime issue. And by finances I mean the rest of your server group’s stuff, which in my case is mostly HDD cost anyway.)
Got it!*
*puts a Timeshift partition on the same RAID array
Lol (but ppl do that).
TL;DR
QLC drives have fewer write cycles than TLC, and if their data is not refreshed periodically (which their controllers will do automatically when powered), the data in them gets corrupted faster.
In other words, under heavy write usage they will wear out sooner, and at the other end, when used for long-term storage of data, they need to be powered on much more frequently merely to refresh the stored states (by reading and writing the data back).
So moving to QLC in cloud applications comes with mid- and long-term costs in terms of power usage and, more importantly, drive end-of-life and replacement.
–
Quad-Level Cell (QLC) SSD technology stores 4 bits per cell - hence 16 levels - whilst TLC (Triple-Level Cell) stores 3 bits - hence 8 levels - so the voltage difference between levels is roughly half as much, and so is the margin between levels.
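To put rough numbers on that (the cell voltage window below is just an illustrative figure, not a datasheet value):

```python
# Back-of-the-envelope margin comparison for TLC vs QLC NAND.
# WINDOW_V is an illustrative placeholder for a cell's usable voltage range,
# not a real datasheet number.
WINDOW_V = 6.0

for name, bits in (("TLC", 3), ("QLC", 4)):
    levels = 2 ** bits               # 8 levels for TLC, 16 for QLC
    gap = WINDOW_V / (levels - 1)    # spacing between adjacent levels
    print(f"{name}: {levels} levels, ~{gap:.2f} V between levels, ~+/-{gap / 2:.2f} V margin")

# Output with the assumed 6 V window:
#   TLC: 8 levels, ~0.86 V between levels, ~+/-0.43 V margin
#   QLC: 16 levels, ~0.40 V between levels, ~+/-0.20 V margin
# i.e. QLC's read margin is roughly half of TLC's, as described above.
```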
Everything deep down is analog, so the digital circuitry actually stores analog values in the cells and then reads them back and converts them to digital. When reading that analog value, the digital circuit has to decide which digital value it actually maps to, which it does by basically accepting any analog value within a certain range around the mathematically perfect value for that digital state.
(A simple example: on a 3.3V data line, when the I/O pin of a microcontroller reads the voltage, it will decide, for example, that anything below 1.2V is a digital LOW (i.e. a zero), anything above 2.1V is a HIGH (a one), and anything in between is an erroneous value - i.e. no signal or a corrupted signal. This, by the way, is why if you make the line between a sender and a receiver chip too long (many metres), or switch the signals on it too fast (hundreds of MHz+), without any special techniques to preserve signal integrity, the receiver will mostly read garbage.)
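That same read-side decision, as a tiny sketch using the 1.2V/2.1V thresholds from the example above:

```python
# Read-side decision from the 3.3 V example: below 1.2 V reads as LOW,
# above 2.1 V reads as HIGH, and the band in between is treated as an
# invalid / corrupted signal.
def read_digital(voltage: float) -> str:
    if voltage < 1.2:
        return "LOW (0)"
    if voltage > 2.1:
        return "HIGH (1)"
    return "invalid / corrupted"

for v in (0.3, 1.7, 3.1):
    print(f"{v:.1f} V -> {read_digital(v)}")
# 0.3 V -> LOW (0)
# 1.7 V -> invalid / corrupted
# 3.1 V -> HIGH (1)
```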
So the more digital levels in a single cell, the narrower the margin, and the more likely that, due to the natural decay of the stored signal over time or due to cell damage from repeated writes, the analog value the digital circuitry reads back will be too far away from the stored digital level and will, at best, be marked as erroneous or, at worst, land at a different level and thus yield a different digital value.
All this to say that QLC has less endurance (i.e. after fewer writes, the damage to the cells from use causes what is read to no longer match what was written) and also less retention (i.e. if the cell is not powered, signal decay will more quickly cause stored values to end up at a different level than when they were written).
Now, for powered systems the retention problem is not much of an issue for cloud storage: when powered, the system automatically goes through each cell, reading its value and writing it back to refresh what’s stored there to the mathematically perfect analog value, at the cost of slightly higher power consumption over time for data that’s mostly read-only (for flash memory, writing uses far more power than reading). The endurance problem, however, is much worse for QLC, because the cells will age twice as fast as TLC for data that is frequently written (wear-leveling exists to spread this effect over all cells, giving higher overall endurance, but wear-leveling is also there for TLC, so it doesn’t close the gap for QLC).
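As a rough feel for what “ages twice as fast” can mean in practice, here’s a toy endurance (TBW) calculation; the P/E cycle counts and write-amplification factor are illustrative ballpark assumptions, not measured figures:

```python
# Toy TBW (terabytes written) comparison. The P/E cycle counts and the
# write-amplification factor below are illustrative assumptions only.
CAPACITY_TB = 8    # hypothetical 8 TB drive
WRITE_AMP = 2.0    # assumed write amplification factor

for name, pe_cycles in (("TLC", 2000), ("QLC", 1000)):
    # total data you could write before the cells wear out
    tbw = CAPACITY_TB * pe_cycles / WRITE_AMP
    print(f"{name}: ~{tbw:,.0f} TBW on this {CAPACITY_TB} TB drive")

# With these made-up numbers: TLC ~8,000 TBW vs QLC ~4,000 TBW --
# roughly the "ages twice as fast under heavy writes" described above.
```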
Oh this must be for training
stolen data
When it all comes crashing down, at least the Internet Archive could have easy & cheap access to it all. I trust them to handle it more responsibly than the AI bros.
There is nothing short of a non-fascistic, government-forced bankruptcy sell-off that will actually make it cheap, so… I feel it’s a pipe dream.
RAM shortage too. But most users should avoid QLC.
I noticed the price doubled since the last time I ordered a drive, like a year ago.
I haven’t been following trends, but I just looked on Newegg, and HDD and SSD prices look about where they were before, or maybe a bit higher. It’s nothing like the Chia craze, where every drive of any kind was snapped up by crypto miners. Or the similar thing further back, where a flood in Thailand(?) clobbered a factory, so there were big shortages and price spikes. Right now you can get drives if you’re willing to pay for them. There just hasn’t been the usual downward price trend.
I’ve been buying refurbished drives from ServerPartDeals and looking at my invoice from March versus now the price of the same drive has gone up 10% from $180 to $198 (Ultrastar HC530 14TB)
Keep in mind Black Friday is also coming, and a lot of vendors like to raise their prices a bit beforehand so they can claim a sale on the day.
Yep. I thought $100 for 8TB was expensive when I checked earlier this year and now the cheapest one is $140. Fuck 😂
$100 is actually a great deal. I had a bunch of shucked 8TB WD EasyStore in my media server previously, bought between 2018-2023, and I think the best deal I ever saw during that time was around that price. Around $12/TB was always my benchmark for a great deal on some drives.
I was going to say hurricane, but you’re right, it was flooding in 2011.
Yeah, hurricanes aren’t really a thing in that region; they get monsoons instead.
Why HDDs?
I thought LLMs ran on a fuckload of VRAM and that’s pretty much it. So the GPU market was the main one affected?
Stolen data
Yet people will scalp, buy more products, and defend it by saying “but I need this”. There used to be a time when consumers drove the industry; not anymore. And this leads to companies doing whatever they want.
People just accept it for no reason.
My theory is that we’re hitting that limit again: most people don’t need more than a terabyte or two, tops, so companies adapt.
Kind of shite being on the “need a bit more” side, because I feel they’ll do everything to bleed us dry. My 4TB drive from 2022 was less than 100€; today, 3+ years later, it’s 139€.