It’s a common misconception, but ARM isn’t inherently better at battery life than x86. It’s more that Qualcomm’s designs are better than what the companies currently producing x86 hardware put out.
Is this really a surprise to anyone outside of the AI hype machine?
Wordpad has been deprecated
There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren’t just a scam. However, most of these are not consumer facing and the average person won’t really hear about them.
It’s unfortunate that what you said is very true on the consumer side of things…
I don’t think you’re talking about the same Outlook I am; I’m talking about the Outlook Mail and Calendar apps, not the Office ones. And that’s fine and dandy about proprietary software and all, but frankly I haven’t really seen any non-proprietary mail apps that look aesthetically pleasing. But that’s beside the point; it’s a matter of personal preference when it comes to visuals after all.
You don’t have to come here and assume you know everything about me simply from my choice of OS, and invalidate my experiences with personal attacks, no less. If your rant here is trying to convince me or anyone else reading that we should abandon Windows for the reasons you have stated, you are failing terribly, I’m afraid. Not everyone has standards as high as yours, and it’s frankly patronizing for you to think that I or anyone else have not considered these options when they affect our workflow. If anything, people reading this are gonna be dissuaded from Linux, because if this is the kind of tone and experience we’re going to get when we try it, well, it’s a lot less stressful staying away from Linux.
It’s somewhat concerning that you have such a strong obsession with the topic that you would go and, intentionally or not, offend people, and I hope that you are a much more pleasant person to converse with outside of this topic, or even this site.
I’d also like to add, nowhere did I ever mention using laptops. All my experiences are with desktops that I had a hand in building from scratch. So I’m not sure what you’re even getting at with those assumptions.
Have a good day sir.
I’m not sure why that is so hard to believe. I use Ubuntu and Windows at work daily and Windows at home. I know the challenges of both and Windows at worst just annoys me with them forcing the new Outlook app on me. Everything else just works. Plays games amazingly, Visual Studio is uncontested, syncs nicely with my Android phone and I have no driver issues whatsoever. Don’t have to go diving into the command line to change settings either.
The only time Linux works perfectly for me is on my Steam Deck and that’s entirely because Valve has handled all the driver issues for us on that hardware.
I think you’re missing the point here. It’s more that people couldn’t even be bothered to search up how to do something (that takes seconds) that they want to do first, and instead just rely on someone they think is an expert without putting in any effort at all.
Your examples don’t really make sense either, as a lot of these are paid professions for larger tasks that most people simply don’t want to do. There’s a huge difference between searching online for “how to install a Firefox extension” vs “how to do a weave”, etc.
End of the day, the average person doesn’t care, and if they truly did they’d have had the initiative to just research it and do it on their own.
Bringing it back to the whole thing about Linux, can you imagine how frustrating it would be to have to help debug a user’s Linux installation when they already need help installing a browser add-on? I work with tech and Linux on a daily basis and I already find it frustrating doing it for myself (fuck Nvidia drivers). No way am I gonna recommend it to someone else.
Yea same here. In all my years using a computer, I’ve never seen such a thing aside from “preinstalled” apps. Is this a regional thing?
Now if they would only release a Steam Controller 2…
Android phone makers are shipping on device LLMs?
Do people actually want these?
Ads will probably stop me from watching YouTube completely. The huge surge of ads at some point was what stopped me from using Instagram.
I used to have a Pebble too but I’ve long since given up on any hope of the market building something similar that looks as cool as the Pebble was. What exactly do you think is awful about Samsung’s Wear OS? I tried both the Pixel Watch and the Galaxy Watch and I greatly prefer Samsung’s.
Well, for the current-generation consoles, they’re both x86-64 CPUs with a single pool of GDDR6 memory shared across the CPU and GPU, so I’m not sure you have such a penalty anymore.
It’s not that unified memory can’t be created, but it’s not the architecture of a PC, where peripheral cards communicate over the PCI bus, with significant penalties for touching system RAM.
Are there any tests showing the difference in memory access of x86-64 CPUs with iGPUs compared to ARM chips?
Do you have any sources for this? I can’t seem to find anything specific describing the behaviour. It’s quite surprising to me, since the Xbox and PS5 use unified memory on x86-64 and it would be strange if it were extremely slow for such a use case.
Thanks for the links, they’re really informative. That said, it doesn’t seem entirely certain that the extra work done by the x86 arch incurs a comparatively huge difference in energy consumption. Granted, that isn’t really the point of the article. I would love to hear from someone who’s more well versed in CPU design on the impact of its memory model. The paper is more interesting with regards to performance, but I don’t find it very conclusive since it’s comparing ARM vs TSO on an ARM processor. It does link this paper, which seems more relevant to our discussion, but it’s a shame that it’s paywalled.
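For anyone following along, here’s a rough sketch of the kind of ordering difference those memory model comparisons are about, written with plain C++ atomics. It’s purely illustrative and not taken from either paper:

```cpp
#include <atomic>
#include <cassert>
#include <thread>

// Message-passing litmus test (illustrative only).
// x86-64's TSO model keeps stores in program order and loads in program
// order at the hardware level, so once 'flag' is observed as 1, 'data' is
// already visible. ARM's weaker model allows either pair to be reordered,
// so the assertion below can fire there unless release/acquire is used.
// Note: the C++ compiler may also reorder relaxed accesses, so this sketches
// the hardware difference, not a portable program-level guarantee.
std::atomic<int> data{0};
std::atomic<int> flag{0};

void producer() {
    data.store(42, std::memory_order_relaxed);
    flag.store(1, std::memory_order_relaxed);   // release ordering would make this portable
}

void consumer() {
    while (flag.load(std::memory_order_relaxed) == 0) {}
    // May observe 0 on a weakly ordered CPU; acquire ordering would forbid that.
    assert(data.load(std::memory_order_relaxed) == 42);
}

int main() {
    std::thread t1(producer), t2(consumer);
    t1.join();
    t2.join();
}
```

The rough intuition is that x86 promises this ordering for free in hardware, while ARM only promises it when you ask for barriers, which is where the performance and energy trade-offs in those papers come from.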
Do x86 CPUs with iGPUs not already use unified memory? I’m not exactly sure what you mean, but are you referring to the overhead of having to copy data from CPU memory to GPU memory on discrete graphics cards when performing GPU calculations?
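To make that concrete, here’s roughly what the two patterns look like with CUDA’s runtime API. A minimal sketch with made-up sizes and no error checking, just to show where the explicit copies sit:

```cpp
#include <cuda_runtime.h>
#include <vector>

int main() {
    const size_t n = 1 << 20;                 // arbitrary size for illustration
    std::vector<float> host(n, 1.0f);

    // Discrete-GPU pattern: a separate device allocation plus explicit copies
    // across the PCIe bus in both directions.
    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    // ... launch kernels that work on 'dev' here ...
    cudaMemcpy(host.data(), dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    // Managed ("unified") memory: one allocation visible to both CPU and GPU,
    // so the explicit copies disappear; on discrete cards the runtime still
    // migrates pages behind the scenes, while on shared-memory hardware it
    // really is the same physical pool.
    float* managed = nullptr;
    cudaMallocManaged(&managed, n * sizeof(float));
    // ... CPU and GPU can both dereference 'managed' ...
    cudaFree(managed);
    return 0;
}
```

On a console or an iGPU with one physical pool of memory, that second pattern is essentially free, which is why the “penalty to touch RAM” argument mostly applies to discrete cards hanging off PCIe.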
Their primary money makers are what’s stopping them, I reckon. Apple’s move to ARM is because they already had a ton of experience with building their own in-house processors for their mobile devices, and ARM licenses stock chip designs, making it easier for other companies to come up with their own custom chips, whereas there really isn’t any equivalent for x86-64. There were some disagreements between Intel and AMD over patents on the x86 instruction set too.
Do you mind elaborating on what it is about the difference in their memory models that matters?
There’s nothing stopping x86-64 processors from being power efficient. This article is pretty technical but does a really good job of explaining why that’s the case: https://chipsandcheese.com/2024/03/27/why-x86-doesnt-need-to-die/
It’s just that traditionally Intel and AMD earn most of their money from the server and enterprise sectors where high performance is more important than super low power usage. And even with that, AMD’s Z1 Extreme also gets within striking distance of the M3 at a similar power draw. It also helps that Apple is generally one node ahead.
This article also gives a really good in-depth explanation of the topic. It does get a lot more technical, but if you’re interested, it’s well worth the read.