The 7950X3D or 9800X3D are both faster (besides the 7800X3D you mentioned).
GPU-wise this is clearly the best AMD has to offer, but an RTX 4090 is faster still. With the typical caveats for NVIDIA on Linux.
I have several components in my network that are at least 6 years old. Is that a problem…?
> full mirrors of YT
Yeah…not going to happen.
Sounds cool, I just fail to understand how this takes Cinnamon “out to the real world”.
What’s Ullr?
Sounds about right. There are some valid and good use cases for “AI”, but the majority is just buzzword marketing.
The main thing (by far) degrading a battery is charge cycles. After 7 years with, say, 1,500 cycles, most batteries will have degraded far beyond “80%” (which is always just an estimate from the electronics anyway). Yes, you can help a bit by limiting charging rate, heat, and the min/max %, but it’s not going to be a night-and-day difference. After 7 years of daily use, you’re going to want to swap the battery, if not for capacity loss then for safety reasons.
I think I have a simple function in my `.zshrc` file that updates flatpaks and runs `dnf` or `zypper` depending on what the system uses. This file is synced between machines as part of my dotfiles sync, so I don’t have to install anything separate. The interface of most package managers is stable, so I didn’t have to touch the function.
This way I don’t have to deal with a package that’s on a different version in different software repositories (depending on distribution) or manually install and update it.
But that’s just me, I tend to keep it as simple as possible for maximum portability. I also avoid having too many abstraction layers.
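A minimal sketch of what such a function might look like (the name `sysup` and the exact flags are my guesses, not the actual function):

```sh
# Hypothetical ~/.zshrc helper: update flatpaks, then the system packages
sysup() {
  flatpak update -y

  # Pick whichever package manager this machine actually has
  if command -v dnf >/dev/null 2>&1; then
    sudo dnf upgrade --refresh
  elif command -v zypper >/dev/null 2>&1; then
    sudo zypper refresh && sudo zypper update
  fi
}
```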
That’s mostly down to Teams though (being the bloated web app that it is), and not the underlying operating system.
Technically, wired charging degrades the battery less than wireless charging, mainly because of the excess heat the latter generates. In the same way, slower wired charging generates less heat than faster wired charging. Lower and upper charging limits also help (the tighter, the better).
But I personally don’t bother with it. In my experience, battery degradation and longevity mostly come down to the “battery lottery”, comparable to the “silicon lottery” where some CPUs overclock/undervolt better than others. I’ve had phone batteries that were mostly charged with a slow wired charger degrade earlier and further than others I charged almost exclusively wirelessly. No two batteries are exact copies of one another. Heck, I once had a 2-month-old battery die on me after just ~20 cycles. It happens.
Sure, on average you might get a bit more life out of your batteries, but in my opinion it’s not worth it.
The way I see it with charging limits is that sure, your battery might degrade 5% more over the span of 2 years when always charging it to 100% (all numbers here are just wild estimates and, again, depend on your individual battery). But when you limit charging to 80% for example, you get 20% less capacity from the get go. Unless of course you know exactly on what days you need 100% charge and plan your charging ahead of time that way.
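To put rough numbers on it (same wild estimates as above, purely illustrative):

```sh
# Usable capacity after ~2 years, as % of the original:
echo $(( 100 * (100 - 10) / 100 ))  # always charge to 100%, ~10% degradation → 90
echo $(( 80 * (100 - 5) / 100 ))    # cap at 80%, ~5% degradation → 76, and only 80 from day one
```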
Something I personally could never be bothered with. I want to use my device without having to think about it. If that means having to swap out the battery one year earlier, then so be it.
When talking about the kernel, Windows actually skipped three major versions, if I remember correctly. Windows 8 was Windows NT 6.2 (and 8.1 was NT 6.3), and Windows 10 jumped straight to NT 10.0, skipping 7, 8 and 9.
Why, when a simple alias will do?
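For example, something like this (hypothetical; adjust the package manager to your distro):

```sh
# Hypothetical alias: one word updates flatpaks and system packages
alias up='flatpak update -y && sudo dnf upgrade --refresh'
```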
I’ve also experienced fewer “hiccups” since switching to Linux with KDE, but I’d like to know on what combination of hardware and Windows you experienced anywhere close to an average 1 s response time to “any input”.
The article links an article from March '24 talking about the introduction of these devices that contains this part:
> The scanner that Adams and police officials introduced during Thursday’s news conference in a lower Manhattan station came from Evolv, a publicly traded company that has been accused of doctoring the results of software testing to make its scanners appear more effective than they are.
So they could never be trusted but were still allowed to proceed.
I expected something more shocking when I read “working with Russia”.
Kagi uses multiple search backends, and of course it needs to forward search terms to these backends. These backends probably can’t trace the searches back to the individual Kagi user though, but Yandex could still analyze search trends for example.
What’s worse is that, unless they’re using Yandex’s API for free, customers indirectly (and likely unknowingly) support a Russian company with their paid Kagi subscription.
Kagi should at the very least release a statement about this claim.
Happy cake day!
Bitwarden keeps working just fine.
I always hear power efficiency cited as something ARM chips are magically better at, but the Ryzen AI 300 and Intel Core Ultra 200V series seem very competitive with Qualcomm’s offering. It’s hard to compare 1:1, as the same chip in different laptops can be configured very differently in terms of TDP and power curves, and the efficiency “sweet spots” aren’t the same across all these chips. Core Ultra 200V is also still awaiting more thorough testing, but it seems to be right up there with the Snapdragon.
I honestly found the Snapdragon X very underwhelming after all the marketing about how much better it was than Apple’s M3 and Intel’s and AMD’s offerings. By the time the Snapdragon was actually available in end-user products, AMD’s and Intel’s competing generations were right around the corner, and we’d also seen a vastly improved M4 chip (although only in an iPad so far, so meh). Add to that the issues you’ll encounter because, while Windows’ x86-to-ARM translation layer has certainly improved, it’s nowhere near as seamless as what Apple did with Rosetta 2.
At 8 months old it should be well within warranty. Just get it fixed.