
Stolen from BSKY
That is gold
I’d much rather have a more powerful generic CPU than a less powerful generic CPU with an added NPU.
There are very few people who would benefit from an added NPU. OK, I hear you say: what about local AI?
Ok, what about it?
Would you trust a commercial local AI tool to not be sharing data?
Would your grandmother be able to install an open source AI tool?
What about having enough RAM for the AI tool to run?
Look at the average computer user. If you are on Lemmy, chances are very high that you are far more advanced than they are.
I am talking about those users who don’t run an ad blocker, don’t notice the YT ad-skip button, and who in the past would have installed a minimum of five toolbars in IE without noticing the reduced view of the actual page.
These people are closer to the average users than any of us.
Why do they need local AI?
Just offer NPUs as PCIe extension cards. That’s how computers used to be and should be: modular and versatile.
Exactly!
I could even see the cards having RAM slots, so you can add dedicated RAM to the NPU and remove the need to share RAM with the system.
My understanding from a very brief skim of what Microsoft was doing with Copilot is that it takes screenshots constantly, runs image recognition on them, makes them searchable as text, and lets you go back and view those screenshots in a timeline. Basically, adding more search without requiring application-level support.
They may also have other things that they want to do, but that was at least one.
EDIT: They specifically called that feature “Recall”, and it was apparently the “flagship” feature of Copilot.
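To make the mechanism concrete, here is a rough sketch of what such a pipeline boils down to: periodic screenshots, OCR to text, and a searchable index with timestamps. The library choices below (Pillow, pytesseract, SQLite FTS5) are my own guesses for illustration, not what Microsoft actually ships.

```python
# Hypothetical Recall-style pipeline: screenshot -> OCR -> full-text index.
# Requires Pillow, pytesseract, and a local Tesseract install; these are
# illustrative choices, not Microsoft's actual implementation.
import sqlite3
import time

from PIL import ImageGrab   # screen capture (Windows/macOS)
import pytesseract          # Python wrapper around the Tesseract OCR engine

db = sqlite3.connect("recall_demo.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS shots USING fts5(taken_at, text)")

def capture_once() -> None:
    """Grab the screen, OCR it, and store the recognised text with a timestamp."""
    image = ImageGrab.grab()
    text = pytesseract.image_to_string(image)
    db.execute("INSERT INTO shots VALUES (?, ?)", (time.ctime(), text))
    db.commit()

def search(term: str) -> list[tuple[str]]:
    """Return timestamps of screenshots whose text mentions the search term."""
    return db.execute(
        "SELECT taken_at FROM shots WHERE shots MATCH ?", (term,)
    ).fetchall()

if __name__ == "__main__":
    capture_once()
    print(search("giraffe"))  # e.g. find when a giraffe document was on screen
```

The core loop is simple; the hard parts are doing it continuously without eating battery, and keeping that index from becoming a privacy liability.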
an added NPU
cmiiw but I don’t think NPUs are meant to be used on general-purpose personal computers. A GPU makes more sense.
NPUs are meant for specialised equipment e.g. object detection in a camera (not the personal-use kind)
Not the position Dell is taking, but I’ve been skeptical that building AI hardware directly into specifically laptops is a great idea unless people have a very concrete goal, like text-to-speech, and existing models to run on it, probably specialized ones. This is not to diminish AI compute elsewhere.
Several reasons.
-
Models for many useful things have been getting larger, and you have a bounded amount of memory in those laptops, which, at the moment, generally can’t be upgraded (though maybe CAMM2 will improve the situation and move laptops back away from soldered memory). Historically, most users did not upgrade memory in their laptop, even if they could. Just throwing the compute hardware in there in the expectation that models will come is a bet that the models people might want to use won’t get a whole lot larger. This is especially true for the next year or two, since we expect high memory prices, and people will probably be priced out of sticking very large amounts of memory in laptops.
-
Heat and power. The laptop form factor exists to be portable. They are not great at dissipating heat, and unless they’re plugged into wall power, they have sharp constraints on how much power they can usefully use.
-
The parallel compute field is rapidly evolving. People are probably not going to throw out and replace their laptops on a regular basis to keep up with AI stuff (much as laptop vendors might be enthusiastic about this).
I think that a more-likely outcome, if people want local, generalized AI stuff on laptops, is that someone sells an eGPU-like box that plugs into power and into a USB port or via some wireless protocol to the laptop, and the laptop uses it as an AI accelerator. That box can be replaced or upgraded independently of the laptop itself.
When I do generative AI stuff on my laptop, for the applications I use, the bandwidth that I need to the compute box is very low, and latency requirements are very relaxed. I presently remotely use a Framework Desktop as a compute box, and can happily generate images or text or whatever over the cell network without problems. If I really wanted disconnected operation, I’d haul the box along with me.
EDIT: I’d also add that all of this is also true for smartphones, which have the same constraints, and harder limitations on heat, power, and space. You can hook one up to an AI accelerator box via wired or wireless link if you want local compute, but it’s going to be much more difficult to deal with the limitations inherent to the phone form factor and do a lot of compute on the phone itself.
EDIT2: If you use a high-bandwidth link to such a local, external box, bonus: you also potentially get substantially-increased and upgradeable graphical capabilities on the laptop or smartphone if you can use such a box as an eGPU, something where having low-latency compute available is actually quite useful.
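To make the external-box idea concrete, this is roughly what it looks like from the laptop side, assuming the box exposes an OpenAI-compatible endpoint (llama.cpp’s llama-server and Ollama both can); the hostname, port, and model name below are placeholders for whatever you actually run.

```python
# Minimal sketch of a laptop talking to a remote compute box over the network.
# Assumes the box runs an OpenAI-compatible server; all names are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://compute-box.local:8080/v1",  # the box on the LAN or over a VPN
    api_key="not-needed-for-a-local-server",
)

response = client.chat.completions.create(
    model="local-model",  # whatever model the box has loaded
    messages=[{"role": "user", "content": "Summarise this paragraph: ..."}],
)
print(response.choices[0].message.content)
```

The laptop only ships a few kilobytes of text back and forth, which is why even a cell connection is plenty for this kind of use.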
I’m not that concerned with the hardware limitations. Nobody is going to run a full-blown LLM on their laptop; running one on a desktop would already require building a PC with AI in mind. What you’re going to see being used locally are smaller models (something like 7B using INT8 or INT4). Factor in the efficiency of an NPU and you could get by with 16GB of memory (especially if the models are used in INT4) with little extra power draw and heat. The only hardware concern would be the pace of technological advancement in NPUs, but just don’t be an early adopter and you’ll probably be fine.
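Back-of-envelope, weight-only math to put numbers on that (a rough sketch; activations, KV cache and the rest of the system all add on top of this):

```python
# Approximate memory needed just for the weights of a dense 7B-parameter model
# at different quantisation levels. This ignores activations and KV cache.
params = 7e9  # a "7B" model

for name, bytes_per_param in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    gib = params * bytes_per_param / 2**30
    print(f"{name}: ~{gib:.1f} GiB of weights")

# FP16: ~13.0 GiB, INT8: ~6.5 GiB, INT4: ~3.3 GiB, which is why a 7B model
# in INT4 fits comfortably alongside everything else on a 16GB machine.
```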
But this is where Dell’s point comes in. Why should the consumer care? What benefits do consumers get by running a model locally? Outside of privacy and security reasons, you’re simply going to get a better result by using one of the online AI services, because you’d be using a proper model instead of the cheap one that runs on limited hardware. And even for the privacy- and security-minded, you can just build your own AI server (maybe not today, but when hardware prices get back to normal) that you run from home and then expose to your laptop or smartphone. For consumers to desire running a local model (actually locally, and not in a selfhosting kind of way) there would have to be some problem that the local model solves which the over-the-internet solution can’t. So far such a problem doesn’t exist, and there doesn’t seem to be a suitable one on the horizon either.
Dell is keeping their foot in the door by still putting NPUs into their laptops, so if by some miracle a magical problem is found that local AI solves, they’re ready. But they realize that NPUs are not something they can actually use as a selling point, because as it stands they solve no problems: there’s no benefit to running small models locally.
More to the point, the casual consumer isn’t going to dig into the nitty gritty of running models locally and not a single major player is eager to help them do it (they all want to lock the users into their datacenters and subscription opportunities).
On Dell keeping NPUs in their laptops: they don’t really have much of a choice if they want modern processors, since Intel and AMD are still all-in on it.
Setting up a local model was specifically about people who take privacy and security seriously because that often requires sacrificing convenience, which in this case would be having to build a suitable server and learning the necessary know-how of setting up your own local model. Casual consumers don’t really think about privacy so they’re going to go with the most convenient option, which is whatever service the major players will provide.
As for Dell keeping the NPUs, I forgot they’re going to be bundled with the processors.
My general point is that discussing the intricacies of potential local AI model usage is way over the head of the people that would even in theory care about the facile “AI PC” marketing message. Since no one is making it trivial for the casual user to actually do anything with those NPUs, then it’s all a moot point for this sort of marketing. Even if there were an enthusiast market that would use those embedded NPUs without a distinct more capable infrastructure, they wouldn’t be swayed/satisfied with just ‘AI PC’ or ‘Copilot+’, they’d want to know specs rather than a boolean yes/no for ‘AI’.
Phones have already come with AI processors for a long time, specifically for speech recognition and camera features. It’s not advertised because it’s from before the bubble started.
deleted by creator
Where in their comment does it say “exactly zero users”? Oh right, it doesn’t
-
Doesn’t confuse me, just pisses me off trying to do things I don’t need or want done. Creates problems to find solutions to
Can the NPU at least stand in as a GPU in case you need it?
Nope. Don’t need it
I actually do care about AI PCs. I care in the sense that it is something I want to actively avoid.
For me at least, AI reminds me too much of that thrice cursed MS Word paperclip. I did not want it then and I do not want it now.
Also, adding ‘personality’ to pieces of software is cringy at best and downright creepy at worst.
Forget about the personality for a minute. They have a different thing in common. Uselessness. I tried AI for a bunch of general use cases and it almost always fails to satisfy. Either it just can’t do the task in the first place or it makes mistakes that then cost too much time to fix.
There are exceptions and specialized models will have their use but it’s not the Swiss army knife tool AI companies are promising.
AI hardware is a sales pitch without a clear product. Consumers have no clue why they would want to buy something with AI on it.
For most consumers, AI is a webpage that kids use to cheat on homework or adults attempt to cheat at work with. It makes ugly fake pictures with all sorts of weird errors. It’s also the annoying-as-fuck answering services that you have to yell at 4 or 5 times to get to a real person.
Why would an AI PC be desirable?
Why not just leave it alone inside a browser tab? If I want AI, and I use it quite a lot, I will go into their website. Don’t force it system wide, just sucks
They want their greasy tendrils all up in your PC’s guts. Every bit of info flowing in your system can be monetized. All they care about is money and dominance and their “AI” in everyone’s devices is their wet dream.
Cancer is preferable to tech bros, as cancer doesn’t know it’s killing the host. Tech bros know full well their actions are killing the planet and its inhabitants. Their actions are willfully vile and toxic; completely at odds with the needs of humanity.
Don’t expect them to ever do the right thing for anyone but themselves.
This is pretty much an “all tech companies have to jump on the AI hype train” pressure on publicly traded companies and those who need lots of investor money, with little if any customer pressure.
All investors want their money to be in the same place as those who invested in Google before it made it big, and the AI hype promises exactly that to the “winners” of the AI race.
Customer needs and demands are well below secondary to investor pressure, especially for companies which have dominant market positions (so general customers have no decent alternatives) and startups whose entire business model is AI.
Holy crap that Recall app that “works by taking screenshots” sounds like such a waste of resources. How often would you even need that?
Virtually everything described in this article already exists in some way…
It’s such a stupid approach to the stated problem that I just assumed it was actually meant for something else and the stated problem was to justify it. And made the decision to never use win 11 on a personal machine based on this “feature”.
So, it’s not really a problem I’ve run into, but I’ve met a lot of people who have difficulty on Windows understanding where they’ve saved something, but do remember that they’ve worked on or looked at it at some point in the past.
My own suspicion is that part of this problem stems from the fact that back in the day, DOS had a not-incredibly-aimed-at-non-technical-users filesystem layout, and Windows tried to avoid this by hiding that and stacking an increasing number of “virtual” interfaces on top of things that didn’t just show one the filesystem, whether it be the Start menu or Windows Explorer and file dialogs having a variety of things other than just the filesystem to navigate around. The result is that Microsoft has been banging away for much of the lifetime of Windows trying to add more ways to access files, most of which make it harder to fully understand what is actually going on through the extra layers. But regardless of why, some users do have trouble with it.
So if you can just provide a search that can summon up that document where they were working on that had a picture of giraffes by typing “giraffe” into some search field, maybe that’ll do it.
As time goes by I’m finding a place for AI.
-
I use it for information searches, but only in cases where I know the information exists and there is an actual answer. Like history questions or asking for nuanced definitions of words and concepts.
-
I use it to manipulate documents. I have a personal pet peeve about the format of most recipes, for example. Recipes always list the ingredient amounts in a table at the top, but then down in the steps they just say “add the salt” or “mix in the flour.” Then I have to look up at the chart and find the amount of salt/flour, and then I lose my place in the steps and have to find it again. I just have AI throw out the chart and integrate the amounts into the steps: “mix in 2 cups of flour”. I can have it shorten the instructions too and break them into easier-to-read bullet points. I also ask it to make ingredient substitutions and other modifications. The other day I gave it a bread recipe and asked it to introduce a cold-proofing step and reformat everything the way I like. It did great. (There’s a rough sketch of how I phrase this kind of request at the end of this comment.)
-
Learning interactively. When I need to absorb a new skill or topic I sometimes do it conversationally with AI. Yes, I can find articles and videos, but then I am stuck with the information they lay out and the pace and order in which they present it. With AI you can stop and ask clarifying questions, or have it skip over the parts you already know. I find this is way faster than laborious googling. However, I only trust it for very straightforward topics, like “explain the different kinds of welding and what they are for.” I wouldn’t trust it for more nuanced topics where perspective and opinion come into it. And I’ve learned that it isn’t great at topics where there isn’t enough information out there, like very niche questions about the meta of a certain video game that’s only been out a month.
-
Speech to text and summarization. AI records all my Zoom meetings for work and gives summaries of what was discussed and next steps. This is always better than nothing. I’m also impressed with how it seems to understand how to discard idle chit chat and only record actual work content. At most it says “the meeting began with coworkers exchanging details from their respective weekends.”
This kind of hard-and-fast summarization and manipulation of factual text is much easier with AI. Doing my job for me? No. Hovering over my entire computer? No. Writing my emails for me? Fuck off.
The takeaway is that specific tools I can go to when I need them, for point-specific needs, are all I want. I don’t need or want a hovering AI around all the time, and I don’t want whatever tripe Dell can come up with when I can get the best latest models direct from the leading players.
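For anyone curious about the recipe trick in the list above, here’s a minimal sketch of the kind of request I mean. The client setup and model name are placeholders; any chat-capable model, hosted or local, will do.

```python
# Sketch of the recipe-reformatting request described above. The endpoint and
# model name are placeholders for whatever service or local server you use.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set; point base_url at a local server if you prefer

PROMPT = (
    "Rewrite this recipe so each step states the exact ingredient amount inline "
    "(e.g. 'mix in 2 cups of flour'), drop the separate ingredient table, and "
    "format the steps as short bullet points:\n\n{recipe}"
)

def reformat_recipe(recipe_text: str) -> str:
    """Ask the model to fold the ingredient amounts into the steps."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; swap in whatever model you use
        messages=[{"role": "user", "content": PROMPT.format(recipe=recipe_text)}],
    )
    return response.choices[0].message.content
```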
Assuming you keep a critical eye on the results, surely AI can be used for some meaningful things like the ways you found; thanks for sharing them. But I could bet that most people will be stuck at the BS-generator level, with its poisonous effects on them and society at large.
I agree. I share my use cases mostly to put the critical thinking behind them on display. I’m sure the crowd here is very savvy. But in the general public I agree that many if not most people would be completely seduced by the obsequious & confident tone of the robot. It can do so many things that it becomes tempting to rely on it. You wish it worked better than it did, and if you let yourself get lazy, you can easily slip into trusting it too much.
-
I want to run LLMs locally, or things like TTS or STT, so it’s nice, but there’s no real support rn.
Most people won’t care nor use it
LLMs are best used when it’s a user choice, not a platform obligation
I guess an NPU is better of being a PCIe peripheral then?
And it could then have its own specialised RAM too. Sorry, I’m not a hardware expert at all.
When you’re talking about the PCIe peripheral, you’re talking about a separate dedicated graphics card or something else?
I guess the main point of NPUs are that they are tiny and built in
When you’re talking about the PCIe peripheral, you’re talking about a separate dedicated graphics card or something else?
Yes, similar to what a PCIe Graphics Card does.
A PCIe slot is the slot in a desktop motherboard that lets you fit various things like networking (ethernet, Wi-Fi and even RTC specialised stuff) cards, sound cards, graphics cards, SATA/SAS adapters, USB adapters and all other kinds of stuff.
I guess the main point of NPUs are that they are tiny and built in
GPUs are also available built-in. Some of them are even tiny.
Go 11-12 years back in time and you’ll see video processing units embedded into the Motherboard, instead of in the CPU package.
Eventually some people will want more powerful NPUs with better suited RAM for neural workloads (GPUs have their own type of RAM too), not care about the NPU in the CPU package and will feel like they are uselessly paying for it. Others will not require an NPU and will feel like they are uselessly paying for it.
So, much better to have NPUs be made separately in different tiers, similar to what is done with GPUs rn.
And even external (PCIe) Graphics Cards can be thin and light instead of being a fat package. It’s usually just the (i) extra I/O ports and (ii) the cooling fins+fans that make them fat.
Thanks for your answer
So, much better to have NPUs be made separately in different tiers, similar to what is done with GPUs rn.
Overall yeah, but built-in graphics are remarkably efficient, and they have the added benefit of being there even if you didn’t plan on that use initially. I’m glad to be able to play video games on my laptop that was meant to be used for work only.
Similarly, I had no interest in getting an NPU for this laptop, but I found some use for it (well, once it finally supports what I want to do).
Manufacturers will never include a niche option, or will overprice it. Built-in lets you get that directly.
We should have been given a choice whether we want to use it or not. Them trying to force it on us is why they are getting so much pushback. Let those that want to use it use it, and give those that don’t want it the option to turn it off. It’s not rocket science, but they are constantly going:
Tech CEOs: this is our AI, you have to use it! Consumers: but I don’t want to! Tech CEOs: FUCKING USE IT!!!
and then they are whining "WAAAHHHHH PEOPLE ARE MEANIES THAT DON’T LIKE OUR AI THAT DOES NOTHING TO IMPROVE THEIR LIVES AND WILL MAKE US MORE MONEY BY LETTING US PUT TARGETED ADS INTO THEIR EYEBALLS WWWWAAAAAAAAAAHHHHHHHH!!!"
But think about the shareholders, how are they going to pay off their trillion dollar debts building data centers? You need to use it, replace all aspects of your life with AI, then they can squeeze you!
There was a scene in a kinda shitty movie, The Scorpion King starring The Rock, where he was buried with just his head sticking out and huge red ants were coming to eat his face. I mean, I would care, but only that they would scream too much and scare the ants; the poor things are hungry. But they are so disgusting that the ants would probably walk past them thinking it’s camel shit…
Confuses them?
I think it’s quite possible to become confused if you’re used to software that, bugs aside, behaves pretty much completely predictably, then get a feature marketed as “intelligence” which suddenly gives you unpredictable and sometimes incorrect results. I’d definitely be confused if the reliable tools I do my work with suddenly developed a mind of their own.
Well, that certainly would confuse users, yes.
Have you recently vibe-edited a Microsoft Copilot Excel sheet on your AI PC (but actually in the cloud)?
🤮
It’s just a softer thing to say than ‘a lot of people hate AI and it’s alienating potential customers’. They can’t come out and say that out loud, they don’t want to piss off Microsoft too much and they aren’t going to try to do NPU-free systems (it’s not really possible). They aren’t going to do anything to ‘fight back’ against the AI that people hate (they can’t), so their best explanation as to why they pull back from a toxic brand strategy is that ‘people just don’t care’ rather than ‘people hate this thing that we are going to keep feeding’.
But if they need to rationalize the perspective: an “AI” PC does nothing to change the common user’s experience with the AI things they know. It does not change ChatGPT or Opus or anything similar; that stuff is entirely online. So for the common user, all ‘AI PC’ means is a few Windows gimmicks that people either don’t care about or actively complained about (Recall providing yet another way for sensitive data to get compromised).
In terms of “AI” as a brand value, the ones most bullish about AI are executives who like the idea of firing a bunch of people, and incidentally they actually want to buy fewer PCs as a result. So even where you can find AI-enthusiastic people, they still don’t want AI PCs.
For most people, their AI experience has been:
- News stories talking about companies laying off thousands or planning to lay off thousands for AI, AI is the enemy
- News stories talking about some of those companies having to rehire those people because AI fell over, AI is crap
- Their feeds being flooded with AI slop and deepfakes, AI is annoying
- Their google searches now having a result up top that, at best, is about the same as clicking the top non-sponsored link, except that it frequently totally botches information, AI is kind of pointless
For those that have actually positive AI experience, they already know it has nothing to do with whether the PC is ‘AI’ or not. So it’s just a brand liability, not a value.
They said they’re still adding all of it. They are adding AI. Just not talking about it. Which is probably correct 😂
The ‘quiet part out loud’ isn’t being used how it should be
Microsoft appears to genuinely believe that AI features are what customers want even if they don’t directly ask for them. Nobody asked for Google’s AI feature, but it’s hugely popular; people freakin love it (except here on Lemmy, of course, where it’s the devil). Google sees that users are staying on their site longer and not clicking through to sites as much, which is a win for them and a win for users.
A real quiet part out loud would be ‘Microsoft knows its users don’t like its AI features but doesn’t care, because Windows isn’t a money maker for them compared to Microsoft Copilot 365’.
The real ‘quiet part’ would be that they are avoiding it because a large number of people hate ‘AI’. To say they are ‘confused’ is still keeping the quiet part quiet…
Is that a win for Google, though? They make most of their money from AdSense, because websites want to display ads. If people aren’t clicking through to websites from their search results, that seems like fewer opportunities to display ads, reducing the viability of AdSense.
seems like so far things are going well, businesses spend more on ads on google 🙃
Google, for now, is still laughing its way to the bank. The search giant is putting advertisers’ ads within or directly above and below AI Overviews themselves. As Google CEO Sundar Pichai said in an interview last year, “If you put content and links within AI Overviews, they get higher clickthrough rates than if you put it outside of AI Overviews.” Organic links are pushed down.
Even in 2024, market research company SparkToro’s study of searchers found “almost 30 percent of all clicks go to platforms Google owns.” For every 1,000 Google searches in the United States, 360 clicks go to a non-Google-owned, non-Google-ad-paying property. With the rise of AI Overview, Google’s share can only have grown, while everyone else’s has shrunk.
https://www.theregister.com/2025/07/29/opinion_column_google_ai_ads/
I believe Maps and YouTube were money losers for Google for a very long time, but the goal was to keep people within the Google ecosystem, where they can extract money from users in other ways.
I don’t see AI as confusing, tbh. If anything it is WAY too easy to use, which means it is WAY too easy to come up with stupid shit no one wants. If only people weren’t morons.
Well, first, Dell’s use of ‘confused’ is mainly a way to walk “away” from AI as a marketing strategy without having to walk it “back” (they can’t walk it back: Microsoft will keep Copiloting it up, the processor companies will keep bundling NPUs, and the consumer exposure to AI will continue to have nothing to do with any ‘AI PC’ or not). So ‘confused’ is a way to rationalize the absence of ‘AI PC’ in their marketing strategy without having to actually change what they are doing.
But to the extent ‘confusing’ may apply, it’s less about ‘AI’ and more about ‘AI PC’. How would this ‘AI PC’ affect your usage of AI? For most people the answer is ‘not at all’, since it mostly happens over the internet. So for the layperson, an ‘AI PC’ just enables a few niche Windows features no one cares about. Everything driving the ‘AI’ craze is happening well away from the end user’s devices.