Wednesday, December 10, 2025
Laptops are compromising for AI, and we have nothing to show for it

The best laptops and laptop brands are all-in on AI. Even compared to a year ago, the list of the best laptops has been infested with NPUs and a new era of processors, all of which promise to integrate AI into every aspect of our day-to-day lives. A couple of years into this AI revolution, though, there isn’t much to show for it.

We now have Qualcomm’s long-awaited Snapdragon X Elite chips in Copilot+ laptops, and AMD has thrown its hat in the ring with Ryzen AI 300 chips. Eventually, we’ll also have Intel Lunar Lake CPUs. The more we see these processors, however, the clearer it becomes that they’re built for an AI future rather than for the needs of today — and you often can’t have it both ways.

One important aspect of chip design that doesn’t get talked about enough is space. If you browse hardware forums and enthusiast-focused websites, you already know how important space is. But for everyone else, it’s not something you normally think about. Companies like AMD and Intel could make massive chips with a ton of computing horsepower — and a ton of power and thermal demands, but that’s a different conversation — but they don’t. A lot of the art of chip design comes down to how much performance you can cram into a certain space.

It’s important to understand that because adding hardware onto a chip isn’t free — it means taking space away from something else. In an annotated die shot of a Ryzen AI 300 CPU, for instance, you can see in the upper right how much space the XDNA NPU takes up. It’s the smallest of the three processors on the chip — speculation pins it at around 14 mm² — but it still occupies a lot of area. AMD could use that space for more cores or, more likely, additional L3 Infinity Cache for the GPU.

This isn’t to single out AMD or to say that Ryzen AI 300 CPUs perform poorly. They don’t, as you can read in our Asus Zenbook S 16 review.
AMD, Intel, and Qualcomm are constantly making trade-offs in design to fit everything they need onto the chip, and it’s not as simple as throwing more cache on and calling it a day. Pulling one lever moves countless other values, and all of them need to be brought into balance. But it’s a good demonstration that adding an NPU to a chip isn’t something designers can do without making compromises elsewhere.

At the moment, these NPUs are largely useless, too. Even apps that are accelerated by AI would rather use the horsepower of the integrated GPU, and if you have a discrete GPU, it’s leagues faster than the NPU. There are some use cases for an NPU, but for the vast majority of people, it really just provides (slightly) better background blur.

Ryzen AI 300 is the only example we have now, but Intel’s upcoming Lunar Lake chips will be caught in a similar situation. Both AMD and Intel are gunning for Microsoft’s stamp of approval in Copilot+ PCs, and are therefore including NPUs that reach a certain level of performance in order to satisfy Microsoft’s seemingly arbitrary requirements. AMD and Intel were already including AI co-processors on their chips prior to Copilot+, but those co-processors are basically useless now that we have new, much higher requirements.

It’s impossible to say whether AMD and Intel would have designed their processors differently had it not been for the Copilot+ push. At the moment, however, we have a piece of silicon that doesn’t serve much of a purpose on Ryzen AI 300, and eventually on Lunar Lake. It calls back to Intel’s push with Meteor Lake, whose chips have become all but obsolete in the face of Copilot+ requirements.

As AMD and Intel have both promised, their chips will eventually be brought into the Copilot+ fold. At the moment, only Qualcomm’s Snapdragon X Elite chips have Microsoft’s stamp of approval, but AMD, at least, says its chips will be able to access Copilot+ features before the end of the year.
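The device pecking order described above — discrete GPU first, then the integrated GPU, with the NPU near the bottom — amounts to a simple fallback chain. The sketch below is purely illustrative; the device names and the availability check are made up, not any real runtime’s API:

```python
# Hypothetical sketch of how an AI-accelerated app might pick a compute
# device. It mirrors the preference order described above: a discrete GPU
# beats the integrated GPU, which in turn beats the NPU.

PREFERENCE = ["discrete_gpu", "integrated_gpu", "npu", "cpu"]

def pick_device(available: set[str]) -> str:
    """Return the fastest available device, falling down the chain."""
    for device in PREFERENCE:
        if device in available:
            return device
    raise RuntimeError("no compute device available")

# A Copilot+ laptop without a discrete GPU: the iGPU still wins over the NPU.
print(pick_device({"integrated_gpu", "npu", "cpu"}))  # integrated_gpu
```

In this toy model, the NPU is only chosen when both GPUs are absent, which matches why it ends up relegated to background-blur duty on most machines.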
That’s the other problem, though — we don’t really have any Copilot+ features. The star of the show since Microsoft announced Copilot+ has been Recall, and not a single person outside of the press has been able to use it. Microsoft delayed it, restricted it to Windows Insiders, and by the time Copilot+ PCs were ready to launch, pushed it back indefinitely. AMD and Intel might be brought into the Copilot+ fold before the end of the year, but that doesn’t mean much if we don’t have more local AI features.

We’re seeing some consequences of Microsoft’s influence on the PC industry in action. We have a slate of new chips from Qualcomm and AMD, and soon Intel, all of which carry a piece of silicon that isn’t doing a whole lot. It feels rushed, similar to what we saw with Bing Chat, and I wonder if Microsoft is really as committed to this platform as it says. Never mind the fact that the thing driving sales of Copilot+ PCs isn’t AI features but better battery life.

In the next few years, it’s estimated that half a billion AI-capable laptops will be sold, and that by 2027 they’ll make up more than half of all PC shipments. It’s clear why Microsoft and the wider PC industry are pushing into AI so hard. But when it comes to the products we have today, it’s hard to say they’re as essential as Microsoft, Intel, AMD, and Qualcomm would have you believe.

It’s still important to look at the why in this situation, however. We have a classic chicken-and-egg problem with AI PCs, and even with the introduction of Copilot+ and the delay of Recall, that hasn’t changed. Intel, AMD, and Qualcomm are trying to lay the groundwork for AI applications that will exist in the future, when, hopefully, they’re so seamless with how we use PCs that we don’t even think about having an NPU. It’s not a crazy idea — Apple has been doing this exact thing for years, and Apple Intelligence feels like the natural progression of that.
That’s not where we are right now, though, so if you’re investing in an AI PC, you have to prepare for the consequences of being an early adopter. There aren’t many apps that can leverage your NPU unless you really go searching, and even apps with local AI features would prefer to run on your GPU. That’s not to mention some goalpost moving we’ve already seen with Copilot+ and the initial wave of NPUs from AMD and Intel. I have no doubt that we’ll get there — there’s simply too much money being thrown at AI right now for it not to become a mainstay in PCs. I’m still waiting to see when it’s truly as essential as we’ve been led to believe, though.

If you bought into the promise of a new AI-charged Copilot+ PC with the latest-gen Intel or AMD processor and found a few tricks missing, the long wait is over. A handful of those Copilot+ features are finally expanding beyond machines with a Snapdragon X series processor inside. It’s been roughly a year since Microsoft introduced the Copilot+ PC label, a new breed of computing machine that puts AI performance at the forefront. For months, Qualcomm was the sole silicon supplier for such machines.

Microsoft is late to the party, but it is finally bringing a deep research tool of its own to the Microsoft 365 Copilot platform across the web, mobile, and desktop. Unlike competitors such as Google Gemini, Perplexity, and OpenAI’s ChatGPT, all of which use the Deep Research name, Microsoft is going with the Researcher agent branding.
The overarching idea, however, isn’t too different. You tell the Copilot AI to come up with thoroughly researched material on a certain topic or create an action plan, and it will oblige by producing a detailed document that would otherwise take hours of human research and compilation. It’s all about performing complex, multi-step research on your behalf as an autonomous AI agent.
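That “complex, multi-step research” flow can be pictured as a loop of planning, gathering, and synthesis. Everything below — the function names, the fixed three sub-questions, the stub retrieval — is an illustrative sketch of the general agent pattern, not Microsoft’s actual implementation:

```python
# Illustrative sketch of an autonomous research agent's outer loop:
# plan sub-questions, gather material for each, then synthesize a report.
# A real agent would call an LLM and live data sources at each step.

def plan(topic: str) -> list[str]:
    # Stand-in for an LLM decomposing the topic into sub-questions.
    return [f"background on {topic}",
            f"current state of {topic}",
            f"open questions about {topic}"]

def gather(question: str) -> str:
    # Stand-in for web search or internal-document retrieval.
    return f"notes on '{question}'"

def synthesize(topic: str, notes: list[str]) -> str:
    # Stand-in for the final report-writing step.
    return f"Report on {topic}:\n" + "\n".join(f"- {n}" for n in notes)

def research(topic: str) -> str:
    return synthesize(topic, [gather(q) for q in plan(topic)])

print(research("NPUs"))
```

The point of the pattern is that the user supplies only the topic; the agent decides the intermediate steps on its own, which is what distinguishes it from a single chatbot prompt.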
Just to avoid any confusion early on: Microsoft 365 Copilot is essentially the rebranded version of the erstwhile Microsoft 365 (Office) app. It is different from the standalone Copilot app, which is more of a general-purpose AI chatbot application.
Researcher: A reasoning agent in Microsoft 365 Copilot
How does the Researcher agent work?
Underneath the Researcher agent is OpenAI’s Deep Research model. But this is not a simple rip-off; the feature’s implementation in Microsoft 365 Copilot runs far deeper than the competition’s, primarily because it can also look at your own material, or a business’s internal data.
Instead of pulling information solely from the internet, the Researcher agent can also take a look at internal documents such as emails, chats, internal meeting logs, calendars, transcripts, and shared documents. It can also reference data from external sources such as Salesforce, as well as other custom agents that are in use at a company.
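Pulling from several kinds of sources at once amounts to fanning a query out across a set of data connectors and merging the results. A minimal sketch of that idea, with made-up connector names and canned results rather than any real Microsoft 365 or Salesforce API:

```python
# Minimal sketch of fanning a research query out to selected data
# connectors. Connector names and their results are invented for
# illustration; real connectors would hit live services.

CONNECTORS = {
    "web": lambda q: [f"web result for '{q}'"],
    "email": lambda q: [f"email mentioning '{q}'"],
    "salesforce": lambda q: [f"Salesforce record matching '{q}'"],
}

def query(q: str, enabled: set[str]) -> list[str]:
    """Send the query only to the connectors the user has enabled."""
    results = []
    for name, fetch in CONNECTORS.items():
        if name in enabled:
            results.extend(fetch(q))
    return results

print(query("Q3 pipeline", {"email", "salesforce"}))
```

The merge step is where the interesting work happens in a real agent — cross-referencing an email thread against a CRM record is the “connect the dots” part.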
“Researcher’s intelligence to reason and connect the dots leads to magical moments,” claims Microsoft. Users can configure the Researcher agent to reference data from the web, local files, meeting recordings, emails, chats, and sales agents on an individual basis — all of them, or just a select few.

In a Windows Insider blog post, Microsoft announced an AI upgrade to Windows Search to make finding photos, documents, and settings easier. However, the enhanced feature is restricted to Copilot+ PCs with Snapdragon processors; AMD and Intel support is coming soon. The update has all the same fixes and improvements from build 26100.3613; nonetheless, some of the best Copilot+ PCs will be left out simply because they use AMD or Intel processors. Microsoft will release the update gradually. You can also take advantage of both semantic and lexical indexing to search for your photos and documents more efficiently. Because of this improvement, you don’t have to remember the exact file name you’re looking for, which is a huge time-saver.
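Combining semantic and lexical indexing generally means blending exact keyword matches against the file name with similarity against the file’s content or description. The scoring below is a deliberately crude stand-in (word overlap for both signals, where a real system would use embeddings), purely to show why a query no longer needs the exact file name:

```python
# Toy sketch of hybrid search: a lexical score (word overlap with the file
# name) blended with a "semantic" score (overlap with a description,
# standing in for embedding similarity). Not Windows Search's real ranking.

FILES = {
    "IMG_2041.jpg": "photo of a dog on the beach at sunset",
    "q3_report.docx": "quarterly sales report with revenue charts",
}

def score(query: str, name: str, description: str) -> float:
    q = set(query.lower().split())
    lexical = len(q & set(name.lower().replace(".", " ").split()))
    semantic = len(q & set(description.lower().split()))
    return lexical + 0.5 * semantic  # weight exact name hits higher

def search(query: str) -> str:
    return max(FILES, key=lambda n: score(query, n, FILES[n]))

# No need to remember "IMG_2041.jpg"; describing the photo is enough.
print(search("dog beach sunset"))  # IMG_2041.jpg
```

The lexical term keeps precise queries (a known file name) working exactly as before, while the semantic term catches the fuzzy, descriptive queries the blog post is advertising.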
