
Oh yay, more AI stuff. Intel have announced that Lunar Lake is due to arrive in Q3 2024, and it will include a neural processing unit (NPU) for AI fans. This will be their latest line of CPUs designed for laptops, so even if you're not into AI, there's a good bet that if you go for a new laptop next year it may have Intel Lunar Lake inside.

"With breakthrough power efficiency, the trusted compatibility of x86 architecture and the industry’s deepest catalog of software enablement across the CPU, GPU and NPU, we will deliver the most competitive joint client hardware and software offering in our history with Lunar Lake and Copilot+" – Michelle Johnston Holthaus, Intel executive vice president and general manager of the Client Computing Group

Intel say that, starting in Q3 2024 and ready for the holiday season, more than 80 new laptop designs will arrive from 20 different vendors.

It's not all AI though, of course: these new chips will come with an improved CPU and GPU, with Xe2 graphics, so we should hopefully see a good boost in efficiency and power for those of you on the go. As Intel say: "Lunar Lake is expected to be a groundbreaking mobile processor for AI PCs with more than 3 times the AI performance compared with the previous generation. With more than 40 NPU tera operations per second (TOPS), Intel’s next generation processors will provide the capabilities necessary for Copilot+ experiences coming to market. In addition to the higher performing NPU, Lunar Lake will also be equipped with over 60 GPU TOPS delivering more than 100 platform TOPS."
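As a quick sanity check, Intel's numbers do add up: the "platform TOPS" figure is roughly the sum of the per-engine figures. Note the CPU contribution below is an assumption, since Intel only quote the NPU and GPU numbers and the total:

```python
# Intel's quoted per-engine AI throughput for Lunar Lake, in TOPS
# (tera operations per second).
npu_tops = 40   # "more than 40 NPU tera operations per second (TOPS)"
gpu_tops = 60   # "over 60 GPU TOPS"
cpu_tops = 5    # assumed: Intel only say the platform total exceeds 100

platform_tops = npu_tops + gpu_tops + cpu_tops
print(platform_tops)  # 105 - consistent with "more than 100 platform TOPS"
```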

Get ready for a whole lot more AI, AI, AI to come out of Intel, and AMD too.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
11 comments

Luke_Nukem May 21
> deepest catalog of software enablement across the CPU, GPU and NPU

On Windows mostly I bet.
LoudTechie May 21
Quoting: Luke_Nukem
> > deepest catalog of software enablement across the CPU, GPU and NPU
>
> On Windows mostly I bet.

Actually, Intel's Linux support is quite good: open drivers, lots of Linux clients, and active (proprietary) collaboration in kernel development.
Also, the large AI players all run Linux, so making AI stuff Windows-exclusive is asking for problems if you're not Microsoft.
Linux_Rocks May 21
Moar AI? Thanks, I hate it.
Cybolic May 21
I don't understand why NPUs are being put into CPUs and GPUs at this stage. For fully custom systems/ecosystems, like the Apples, sure, I get it, but for everything else, the tech doesn't seem evolved enough to put into such main parts of a system, when it's going to need an upgrade in 6 months anyway.
Why aren't they just daughterboards until we reach a baseline for performance?

P.S. I don't get the negativity surrounding "AI" processing units - they just help process models locally. Just like a GPU isn't necessarily used to play pirated games, a desktop-focused NPU has a similarly small chance of being used to mimic other people's work without permission or whatever other nefarious uses some companies think up for the tech.


Last edited by Cybolic on 21 May 2024 at 4:38 pm UTC
Purple Library Guy May 21
I don't see the applicability of a specialized "AI" doohickey to general purpose computing. How is this going to help me watch or even process video, or play games, or do a spreadsheet, or browse the web? And if it isn't, why is it a selling point that I have to pay for it?

But it's not like I know anything; I didn't know there would be a point to specialized hardware bits even for doing actual AI stuff.
Cybolic May 21
Quoting: Purple Library Guy
> I don't see the applicability of a specialized "AI" doohickey to general purpose computing. How is this going to help me watch or even process video, or play games, or do a spreadsheet, or browse the web? And if it isn't, why is it a selling point that I have to pay for it?
>
> But it's not like I know anything; I didn't know there would be a point to specialized hardware bits even for doing actual AI stuff.
Some of this is actually what I'm trying to use it for on my machine. For spreadsheets, it's nice to be able to say "make a column that does A and B calculations on X and Y", and for the web, I've been poking at auto-categorising my bookmarks into sensible folders, same thing with my growing collection of notes. One of the reasons I haven't gotten that far with it is precisely because I'd prefer not to send my data to a third-party service, so running it on local hardware (with decent speeds) would be very nice.
danniello May 22
If I'm not wrong, chatgpt.com / copilot.microsoft.com are website/cloud services, the same as google.com search. They cannot work offline by design (without downloading all the big data, which in the case of large AI models like ChatGPT/Copilot is impossible, because it would require downloading the internet :)

So what is the point of local NPU units? Except of course PR to sell additional fake features, i.e. some functions of the copilot.microsoft.com website will be unlocked only when a local NPU is detected, but that is a fully artificial limitation, some kind of DRM...
Cybolic May 23
Quoting: danniello
> If I'm not wrong, chatgpt.com / copilot.microsoft.com are website/cloud services, the same as google.com search. They cannot work offline by design (without downloading all the big data, which in the case of large AI models like ChatGPT/Copilot is impossible, because it would require downloading the internet :)
>
> So what is the point of local NPU units? Except of course PR to sell additional fake features, i.e. some functions of the copilot.microsoft.com website will be unlocked only when a local NPU is detected, but that is a fully artificial limitation, some kind of DRM...
You can very well run models similar to the ones you mentioned locally. Some of the common uses that you might already find in your phone, for example, are things like assisted photo manipulation and handwriting recognition.

None of the models require you to download the internet, but some of the bigger models can be gigabytes in size. Remember, they don't contain the data they're trained on, but something more like the common links between each piece of data.
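A rough back-of-envelope calculation shows why that is. For a hypothetical 8-billion-parameter model quantized to 4 bits per weight (weights only, ignoring runtime overhead):

```python
# Estimate on-disk size of a hypothetical local model: the file stores
# learned weights, not the training data itself.
params = 8_000_000_000   # assumed: an 8B-parameter model
bits_per_weight = 4      # assumed: 4-bit quantization

size_bytes = params * bits_per_weight // 8  # 8 bits per byte
size_gb = size_bytes / 1e9
print(size_gb)  # 4.0 - gigabytes, not "the internet"
```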

If you're curious to give it a try, I can recommend ollama as an easy way to get started.
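For anyone wanting to try that, a minimal ollama session looks something like the sketch below. The model name is just an example, and the first pull downloads a few gigabytes; the guard keeps the snippet a no-op on machines without ollama installed:

```shell
# Run a small language model entirely on local hardware with ollama.
if command -v ollama >/dev/null 2>&1; then
    ollama pull llama3                          # one-time model download
    ollama run llama3 "Why are NPUs useful?"    # inference stays on-device
else
    echo "ollama not installed; see https://ollama.com"
fi
```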
Musang May 23
I think there actually is a big use case for professionals working on AI tech. While we may or may not have a heavy-duty AI machine for training, I can tell you from experience that a lot of development happens locally, often on a portable machine in open spaces, often at home during work-from-home. So having reasonable performance in a portable form factor can be quite attractive for a modern dev workforce.
Luke_Nukem May 23
Quoting: LoudTechie
> Actually, Intel's Linux support is quite good: open drivers, lots of Linux clients, and active (proprietary) collaboration in kernel development.
> Also, the large AI players all run Linux, so making AI stuff Windows-exclusive is asking for problems if you're not Microsoft.

Yeah sure. But I'm talking "software enablement". That's quite different to driver stacks.