
Everything You Need To Know About AI & NPUs

Can traditional PCs run AI features without an NPU? Read on to find that answer and more.

For some time now I’ve been wondering to what extent an NPU (Neural Processing Unit) is required to run AI features, so I decided to research the topic. I admit that some of what I’ve read is a little above my pay grade, but I do now have a much better understanding of the relationship between AI and NPUs. Here then, in layman’s terms, is the lowdown.

TOPS And NPUs

First up, you’ll need to know what TOPS (Trillion Operations Per Second) means in relation to NPUs. TOPS is the measurement used to define an NPU’s capabilities. Currently, for manufacturers to meet Microsoft’s requirements for Copilot+ PCs, the NPU must be capable of at least 40 TOPS, which equates to 40 trillion operations per second.
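
To put that figure in concrete terms, here’s a minimal back-of-envelope sketch in Python. The chip names and TOPS ratings are purely illustrative assumptions, not real vendor specs; it simply shows how a TOPS rating converts to raw operations per second and whether it clears the 40 TOPS bar.

```python
# Back-of-envelope TOPS arithmetic (all chip figures are illustrative only).
COPILOT_PLUS_MIN_TOPS = 40  # Microsoft's stated minimum for Copilot+ PCs

def tops_to_ops_per_second(tops: float) -> float:
    """Convert a TOPS rating to raw operations per second (1 TOPS = 1e12 ops/s)."""
    return tops * 1e12

# Hypothetical NPU ratings, for the sake of the example.
example_npus = {"Chip A": 38, "Chip B": 45, "Chip C": 50}

for name, tops in example_npus.items():
    qualifies = tops >= COPILOT_PLUS_MIN_TOPS
    print(f"{name}: {tops} TOPS = {tops_to_ops_per_second(tops):.0e} ops/s "
          f"-> {'meets' if qualifies else 'below'} the Copilot+ minimum")
```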

Can Traditional PCs Run AI Without An NPU?

Yes, most assuredly they can, especially high-end machines fitted with a discrete GPU. In fact, benchmark tests have shown that a mid-to-high-end GPU can be up to six times faster than a 40 TOPS NPU when running AI-driven features/applications. However, this scenario also involves two critical issues: power consumption and heat generation.

When running AI-driven features/applications, an NPU consumes far less power and generates far less heat than a GPU. This becomes particularly important when running AI processes for extended periods.
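
As a rough illustration of why that matters, the sketch below compares throughput per watt for a hypothetical 40 TOPS NPU against a GPU modelled as six times faster (per the benchmark claim above). The wattage figures are assumptions chosen for the example, not measured values.

```python
# Rough efficiency comparison (all figures are illustrative assumptions).
# The GPU is modelled as ~6x the NPU's throughput, per the benchmark claim
# above, but with the much higher power draw typical of discrete graphics cards.
devices = {
    "NPU (40 TOPS)": {"tops": 40, "watts": 10},       # assumed power draw
    "GPU (~6x faster)": {"tops": 240, "watts": 200},   # assumed power draw
}

for name, spec in devices.items():
    tops_per_watt = spec["tops"] / spec["watts"]
    print(f"{name}: {spec['tops']} TOPS at {spec['watts']} W "
          f"= {tops_per_watt:.1f} TOPS/W")

# With these assumed numbers the GPU finishes a fixed AI workload sooner,
# but the NPU does far more work per watt, which is what matters for
# battery life and heat during long-running AI tasks.
```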

Then there is the multitasking/workload-sharing factor. When running AI processes on the CPU and GPU while trying to run other tasks simultaneously, performance would likely take a hit, possibly to the point where multitasking becomes impractical. On the other hand, because an NPU handles the AI processes exclusively, the CPU and GPU remain free to run other tasks with little or no degradation in performance.

NOTE: While 40 TOPS is the current minimum requirement, manufacturers are working on increasing NPU computational power, and it seems almost certain that dedicated NPUs will eventually outperform even high-end GPUs at AI workloads.

BOTTOM LINE:

It’s quite apparent that, when it comes to running AI processes, the NPU is not the be-all and end-all that Microsoft would have us believe. That said, it’s early days in NPU development and I’ve no doubt that the NPU is set to become an integral component of future PCs.

At the moment, Copilot+ PCs are restricted to laptops. However, Microsoft is working with manufacturers to bring NPUs to the desktop, and the general consensus is that we’ll start seeing Copilot+ desktop PCs sometime next year.

Most pundits are also predicting that Copilot+ PCs will become the norm by 2026 and, when one considers the history of dumb TVs versus smart TVs, it’s difficult to disagree.

Do you own a Copilot+ PC or are you considering purchasing one? Let us know in the comments.
