Expert Insights: The Rise of the AI PC

With the rise of artificial intelligence (AI)-driven applications, talk of a major AI-driven hardware refresh is growing louder, and AI-enabled technologies continue to transform the tech market. While AI opportunities in the data center have been proven out over the last couple of years, the narrative is moving toward the edge as the next area of growth.

Big tech is certainly taking notice. Both Microsoft and Apple are already central to this narrative, though each has taken a different approach to bringing AI into the mobile ecosystem. While we are still in the early stages of this development, every major computer manufacturer has already dubbed the “AI PC” the next big product cycle.

In this blog, we investigate the term “AI PC” and its implications for the tech industry by leveraging expert perspectives found in the AlphaSense expert transcript library.

What is an AI PC? 

While definitions of an AI PC can differ depending on the manufacturer and the end goal, it’s clear that a standard AI PC will include key features, such as specialized components, higher random access memory (RAM), and larger storage. Among these components will be a neural processing unit (NPU), central processing unit (CPU), and graphics processing unit (GPU). 

An AI PC is a personal computer specifically designed to handle artificial intelligence tasks. These tasks often require significant computational power, so AI PCs are typically equipped with high-performance CPUs and GPUs that can manage the intensive data processing demands of AI applications. The GPUs are crucial for accelerating tasks like machine learning and deep learning, while the CPUs help manage the overall processing workload.

Additionally, these computers require large amounts of RAM and storage to support the handling of large datasets and complex algorithms, which makes them well-suited for developers, researchers, and businesses working on AI projects.

To be considered an AI PC, the unit will also require 16GB of RAM, 256GB of storage, and an NPU capable of 40 TOPS (trillions of operations per second). Many AI PC original equipment manufacturers (OEMs) are also expected to include Microsoft Copilot, an AI-powered digital assistant. While Apple’s plans remain vaguer, many expect its homegrown Apple Intelligence to be prominently featured.
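As a concrete illustration, the baseline above can be expressed as a simple check. This is a sketch, not an official certification test; the function name and spec dictionary are our own constructs, and the example machines are hypothetical.

```python
# Commonly cited minimum spec for an AI PC:
# 16 GB RAM, 256 GB storage, and an NPU rated at 40+ TOPS.
AI_PC_BASELINE = {"ram_gb": 16, "storage_gb": 256, "npu_tops": 40}

def meets_ai_pc_baseline(ram_gb: float, storage_gb: float, npu_tops: float) -> bool:
    """Return True only if the machine meets every minimum in the baseline."""
    specs = {"ram_gb": ram_gb, "storage_gb": storage_gb, "npu_tops": npu_tops}
    return all(specs[key] >= minimum for key, minimum in AI_PC_BASELINE.items())

# A hypothetical 32 GB / 1 TB laptop with a 45-TOPS NPU clears the bar...
print(meets_ai_pc_baseline(32, 1024, 45))   # True
# ...while an otherwise-qualifying machine with a 38-TOPS NPU falls just short.
print(meets_ai_pc_baseline(16, 256, 38))    # False
```

Note that the check is a strict AND: a machine generous on RAM and storage but shy of 40 TOPS still misses the definition.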

AI PC laptops will likely include Qualcomm’s Snapdragon X Elite and X Plus processors, AMD’s Strix Point (Ryzen AI 300), or Intel’s Lunar Lake chips. Apple’s laptops are expected to include the in-house M4 chip, which contains an NPU equivalent dubbed the “neural engine,” which we explore later in this section.

“For the first half, ASUS has undergone significant changes, leveraging AI solutions to increase our overall operational capabilities, and we have achieved significant financial results. For the second half of 2024, we will be leading the first wave of AI PC product cycles, developing multiple AI platforms and solutions to aid in the development of the entire AI revolution and to secure our lead in the PC market.

Entering the age of AI, ASUS has a comprehensive slogan of ‘AI and incredible possibility,’ which means that we want to create AI solutions that will cover everything from the public cloud to the private cloud, from general AI to personalized AI, and everything including plug-ins and personalized experiences.”

– Asustek Computer Inc. | Earnings Call, Q2 2024

Much like the traditional PC, the AI PC will likely retain the same user base, namely consumer and enterprise customers. The expectation is that, with AI hardware and software enhancements, users will be able to increase productivity and utilize as-yet-undefined AI applications in a secure environment with better performance and longer battery life.

These enhancements will enable consumers to use AI PCs for real-time language translation, image generation, photo editing, video editing, content creation, fraud detection, security, and data analysis.

Understanding NPUs and Neural Engines

NPUs are not a new phenomenon; companies like Apple have been using them for several years. NPUs specialize in handling the complex mathematical computations at the heart of AI and machine learning. The NPU accelerates on-device processing for the large language model (LLM) being utilized.

TOPS is an acronym that stands for “tera operations per second,” or trillions of operations per second, and is a metric for measuring the performance of NPUs. Apple’s newest M4 is quoted at 38 TOPS and offers roughly two times the performance of the M2’s neural engine. Qualcomm (Snapdragon X Elite), Intel (Lunar Lake), and AMD (Strix Point) have NPU offerings with varying availability and specs, while Apple has its own M4 processor based on in-house silicon.
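As a rough illustration of where a TOPS figure comes from: a peak rating is typically the number of multiply-accumulate (MAC) units times the clock rate, with each MAC counted as two operations. The unit count and clock speed below are invented for illustration and do not describe any specific shipping chip.

```python
def peak_tops(mac_units: int, clock_hz: float, ops_per_mac: int = 2) -> float:
    """Theoretical peak TOPS = MAC units x clock x ops-per-MAC / 1e12.

    This assumes 100% utilization; real-world throughput is lower.
    """
    return mac_units * clock_hz * ops_per_mac / 1e12

# Hypothetical NPU: 16,384 MAC units at 1.2 GHz -> ~39.3 peak TOPS,
# in the same ballpark as the ~38-40 TOPS figures quoted for current parts.
print(round(peak_tops(16_384, 1.2e9), 1))  # 39.3
```

The takeaway is that headline TOPS numbers are theoretical ceilings; as the expert below notes, offload overhead and latency determine how much of that peak is realized.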

All the chip companies producing NPUs are expected to achieve over 40 TOPS in their next iterations, as suggested in the table below: 


Comparison of AI PC chips between OEMs, according to Fubon Research.

One expert suggests that using an NPU for AI tasks is much faster and more efficient than using a general-purpose CPU, allowing the CPU to focus on other tasks:

“Right now, if you can free up resources on your CPU and push them into the NPU because the NPU is much more suitable for those functions, that’s the way to go. That’s exactly what the latest OS and the latest AI-enabled PCs are trying to do for us.”

– Private Networking Sales, Microsoft | Expert Transcript

However, experts aren’t yet completely aligned on the criticality of the NPU to an AI PC:

“To be honest, the NPU isn’t that efficient either. It still can’t run without the CPU. Basically, what happens is the local OS will get a request, and normally we go back through a wireless network to start with into the cloud and run stuff. Now, some of that information will come down and some of it has to be processed locally, but the OS will basically go to the CPU, and the CPU is running the OS anyway. That’s number one, but then there’ll be another core or processor that’ll run some of the application.

What that processor will do is based on the instructions. It’ll offload some of that to the NPU to try to get better efficiency, but there’s a lot of latency. In some cases, you’re better off not running it on the NPU. In fact, that happens quite a bit. You’re probably going to get 4X to 6X better performance if it can be completely run on the NPU, which it can’t. You get a little bit of speedup on the NPU, or even a tensor processor, a TPU, which is what Google does.”

– Former Director, Qualcomm | Expert Transcript

Though it remains to be seen how designs will advance, plans are in motion, and guidance from OEMs reflects optimism around the long-term potential of AI PCs. Additionally, while designs may differ in totality, NPUs, CPUs, and GPUs are viewed as central to an optimal AI PC experience.

The PC Cycle: Will AI Drive a Refresh Cycle?

The PC market decelerated sharply over the 2021 to 2022 timeframe, triggered by inventory digestion and the waning of the COVID-related demand uptick of 2020. The market bottomed in Q1’23 after the industry worked through excess inventory, and the pace of recovery over the last four quarters has been slow due to persistent macro headwinds and a lack of innovation.

Enter AI: There are hopes that AI will drive a meaningful cycle starting in the back half of this year. The question remains whether this hope will fall victim to AI hype or whether the cycle has legs, driven by both product innovation and demand. As for the drivers of that demand, our experts have weighed in.

A senior vice president (VP) of HP suggests that edge AI computing and declining reliance on cloud computing will drive higher demand for AI-capable devices: 

“It’s a few things, the desire to keep some of the data local versus pushing it out in the cloud, you don’t know what’s going to happen, etc. Is it secure? Is it not secure? You don’t want to have that out there. The second piece is speed. Simply you don’t have any latency effects.

The third piece is the cost element of it. If you really do a lot of even training, a small part of inference, etc., and you do it locally and get the right utilization, you actually get a lower cost structure as well. Those three, I think, will essentially drive AI PCs.”

– Senior VP, HP Inc. | Expert Transcript

Additionally, a former vice president of Dell believes that AI will positively impact the PC market, and that partnerships like Microsoft’s with OpenAI and Qualcomm will lead to advancements in AI capabilities and increased adoption of AI-enabled PCs.

“I think the influence of AI on the PC market is going to be very positive. We are just getting started right now. I think it’s going to be positive for two reasons. One, there has been a massive effort by Microsoft, which is very public, to partner with OpenAI. Copilot is based on gen AI, which is based on the OpenAI partnership that Microsoft has established. It has a lot of new features that Microsoft has included in the Windows 11 operating system. That just rolled out last month.

On top of that, there’s been a very public partnership between Microsoft and Qualcomm around the ARM processor and NPUs, neural processing units, that really drive the acceleration required to do a lot of on-device processing of the large language models that come with gen AI on the device.

With these new capabilities coming out, with NPU based on architecture from Qualcomm and then with new capabilities coming from the Windows OS and all the work that other software companies are doing around AI, I believe that AI is going to help drive additional refresh of PCs, additional fields, and the AI PC as they are calling these next-gen PCs, they are really, I think, going to drive a lot of adoption of AI.”

– Former VP, Dell | Expert Transcript

While global shipments of AI PCs equipped with Qualcomm NPU chips have commenced, it remains too early to extrapolate demand across both the enterprise and consumer segments. Long term, though, AI PCs could represent a major cycle for hardware and semiconductor companies, reaching 40% of the PC market in 2026, according to JPMorgan, with a ramp beginning in H2 2024.

In turn, this should track with a return to growth for PC makers such as Dell and HP Inc. Companies like Asustek have made their stakes clear, saying the company “anticipates that the PC market will be entering a growth cycle that will last for several years. For 2024, we were the first to introduce a cloud platform and a highly comprehensive line of AI PCs.”


Forecast of annual shipments for AI-capable PCs from 2024 to 2028, according to Bluefin Research.

Both Apple and Microsoft are pursuing opportunities to drive revenue through AI-assisted applications. Microsoft is positioning Copilot as an assistant across its suite, including Office 365 (Word, Excel, PowerPoint, Outlook), starting with a free tier and extending to several price points. Meanwhile, Apple Intelligence has been described by the company as “AI for the rest of us,” though it will not be available until the fall of 2024.

Microsoft will end support for Windows 10 in 2025 after ten years in the market, meaning no more software updates, security patches, or technical support for users. Over time, this will drive a majority of users to replace unsupported hardware, setting up an opportunity for AI PCs to inflect. Potentially offsetting this opportunity, current price points sit at roughly two times that of the average PC; over time and with volume, these costs are likely to come down. JPMorgan expects an incremental $15B market opportunity for AI PCs, assuming 50% penetration and ASP increases of 15%.
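The $15B figure is easy to sanity-check as back-of-envelope arithmetic: an average-selling-price (ASP) uplift applied to the penetrated share of a base market. The roughly $200B base-market figure below is our own assumption for illustration, not a number disclosed in the JPMorgan estimate.

```python
def incremental_opportunity(base_market_b: float, penetration: float, asp_uplift: float) -> float:
    """Incremental revenue ($B) from an ASP uplift on the penetrated share of a market."""
    return base_market_b * penetration * asp_uplift

# 50% AI PC penetration and a 15% ASP premium on an assumed ~$200B PC market:
print(incremental_opportunity(200, 0.50, 0.15))  # 15.0
```

Framed this way, the upside scales linearly with both penetration and the price premium, which is why early pricing (currently about twice the average PC) matters so much to the size of the opportunity.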

“In the future, you’re going to pay for the PC, you’re going to pay for the chip and you’re going to pay for the compute, and you’re going to pay Microsoft $30 per user per month. Guess who wins?

Maybe, because now Apple is doing the same thing, which is fantastic because now we have two major players pushing inferencing onto the edge. It’s not just a Microsoft thing anymore, it’s an Apple thing as well.”

– Former Director, Microsoft | Expert Transcript

Adoption will also be driven by use cases, the most clear examples being productivity and security applications. 

“The multimodal LLM is, I would say, coming more strongly from the smartphone side, whether it be Apple or Google. Depending upon the problem space you’re solving, the multimodality of the models would be of interest in some of those use cases. The landscape is evolving quite a bit along these dimensions, and based upon the requirements, people are then selecting what they are bringing to market.

Another element, of course, is privacy and security. For example, Anthropic, who has Claude, really has been focusing themselves as an enterprise-friendly company. You see them also getting a lot of traction when it comes to enterprise use cases. I would say for anyone designing now, flexibility is key.”

– Former VP, Zebra Corporation | Expert Transcript

One AlphaSense expert is less positive on the near-term use cases of AI PCs: 

“I think AI PC is still up in the air in terms of how good the use cases are. If you’re seeing the amount of investment into it, they’re going to find use cases for it, because I think it’s the future of where these companies see the use cases of what they could potentially do. The traditional use cases of the CPU are just out of gas. I do think that Microsoft is in the lead, but these others need to find ways to differentiate themselves. Microsoft still works with them over the competition.”

– Former Director, Intel Corporation | Expert Transcript

AI PC workloads promise improvements in latency, power consumption, security, and personalization. Workloads can run locally on the PC rather than in the cloud, mitigating cloud vulnerabilities. The need for security is a theme that has run through the AI ecosystem since its inception, and both Apple and Microsoft are pitching it as a key differentiator, which could be another element to watch as products roll out in the coming months.

Looking ahead, Dell, Apple, HP, Acer, Asus, Samsung, and Lenovo are all expected to introduce AI PCs to enterprises and consumers. At the same time, other companies may join Microsoft Copilot and Apple Intelligence in introducing killer applications driven by AI. Semiconductor companies like Qualcomm, AMD, and Intel will be looking to gain share in the processor segment, while memory players like Micron, Samsung, and Hynix will be looking to gain content on edge devices.

Discover more about the enterprise and consumer adoption of generative AI in our report, Generative AI: The Road To Revolution.

Track Key Trends in PC and AI with Expert Insights

A reaccelerating PC demand environment sets the stage for an exciting time for the hardware sector. With the complexities of AI come more electronic components and a change in the competitive landscape.

AlphaSense Expert Insights reduces the time investment researchers spend finding critical market insights across the technology sector. The platform’s powerful AI search technology reduces time to insight and enables professionals to allocate more time to decision-making and strategy. As the industry continues to innovate and grow, AlphaSense will monitor emerging trends and competitive dynamics through critical perspectives found in our expert transcript library.

Ready to dive deep into the expert transcript library? Start your free trial of AlphaSense today.

ABOUT THE AUTHOR
Michelle Brophy
Director of Research, Tech, Media and Telecom

Michelle Brophy serves as the Director of Research, Tech, Media and Telecom at AlphaSense. Prior to joining AlphaSense, Michelle spent three years as the Strategist for TMT at Guidepoint Insights. Before that, she spent 18 years on the buy side, in portfolio manager and senior equity analyst roles, at Hilltop Park Capital and Kingdon Capital Management. Michelle resides in New York City.

Read all posts written by Michelle Brophy