Report claims Arm chips will power 90% of AI servers based on custom processors in 2029 — x86 and RISC-V on the outside looking in
Virtually all hyperscale cloud service providers (CSPs), as well as some of the leading developers of AI accelerators, now run their own custom-silicon programs, which focus not only on AI accelerators but also on custom general-purpose CPUs, usually based on the Arm instruction set architecture (ISA). Over the next several years, the proliferation of Arm-based custom CPUs inside AI servers will rise to 90%, leaving x86 and RISC-V with around 10%, according to Counterpoint Research.
x86 processors from AMD and Intel have long dominated general-purpose servers, which is why most early AI servers relied on Opteron and Xeon processors. However, Arm-based custom CPUs tailored for specific data-intensive AI workloads are more cost- and power-efficient. Furthermore, because AI workloads are emerging workloads, backward compatibility with x86 is not vital. To that end, AWS, Google, and Microsoft have developed their own proprietary Arm-based processors for their own workloads, whereas Meta is the alpha customer for Arm's own AGI processor.
As a result, adoption is unfolding across multiple hyperscalers in parallel. AWS is expanding the role of its Graviton processors across Trainium-based systems, while still retaining x86 in some configurations for compatibility reasons; Google's next-generation TPU infrastructure relies on its Axion Arm CPU; and Microsoft has paired its Azure Cobalt Arm CPU with its Maia accelerators from the beginning to build a vertically integrated AI infrastructure. Meta is also set to begin deploying Arm's own AGI CPUs shortly.
"The transition from x86 to Arm in AI servers is not a single switch," said Neil Shah, vice president of research at Counterpoint Research. "It has played out generation by generation, configuration by configuration. Hyperscalers are making deliberate choices based on their specific deployment needs, writing compatible and interoperable software, and the economics are very encouraging. The transition is expected to accelerate meaningfully in the second half of 2026, driven by the broad deployment of in-house Arm CPUs alongside next-generation ASIC platforms across major hyperscalers."
Today, the majority of CPUs powering AI servers are still x86, but this is set to change: by 2029, 90% of AI servers that use custom processors will rely on Arm, leaving only 10% for x86 and RISC-V. It should be noted that many AI servers will continue to rely on off-the-shelf EPYC and Xeon processors from traditional suppliers, though broad adoption of Arm by hyperscalers for their custom-silicon programs should be a signal for AMD and Intel to make their own custom CPU programs more appealing to customers.
"Our analysis projects Arm-based CPUs will account for at least 90% of host CPU deployments in custom AI ASIC servers by 2029, up from around 25% in 2025, a structural shift driven by the accelerating rollout of in-house Arm CPU programs across major hyperscalers," Shah added.
AMD builds its own vertically integrated AI platforms featuring x86 EPYC processors, Instinct MI-series AI accelerators, Pensando DPUs, and Pensando NICs, so it is reasonable to assume that these CPUs are tailored for AI workloads. Meanwhile, Intel is developing custom Xeon processors for Nvidia's next-generation AI platforms, which suggests that these processors will also be optimized primarily for AI workloads. All in all, while Arm will grow significantly in the AI server realm over the next four to five years, x86 will continue to command a sizeable share of this market.