For over 35 years, Arm Holdings has licensed its instruction sets to the world’s largest chipmakers and collected royalties on every processor made from its designs. Now the UK-based company is making its own physical silicon for the first time.
Arm CEO René Haas unveiled his company’s first in-house chip at an event in San Francisco on Tuesday. Arm is calling the new data center central processing unit the AGI CPU. It’s a long-awaited move that marks a major change for the so-called Switzerland of chip companies as it enters into competition with its own customers.
Meta is the first company to sign on, as the social media company is building multi-gigawatt AI data centers and plans to spend up to $135 billion on capital expenditures this year. In February, Meta secured huge quantities of chips from both Nvidia and Advanced Micro Devices.
“In today’s world, you really only have a few players,” said Meta software engineer Paul Saab, who has worked on the Arm chip project since its start in 2023, in an interview with CNBC. “It adds another player to the ecosystem for us.”
Saab said the Arm deal “allows for a lot more flexibility in our software stack and our supply chain.”
Terms of the agreement were not disclosed. For Arm, the deal represents a major win and a seal of approval from one of the world’s most valuable companies.
“Let’s say they’ll get 5% of Meta’s $115 to $135 billion of capital spending in the future,” said Patrick Moorhead, chip analyst at Moor Insights. “This is a game changer for them at the top line.”
It’s also the latest sign that CPU demand is seeing a resurgence. Nvidia, which has established itself as the leader in AI graphics processing units, recently told CNBC that CPUs are “becoming the bottleneck” as agentic AI changes compute requirements. Futurum Group predicts what it calls a “quiet supply crisis,” with CPU market growth potentially outpacing GPU growth through 2028.
While GPUs are ideal for training and running AI models because their thousands of cores can perform many operations simultaneously, CPUs have smaller numbers of more powerful cores that run sequential, general-purpose tasks. Agentic AI requires a lot of that general computing power, with large amounts of data moving between multiple agents.
At Nvidia’s annual GTC conference last week, CEO Jensen Huang unveiled an entire rack filled with Vera CPUs alone. At the Arm event on Tuesday, Huang appeared in a recorded statement congratulating Arm on its new CPUs.
Top executives from Google, Amazon, Microsoft, Oracle, Broadcom, Micron, Samsung, SK Hynix and Marvell also appeared in the video. Arm told CNBC that about 50 partners indicated support ahead of the launch.
“It’s a $1 trillion market, and what we’re seeing again and again is that our partners are coming forward and understanding and realizing that this is really great for the industry,” Mohamed Awad, Arm’s cloud AI chief, told CNBC in an interview.
CNBC got an exclusive first look at Arm’s new chip lab, where the company is preparing the new CPU for full production later this year.
Arm’s head of cloud AI Mohamed Awad gives CNBC’s Katie Tarasov a tour of the chip lab where the company built its first in-house chip, the AGI CPU, in Austin, Texas, on Friday, March 6, 2026.
Erin Black | CNBC
‘Make the chip you want’
Arm spent $71 million and about 18 months building three new lab rooms on its campus in Austin, Texas, where a once-small team has grown to more than 1,000 people today. Inside, engineers “bring up” the chips through several rounds of testing after they come off the factory line.
Like almost all fabless AI chip makers, Arm manufactures its CPU at a Taiwan Semiconductor Manufacturing Company plant. Built on TSMC’s 3-nanometer node, Arm’s CPUs are currently made entirely in Taiwan. TSMC’s 3nm process is coming soon to Arizona, and Awad said Arm “would love to manufacture here. It really depends on what our customers are ultimately looking for.”
Arm is known as the leading architecture for mobile chips in almost every smartphone. It moved into data center chips in 2018 with the launch of its Neoverse platform. Amazon took Neoverse mainstream in its first custom processor, the Graviton, and Google and Microsoft also now base their AI chips on Arm.
“If Arm didn’t exist, all the companies that have their own processors wouldn’t be able to make their own processors,” Moorhead said.
Nevertheless, most server chips are built on the traditional x86 architecture used by Intel and AMD. Moorhead calls x86 “tried and true” and says it “can run almost anything.”
The advantage of the Arm architecture, Moorhead said, is that it is “super efficient” because the designs are more customizable. “You can make only the chip you want without anything else.”
Awad told CNBC that the Arm team “ruthlessly optimized” its new AGI CPU for artificial general intelligence, hence the name. Up to 64 of the new CPUs, totaling about 8,700 cores, can fit in a single air-cooled rack. It’s a dense configuration that Arm believes will appeal to power-constrained data center customers around the world.
“You can get twice the performance-per-watt compared to x86 racks,” Awad said. “That means twice the performance in the same footprint, same power.”
Meta’s Saab said that wattage is “a very scarce resource.”
“If you have a best-in-class CPU that’s giving you the best per-watt performance that you possibly can, it opens up more wattage for other parts of your infrastructure,” he said.
Meta’s 5-gigawatt Hyperion data center under construction in Richland Parish, Louisiana, on January 9, 2026.
Courtesy of Meta
‘Available to the whole world’
Meta has a big need for efficiency as it builds massive AI data centers in Louisiana, Ohio and Indiana. The company is also reportedly looking to lease space at the giant Stargate site in Texas, where OpenAI and Oracle plan to expand to 10 gigawatts of capacity.
Meta’s AI spending has increased after its Llama 4 model was not well received by developers last year.
“They got behind,” Moorhead said. “They also recognized that we don’t have enough compute power to do what we need to do.”
In addition to securing processors from Nvidia and AMD, Meta in March unveiled four new chips in its line of Meta Training and Inference Accelerators, which it has been building since 2023. Now, it’s adding CPUs from Arm into the mix.
“It was basically meant to be a complete replacement, drop-in replacement for our current compute CPUs, and transparent to our developers,” Saab said.
Saab was at Facebook in 2011 when the company launched the Open Compute Project, a consortium that now has hundreds of member companies, including Arm and Nvidia, committed to opening up hardware designs that help reduce data center energy consumption and costs.
“Our first conversation with Arm was, ‘Hey, if we build this, we don’t want to just keep it within the company,'” Saab said. “We’re not like a chip company that’s trying to create a sales channel to sell chips. We wanted it to be available to the whole world.”
Arm won’t disclose the price of the CPU, but Moorhead estimates it to be in the thousands of dollars.
Awad told CNBC it will be “competitively priced,” with the aim of serving as an alternative for companies that can’t afford to build their own in-house processors.
“You have to have 1,000 engineers, a $500 million budget to build it,” Moorhead said. “So there’s definitely a market need.”
Watch: Inside Arm’s $71 million chip lab where it’s building its first CPUs
