Arm Enters Merchant Silicon Market with AI-Focused "AGI CPU"
Arm Holdings plc (NASDAQ: ARM) announced a historic expansion of its business model today, March 24, 2026, revealing its first-ever production silicon product designed specifically for artificial intelligence infrastructure. The launch of the Arm AGI CPU marks the company’s transition from a pure intellectual property (IP) provider to a direct manufacturer of server chips, a move aimed at capturing the rapidly growing demand for "agentic" AI data centers.
A Historic Strategic Pivot
For more than 35 years, Arm has operated primarily as an IP licensing firm, providing the blueprints for chips used by companies like Apple, NVIDIA, and Amazon. With the introduction of the AGI CPU, the company is now offering its own Arm-designed and branded processors.
"AI has fundamentally redefined how computing is built and deployed," stated Rene Haas, CEO of Arm. "With the expansion into delivering production silicon, we are giving partners more choices to support agentic AI infrastructure at global scale".
Technical Specifications and Performance
The Arm AGI CPU is built on a 3nm lithography process and utilizes the Arm Neoverse V3 platform. It is designed to handle the orchestration and scheduling tasks required by modern AI agents, which often run continuously and coordinate thousands of parallel tasks across accelerators and networking fabrics.
Key technical highlights include:
Core Count: Up to 136 Neoverse V3 cores per chip in a dual-chiplet design.
Memory Architecture: Supports up to 6TB of DDR5-8800 memory per chip with 6GB/s of memory bandwidth per core.
Connectivity: Features 96 lanes of PCIe Gen6 and CXL 3.0 Type 3 for high-speed memory expansion.
Deterministic Performance: A 300W to 420W TDP design that assigns a dedicated core per program thread, intended to eliminate throttling under sustained workloads.
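The per-core bandwidth figure can be sanity-checked against the chip-level numbers with simple arithmetic. In the sketch below, the core count and per-core bandwidth come from the specifications above; the 12-channel DDR5 configuration is a hypothetical assumption introduced only for comparison, not a figure Arm has published.

```python
# Back-of-envelope check of the published Arm AGI CPU memory figures.
# Core count and per-core bandwidth are from the announcement; the
# 12-channel DDR5 configuration is a hypothetical assumption.

CORES_PER_CHIP = 136      # Neoverse V3 cores (dual-chiplet design)
BW_PER_CORE_GBS = 6       # claimed memory bandwidth per core, GB/s

aggregate_gbs = CORES_PER_CHIP * BW_PER_CORE_GBS  # 816 GB/s per chip

# For comparison: peak bandwidth of a hypothetical 12-channel
# DDR5-8800 setup (8800 MT/s, 8 bytes per transfer per channel).
ddr5_peak_gbs = 8800 * 8 * 12 / 1000  # 844.8 GB/s

print(f"Aggregate claimed bandwidth: {aggregate_gbs} GB/s")
print(f"12-channel DDR5-8800 peak:   {ddr5_peak_gbs} GB/s")
```

Under these assumptions the claimed per-core figure sits just below the theoretical peak of such a memory subsystem, which is at least internally consistent.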
Hyperscale Partnerships and Deployment
Meta served as the lead partner and co-developer for the AGI CPU. The social media giant plans to integrate the chip alongside its own Meta Training and Inference Accelerator (MTIA) silicon for its gigawatt-scale data center deployments. Other early adopters and ecosystem partners include OpenAI, Microsoft, Google, Oracle, SAP, and Cloudflare.
To accelerate adoption, Arm is introducing a 1OU, 2-node reference server design through the Open Compute Project (OCP).
Air-Cooled Systems: A standard 36kW rack can support up to 30 blades, delivering a total of 8,160 cores.
Liquid-Cooled Systems: In partnership with Supermicro, a 200kW design can house 336 CPUs for more than 45,000 cores per rack.
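The rack-level core counts follow directly from the per-chip core count and the stated blade and CPU counts. The sketch below assumes each node in the 1OU, 2-node blade carries a single CPU, which is consistent with the quoted totals but not stated explicitly in the announcement.

```python
# Rack-density arithmetic from the figures in the announcement.
# Assumption (not stated explicitly): one CPU per node.

CORES_PER_CPU = 136

# Air-cooled: 36 kW rack, 30 blades, each a 1OU 2-node design.
air_cooled_cores = 30 * 2 * CORES_PER_CPU
print(air_cooled_cores)    # matches the quoted 8,160 cores

# Liquid-cooled: 200 kW Supermicro design with 336 CPUs per rack.
liquid_cooled_cores = 336 * CORES_PER_CPU
print(liquid_cooled_cores) # 45,696, i.e. "more than 45,000 cores"
```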
Efficiency and Economic Impact
Arm claims its new silicon delivers more than twice the performance per rack compared to the latest x86 systems from Intel and AMD. According to the company's estimates, this increased density and efficiency could save up to $10 billion in capital expenditure (CAPEX) per gigawatt of AI data center capacity.
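Arm did not publish the assumptions behind the $10 billion figure. As a purely illustrative back-of-envelope, only the 200kW rack power and the 2x performance-per-rack claim below come from the announcement; the per-rack cost is a hypothetical parameter chosen to show how such a number could arise.

```python
# Illustrative reconstruction of a "$10B per GW" estimate.
# Only RACK_KW (200 kW) and the 2x density claim come from the
# article; COST_PER_RACK is a hypothetical assumption.

GW_IN_KW = 1_000_000
RACK_KW = 200                       # liquid-cooled rack power
racks_per_gw = GW_IN_KW / RACK_KW   # 5,000 racks per gigawatt

racks_saved = racks_per_gw / 2      # 2x perf per rack => half the racks
COST_PER_RACK = 4_000_000           # hypothetical $4M per rack position

savings = racks_saved * COST_PER_RACK
print(f"${savings / 1e9:.0f}B per GW")
```

With these hypothetical inputs the arithmetic reproduces a $10 billion figure, but the actual basis for Arm's estimate is not disclosed.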
The Arm AGI CPU is available for order now, with the company committing to a roadmap of follow-on products for the data center market.