4 min read | Saved February 14, 2026
Do you care about this?
Google Cloud has introduced its Axion CPUs and Ironwood TPUs, designed for efficient training and inference of AI models. The Ironwood TPUs offer large pod-scale performance advantages over Nvidia's rack-scale systems, while the Axion CPUs bring in-house Arm silicon to general-purpose computing workloads.
If you do, here's more
Google has launched its Axion CPUs and seventh-generation Ironwood TPUs, targeting AI training and inference with a focus on scaling large models. Each Ironwood chip delivers 4,614 FP8 TFLOPS, and chips interconnect over a proprietary 9.6 Tb/s network, reaching up to 42.5 FP8 ExaFLOPS at full pod scale. That far outpaces Nvidia's GB300 system, which achieves roughly 0.36 ExaFLOPS. Ironwood pods can cluster into Google's AI Hypercomputer, an integrated platform that merges compute, storage, and networking, and Optical Circuit Switching (OCS) improves reliability by rerouting traffic around failed hardware.
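The pod-scale figure is consistent with the per-chip number. A quick sanity check, assuming a full Ironwood pod is 9,216 chips (Google's announced pod size, not stated in this summary):

```python
# Back-of-the-envelope check: per-chip FP8 throughput times pod size
# should land near the quoted 42.5 FP8 ExaFLOPS figure.
per_chip_tflops = 4_614   # FP8 TFLOPS per Ironwood chip (from the article)
chips_per_pod = 9_216     # assumption: full-scale pod size

# 1 ExaFLOPS = 1e6 TFLOPS
pod_exaflops = per_chip_tflops * chips_per_pod / 1e6
print(f"{pod_exaflops:.1f} FP8 ExaFLOPS")  # → 42.5 FP8 ExaFLOPS
```

The product comes out to about 42.5 ExaFLOPS, matching the headline claim, so the quoted pod figure is simply the per-chip throughput multiplied out rather than a separately measured benchmark.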
The Axion CPUs mark Google's entry into in-house designed general-purpose processors, built on the Armv9 architecture. While specifics like core count and clock speeds are not disclosed, Google claims up to 50% better performance and up to 60% better energy efficiency than comparable current-generation x86 CPUs. Early reports suggest 2 MB of L2 cache per core and support for high-capacity DDR5 memory. Google has introduced three Axion instance configurations, with C4A being the most broadly available option today; it offers up to 72 vCPUs and 576 GB of memory, aimed at general-purpose workloads.
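The top-end C4A shape implies a fixed memory-to-vCPU ratio, which is how cloud instance families are typically sized; a minimal check of the numbers quoted above:

```python
# Memory-to-vCPU ratio of the largest quoted C4A shape.
# Both figures come from the article; the "high-memory shape" reading
# is an assumption based on typical cloud instance sizing.
vcpus = 72
memory_gb = 576
ratio = memory_gb / vcpus
print(f"{ratio:.0f} GB per vCPU")  # → 8 GB per vCPU
```

An 8 GB-per-vCPU ratio is characteristic of a high-memory instance shape, suggesting the 576 GB figure describes C4A's memory-optimized variant rather than its standard one.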
In summary, these releases expand Google's custom silicon portfolio, which has evolved over a decade. The integration of Axion and Ironwood technologies positions Google to compete more aggressively in the AI accelerator market, providing substantial performance and efficiency advantages to enterprise customers. Companies like Anthropic and Lightricks are already adopting this technology for significant operational improvements.