1 min read
|
Saved February 14, 2026
Do you care about this?
Epoch AI has released a data explorer that estimates the sales and installed capacity of AI chips from major vendors like Nvidia and Google. It offers a view of global AI compute capacity and highlights the steep purchase costs and power demands associated with this hardware.
If you do, here's more
Epoch AI has launched an AI Chip Sales Data Explorer that tracks the sales and capacity of AI chips. The tool collates data from various sources, including financial reports and company disclosures, to estimate the availability of AI computing hardware. Key players like Nvidia, Google, Amazon, AMD, and Huawei are included, with detailed breakdowns by chip model.
According to their findings, global AI compute capacity has surged to the equivalent of over 15 million Nvidia H100 GPUs. The introduction of new chip generations, particularly Nvidia's Blackwell generation, has significantly changed the market dynamics. The Blackwell chips, especially the B300 model, now dominate Nvidia's sales, displacing earlier models like the H100 and H200.
The financial implications of acquiring these chips are substantial. Purchase costs have escalated to tens of billions of dollars per quarter, excluding additional expenses for infrastructure like data centers. The power demands are equally striking: the tracked chips collectively require more than 10 gigawatts, roughly double New York City's average power demand. The data explorer aims to shed light on these trends, making it easier to understand the evolving landscape of AI computing capacity.
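As a rough sanity check on the 10-gigawatt figure, the capacity and power numbers can be tied together with simple arithmetic. The 700 W figure below is an assumption based on the rated power of an Nvidia H100 SXM module, not a number from the article:

```python
# Back-of-envelope check: do 15 million H100-equivalents line up with ~10 GW?
H100_EQUIVALENTS = 15_000_000  # Epoch's estimate of global AI compute capacity
H100_TDP_WATTS = 700           # assumed per-chip power draw (H100 SXM rated TDP)

total_gigawatts = H100_EQUIVALENTS * H100_TDP_WATTS / 1e9
print(f"~{total_gigawatts:.1f} GW")  # → ~10.5 GW
```

At that assumed per-chip draw, 15 million H100-equivalents comes out to roughly 10.5 GW, consistent with the "more than 10 gigawatts" the explorer reports.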