5 min read | Saved February 14, 2026
Do you care about this?
Google is developing an initiative called TorchTPU to make its Tensor Processing Units (TPUs) fully compatible with PyTorch, aiming to reduce developers' reliance on Nvidia's software. A collaboration with Meta seeks to boost TPU adoption among AI developers who typically use Nvidia's CUDA. Google is also considering open-sourcing parts of the software to accelerate customer uptake.
If you do, here's more
Google is launching an initiative called TorchTPU to enhance the compatibility of its Tensor Processing Units (TPUs) with PyTorch, a popular AI software framework. This move aims to weaken Nvidia's dominance in the AI computing market, where its CUDA software has been a significant barrier for companies considering alternatives. By making TPUs more developer-friendly for those already using PyTorch, Google hopes to attract more customers to its AI chips, which are increasingly vital for its cloud revenue.
The collaboration with Meta is a key part of this strategy. Meta, a major backer of PyTorch, is reportedly negotiating for greater access to TPUs to reduce its reliance on Nvidia's GPUs. Google has also shifted to selling TPUs directly to customers rather than limiting them to its cloud services, opening new pathways for adoption. The initiative reflects a broader need for Google to demonstrate the profitability of its AI investments, especially as it builds out products like the Gemini chatbot.