3 min read | Saved February 14, 2026
Do you care about this?
OpenAI introduced GPT-5.2 and GPT-5.3 Codex, both trained on NVIDIA infrastructure and both showing significant performance gains in coding and reasoning tasks. The models achieve top scores on industry benchmarks, reflecting advances in AI training techniques, and NVIDIA's systems enable faster development cycles for AI applications.
If you do, here's more
OpenAI released GPT-5.2 in December and GPT-5.3 Codex in February; both were trained on NVIDIA's advanced infrastructure, specifically GB200 NVL72 systems. GPT-5.2 achieved top scores on industry benchmarks such as GPQA-Diamond and AIME 2025, setting a new performance standard on the path toward artificial general intelligence (AGI). GPT-5.3 Codex improves coding and reasoning capabilities, delivering a 25% performance gain over GPT-5.2 across a range of coding benchmarks.
NVIDIA's architecture, particularly the GB200 NVL72 systems, has delivered significant gains in training efficiency: three times the training performance of the previous hardware generation and nearly double the performance per dollar. These advances enable faster development cycles for AI models. NVIDIA's technology supports not only language models but also speech, image, and video generation workloads. Projects such as Runway's Gen-4.5 video model and Clara's medical imaging models illustrate the practical uses of NVIDIA's infrastructure across different fields.
The article highlights the importance of training infrastructure in scaling AI capabilities, emphasizing NVIDIA's Blackwell platform, which is now available from major cloud service providers and thus broadly accessible for AI training. Because the platform supports varied AI workloads, data centers can allocate their resources efficiently, making it a preferred choice for many AI labs and developers in the industry.