NVIDIA's new GB200 NVL72 AI cluster delivers roughly a tenfold performance improvement for Mixture of Experts (MoE) models over its previous generation. The gain is attributed to a hardware-software co-design approach that improves parallel processing and resource allocation for AI workloads. The Kimi K2 Thinking model, benchmarked on this architecture, shows marked gains in efficiency and capability.