The article introduces torchcomms, an experimental communications API for PyTorch designed for large-scale distributed training. It aims to provide core communication primitives, scale to more than 100,000 GPUs, support fault tolerance, and enable efficient communication patterns, all while being developed in the open to encourage community feedback and innovation.