Links
The article announces Feast, an open-source feature store, joining the PyTorch ecosystem. It highlights how Feast addresses data inconsistencies between training and serving environments, facilitating smoother transitions from model development to production. Key features include declarative definitions, low-latency serving, and integration with various data infrastructures.
Google is developing an initiative called TorchTPU to make its Tensor Processing Units (TPUs) fully compatible with PyTorch, aiming to reduce reliance on Nvidia's software. This collaboration with Meta seeks to enhance TPU adoption among AI developers who typically use Nvidia's CUDA. Google is also considering open-sourcing parts of the software to accelerate customer uptake.
The PyTorch Foundation has added Ray, an open-source distributed computing framework, to its roster of projects. This move aims to simplify AI workload management and improve efficiency across a range of applications. Ray will work alongside PyTorch and vLLM, offering a cohesive environment for developers.
Google is teaming up with Meta to create “TorchTPU,” a software initiative designed to help developers transition from Nvidia's hardware to Google's TPUs while using the PyTorch framework. This collaboration aims to reduce reliance on Nvidia's software tools. Google is also planning to sell TPUs worth billions to Meta.
ExecuTorch is a tool for deploying AI models directly on devices like smartphones and microcontrollers without needing intermediate format conversions. It supports various hardware backends and simplifies the process of exporting, optimizing, and running models with familiar PyTorch APIs. This makes it easier for developers to implement on-device AI across multiple platforms.
PyTorch Day France on May 7 in Paris marks the inaugural event in a new international series aimed at showcasing advancements in open source AI and fostering community collaboration. Attendees will hear from industry leaders and participate in technical sessions covering a range of AI topics, alongside the GOSIM AI Paris event. Registration is free with a special code for access to all sessions.
PyTorch Conference 2025 will take place in San Francisco on October 22-23, featuring keynotes, technical sessions, and workshops dedicated to AI advancements. The event includes a range of summits on topics like measuring intelligence and AI infrastructure, as well as training and certification opportunities. Attendees will connect with leaders and innovators in the AI community.
PyTorch has evolved from an AI research framework to a foundational tool for production and generative AI, supported by major industry players. The PyTorch Foundation is expanding to encompass a broader ecosystem, addressing current challenges in AI while aiming to establish itself as the "Open Language of AI." Future initiatives will focus on improving performance, model deployment, and fostering a diverse community around AI development.
The article introduces the PyTorch Native Agentic Stack, a new framework designed to enhance the development of AI applications by providing a more efficient and integrated approach to leveraging PyTorch's capabilities. It emphasizes the stack's ability to simplify the implementation of agent-based systems and improve overall performance in machine learning tasks.
PyTorch Conference 2025 will take place in San Francisco from October 22-23, featuring keynotes, workshops, and technical sessions focused on advancements in AI. The event includes co-located summits and the launch of PyTorch training and certification, aimed at connecting AI innovators and practitioners. Session recordings and presentation slides will be available for attendees to review after the conference.
The article discusses the implementation of Andrej Karpathy's original recurrent neural network (RNN) code using PyTorch, emphasizing hands-on coding to understand RNNs better. It also highlights the differences in dataset formatting for training RNNs compared to transformer-based language models. Future posts will delve deeper into the author's personal implementations of RNNs.
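To make the dataset-formatting point concrete, here is a minimal sketch of one common way to prepare text for next-character RNN training: each target sequence is the input sequence shifted left by one character. This is an illustrative assumption, not the article's own code, and the author's exact formatting may differ:

```python
def make_rnn_pairs(text, seq_len):
    """Build (input, target) training pairs for next-character prediction.

    The target at each timestep is the character that follows the
    corresponding input character, i.e. the input shifted by one.
    """
    pairs = []
    for i in range(len(text) - seq_len):
        x = text[i : i + seq_len]
        y = text[i + 1 : i + seq_len + 1]  # shifted by one character
        pairs.append((x, y))
    return pairs


pairs = make_rnn_pairs("hello world", 4)
print(pairs[0])  # ('hell', 'ello')
```

In training, each character of `x` would be mapped to an index, fed through the RNN one step at a time while the hidden state carries context forward, and the per-step outputs would be compared against the corresponding characters of `y`.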