Open-source machine learning
About PyTorch 2.0
The open-source PyTorch project is among the most widely used technologies for machine learning (ML) training. Originally started by Meta, PyTorch reached its 1.0 release in 2018 and has benefited from years of incremental improvements since.
A goal of the PyTorch project is to make training and deployment of state-of-the-art transformer models easier and faster.
Transformers are the foundational technology that has helped to enable the modern era of generative AI, including OpenAI’s models such as GPT-3 (and now GPT-4). PyTorch 2.0’s accelerated transformers provide high-performance support for training and inference, using a custom kernel architecture for an approach known as scaled dot product attention (SDPA).
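As a minimal sketch, the SDPA kernels are exposed through the `torch.nn.functional.scaled_dot_product_attention` API added in PyTorch 2.0; the tensor shapes below are purely illustrative:

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: batch of 2 sequences, 4 attention heads,
# 8 tokens per sequence, head dimension 16.
query = torch.randn(2, 4, 8, 16)
key = torch.randn(2, 4, 8, 16)
value = torch.randn(2, 4, 8, 16)

# PyTorch 2.0 dispatches this call to a fused high-performance kernel
# when one is available for the current hardware and input shapes.
# is_causal=True applies the causal mask used by autoregressive
# transformers such as GPT.
out = F.scaled_dot_product_attention(query, key, value, is_causal=True)

# The output has the same shape as the query tensor.
print(out.shape)  # torch.Size([2, 4, 8, 16])
```

Because the call selects an appropriate backend automatically, model code written against this one function can benefit from kernel-level optimizations without change.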