The simplest, fastest repository for training/finetuning medium-sized GPTs.
nanoGPT is a rewrite of minGPT that prioritizes teeth over education. Still under active development, but currently the file train.py reproduces GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in 38 hours of training. The code itself is plain and readable: train.py is a ~300-line boilerplate training loop and model.py is a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI. That's it.
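As a back-of-the-envelope sanity check on the "124M" figure, the parameter count follows directly from the GPT-2 small config (n_layer=12, n_embd=768, vocab size 50257, block size 1024), assuming the output head shares weights with the token embedding. This is an illustrative sketch, not code from the repo:

```python
def gpt2_param_count(n_layer=12, n_embd=768, vocab_size=50257, block_size=1024):
    # token and position embeddings (the lm_head is tied to wte, so it is not counted twice)
    wte = vocab_size * n_embd
    wpe = block_size * n_embd
    # one transformer block: 2 LayerNorms, attention (fused qkv + output proj), 4x-wide MLP
    ln = 2 * n_embd                                                  # weight + bias
    attn = (n_embd * 3 * n_embd + 3 * n_embd) + (n_embd * n_embd + n_embd)
    mlp = (n_embd * 4 * n_embd + 4 * n_embd) + (4 * n_embd * n_embd + n_embd)
    block = 2 * ln + attn + mlp
    # n_layer blocks, plus a final LayerNorm after the last block
    return wte + wpe + n_layer * block + ln

print(gpt2_param_count())  # 124,439,808, i.e. the "124M" of GPT-2 small
```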
nanoGPT was created by Andrej Karpathy, an AI researcher, engineer, and educator. He is the former director of AI at Tesla and a founding member of OpenAI.