The simplest, fastest repository for training/finetuning medium-sized GPTs.

About nanoGPT

nanoGPT is a rewrite of minGPT that prioritizes teeth over education. It is still under active development, but currently the file train.py reproduces GPT-2 (124M) on OpenWebText, running on a single 8XA100 40GB node in 38 hours of training. The code itself is plain and readable: train.py is a ~300-line boilerplate training loop and model.py a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI. That's it.
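
To give a feel for how small the surface area is, here is a minimal sketch of loading OpenAI's GPT-2 (124M) weights through nanoGPT's model definition and sampling from it. It assumes nanoGPT's model.py is on the import path and exposes the GPT.from_pretrained and generate helpers; exact names and signatures may differ between versions of the repo.

    import torch
    import tiktoken
    from model import GPT  # nanoGPT's ~300-line model definition

    device = 'cuda' if torch.cuda.is_available() else 'cpu'

    # Load OpenAI's released GPT-2 (124M) weights into nanoGPT's GPT class.
    model = GPT.from_pretrained('gpt2')
    model.eval()
    model.to(device)

    # Encode a prompt with the GPT-2 BPE tokenizer (via tiktoken).
    enc = tiktoken.get_encoding('gpt2')
    prompt = torch.tensor([enc.encode('Hello, my name is')],
                          dtype=torch.long, device=device)

    # Autoregressively sample a short continuation and decode it back to text.
    with torch.no_grad():
        out = model.generate(prompt, max_new_tokens=20)
    print(enc.decode(out[0].tolist()))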

nanoGPT was created by Andrej Karpathy, a well-known AI researcher, engineer, and educator. He is the former director of AI at Tesla and a founding member of OpenAI.

