
gpt-training

Here are 2 public repositories matching this topic.


This notebook builds a complete GPT (Generative Pre-trained Transformer) model from scratch using PyTorch. It covers tokenization, self-attention, multi-head attention, transformer blocks, and text generation, all explained step-by-step with a simple nursery rhyme corpus.

  • Updated Dec 16, 2025
  • Jupyter Notebook
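The self-attention step mentioned in the description is the core of any from-scratch GPT. As a rough illustration of what such a notebook typically builds, here is a minimal single-head causal self-attention module in PyTorch; the class and parameter names (`CausalSelfAttention`, `embed_dim`, `block_size`) are illustrative assumptions, not taken from the repository itself.

```python
# A minimal sketch of single-head causal self-attention, the building
# block a from-scratch GPT notebook usually implements first.
# Names here are illustrative, not drawn from the repository.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Single-head causal self-attention over a token sequence."""

    def __init__(self, embed_dim: int, block_size: int):
        super().__init__()
        self.query = nn.Linear(embed_dim, embed_dim, bias=False)
        self.key = nn.Linear(embed_dim, embed_dim, bias=False)
        self.value = nn.Linear(embed_dim, embed_dim, bias=False)
        # Lower-triangular mask so each token attends only to earlier tokens.
        self.register_buffer(
            "mask", torch.tril(torch.ones(block_size, block_size))
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape  # batch, sequence length, embedding dim
        q, k, v = self.query(x), self.key(x), self.value(x)
        # Scaled dot-product attention scores: (B, T, T).
        att = (q @ k.transpose(-2, -1)) / math.sqrt(C)
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return att @ v  # weighted sum of value vectors: (B, T, C)

# Quick shape check on random data.
attn = CausalSelfAttention(embed_dim=32, block_size=16)
out = attn(torch.randn(2, 16, 32))
print(out.shape)  # torch.Size([2, 16, 32])
```

A multi-head variant splits the embedding into several such heads run in parallel and concatenates their outputs; stacking this module with a feed-forward layer and residual connections yields the transformer block the description refers to.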
