An automatic gradient engine (autograd) that implements backpropagation (reverse-mode automatic differentiation) over a directed acyclic graph (DAG), together with a small PyTorch-like API.
Based on Andrej Karpathy's tutorial *The spelled-out intro to neural networks and backpropagation: building micrograd*.
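To make the idea concrete, here is a minimal, self-contained sketch of the kind of scalar autograd engine described above (not the project's actual `engine.py`, just an illustration of the technique): each `Value` records the operation that produced it, and `backward()` topologically sorts the DAG and applies the chain rule in reverse.

```python
import math

class Value:
    """Scalar that records the operations producing it, so gradients
    can be propagated backward through the resulting DAG."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad           # d(a+b)/da = 1
            other.grad += out.grad          # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t ** 2) * out.grad  # d tanh(x)/dx = 1 - tanh^2(x)
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then apply the chain rule in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# y = a * b + c, so dy/da = b, dy/db = a, dy/dc = 1
a, b, c = Value(2.0), Value(-3.0), Value(10.0)
y = a * b + c
y.backward()
print(y.data, a.grad, b.grad, c.grad)  # 4.0 -3.0 2.0 1.0
```

Gradients accumulate with `+=` rather than `=` so that a node feeding into several downstream operations receives contributions from every path through the graph.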
This project uses uv to manage packages and virtual environments.
```bash
# Navigate to the project directory
cd micrograd-from-scratch

# Sync dependencies and initialize the environment
uv sync
```

Check out the `notebooks/` directory for detailed lecture notes and step-by-step code walkthroughs. To test the engine locally with a sample gradient descent loop, run:

```bash
uv run python src/test/test.py
```

- Scalar-valued Autograd: Supports fundamental operations including addition, multiplication, and power, as well as activation functions like `tanh` and `exp`.
- Neural Network Library: Provides `Neuron`, `Layer`, and `MLP` classes for building and training modular neural networks.
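As an illustration of how these classes compose, here is a hypothetical forward-pass-only sketch over plain floats (class and constructor names mirror the library, but the shapes and internals are assumptions; in the real library the parameters are `Value` objects, so calling `backward()` on a loss populates their gradients):

```python
import math
import random

class Neuron:
    """One tanh neuron: weighted sum of inputs plus a bias."""
    def __init__(self, nin):
        self.w = [random.uniform(-1, 1) for _ in range(nin)]
        self.b = 0.0
    def __call__(self, x):
        return math.tanh(sum(wi * xi for wi, xi in zip(self.w, x)) + self.b)

class Layer:
    """A list of neurons that all see the same input vector."""
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]
    def __call__(self, x):
        out = [n(x) for n in self.neurons]
        return out[0] if len(out) == 1 else out

class MLP:
    """Layers chained so each layer's output feeds the next."""
    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]
    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

model = MLP(3, [4, 4, 1])      # 3 inputs, two hidden layers of 4, 1 output
print(model([2.0, 3.0, -1.0])) # a single scalar in (-1, 1)
```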
- `notebooks/micrograd_notebook.ipynb`: Notes on the fundamentals of backpropagation and building an autograd engine from scratch, focusing on the `Value` class and manual gradient calculations.
- `notebooks/mlp_implementation.ipynb`: Builds on the autograd engine to implement a full neural network library, covering the construction of neurons, layers, and a multi-layer perceptron (MLP) for binary classification.
- `src/micrograd/engine.py`: Core `Value` class and backpropagation logic.
- `src/micrograd/nn.py`: Neural network implementation including `Neuron`, `Layer`, and `MLP`.
- `src/test/test.py`: Local test suite for verifying model convergence.
