This is a small, educational implementation of reverse-mode automatic differentiation (autograd) inspired by Andrej Karpathy's "micrograd" / lecture material. The project lives in the Jupyter notebook autograd.ipynb and demonstrates a minimal Value class that builds a computation graph, computes gradients via backpropagation, visualizes the graph with Graphviz, and uses the primitives to implement a tiny MLP and a simple training loop.
- Minimal `Value` scalar class with data, grad, and backward capability
- Elementary operators (+, -, *, /, pow), `tanh` and `exp`, with backward propagation
- Graph tracing and SVG visualization using Graphviz
- Simple `Neuron`, `Layer`, and `MLP` classes that use `Value` objects
- Example training loop showing gradient descent on a tiny dataset
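For orientation, the heart of the project is the `Value` object and its `backward()` pass. Below is a minimal, hedged sketch of what such a class typically looks like in micrograd-style code; the actual implementation lives in autograd.ipynb and may differ in details such as the full set of supported operators.

```python
import math

class Value:
    """Scalar that records the operations producing it, so gradients can be backpropagated."""
    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # closure that pushes grad to the children
        self._prev = set(_children)     # nodes this value was computed from
        self._op = _op                  # op label, useful for graph drawing

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other), '+')
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other), '*')
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,), 'tanh')
        def _backward():
            self.grad += (1 - t ** 2) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # build a topological order, then apply the chain rule from the output back
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()
```

The key idea is that every operation returns a new `Value` that remembers its inputs and a local-derivative closure (`_backward`), so `backward()` only has to visit nodes in reverse topological order and apply the chain rule.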
- Python 3.8 or higher
- pip packages: `numpy`, `matplotlib`, `graphviz`, `jupyter`
- Graphviz system binary (required to render the computation graph images)
On Windows you can install the Python packages and (optionally) the Graphviz binary as follows:
PowerShell (recommended):
# create and activate a virtual environment
python -m venv .venv
.\.venv\Scripts\Activate.ps1
# install required Python packages
pip install numpy matplotlib graphviz jupyter
# (optional) install Graphviz system binary if you don't have it
# If you use Chocolatey:
choco install graphviz
# Otherwise download the installer from https://graphviz.org/download/ and add Graphviz's bin/ folder to your PATH

If you prefer using a requirements.txt, create one with:
numpy
matplotlib
graphviz
jupyter
and then run pip install -r requirements.txt after activating your venv.
- Activate your virtual environment (see above).
- From the repository folder, start Jupyter:
jupyter notebook autograd.ipynb

- Run the notebook cells in order. The notebook contains:
- Small examples of functions and plotting
- The `Value` class implementation (autodiff primitives)
- Graph tracing and drawing (`draw_dot`) using the `graphviz` Python package
- A small MLP example and a short training loop showing gradient updates
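To give a sense of how the drawing step works, `draw_dot`-style functions generally walk the graph through each node's recorded parents and emit Graphviz nodes and edges. A rough sketch, assuming the `_prev` and `_op` attributes from the `Value` sketch above rather than the notebook's exact function:

```python
from graphviz import Digraph

def trace(root):
    # collect all nodes and edges reachable from the output node
    nodes, edges = set(), set()
    def build(v):
        if v not in nodes:
            nodes.add(v)
            for child in v._prev:
                edges.add((child, v))
                build(child)
    build(root)
    return nodes, edges

def draw_dot(root):
    dot = Digraph(format='svg', graph_attr={'rankdir': 'LR'})  # left-to-right layout
    nodes, edges = trace(root)
    for n in nodes:
        uid = str(id(n))
        dot.node(uid, label=f"data {n.data:.4f} | grad {n.grad:.4f}", shape='record')
        if n._op:
            dot.node(uid + n._op, label=n._op)  # small op node feeding the value node
            dot.edge(uid + n._op, uid)
    for child, parent in edges:
        dot.edge(str(id(child)), str(id(parent)) + parent._op)
    return dot
```

Returning the `Digraph` lets Jupyter render it inline; either way the Graphviz system binary must be on your PATH, which is why it is listed in the requirements above.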
Key notebook sections (top-to-bottom):
- Basic function example and plotting using NumPy and Matplotlib
- `Value` class implementation: data, grad, ops, and `backward()`
- Graph tracing and the `draw_dot` function (Graphviz-based visualization)
- Simple forward example showcasing `tanh` and `exp` variants
- Definitions of `Neuron`, `Layer`, and `MLP`
- Tiny dataset and a short training loop demonstrating loss, backward, and parameter updates
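For completeness, here is a hedged sketch of how `Neuron`, `Layer`, and `MLP` are conventionally built on top of the `Value` sketch above (the notebook's definitions may differ slightly):

```python
import random

class Neuron:
    def __init__(self, nin):
        # one weight per input plus a bias, all stored as Value objects
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(random.uniform(-1, 1))

    def __call__(self, x):
        # weighted sum of the inputs followed by a tanh nonlinearity
        act = sum((wi * xi for wi, xi in zip(self.w, x)), self.b)
        return act.tanh()

    def parameters(self):
        return self.w + [self.b]

class Layer:
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]

    def __call__(self, x):
        outs = [n(x) for n in self.neurons]
        return outs[0] if len(outs) == 1 else outs

    def parameters(self):
        return [p for n in self.neurons for p in n.parameters()]

class MLP:
    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

    def parameters(self):
        return [p for layer in self.layers for p in layer.parameters()]
```

An instance such as `MLP(3, [4, 4, 1])` then plugs directly into a training loop like the one shown below.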
The notebook shows a training loop similar to the following (run inside the notebook):
# forward pass
ypred = [n(x) for x in xs]
loss = sum((yout - ygt)**2 for ygt, yout in zip(ys, ypred))
# backward pass (zero grads first)
for p in n.parameters():
    p.grad = 0.0
loss.backward()

# SGD update
for p in n.parameters():
    p.data += -0.1 * p.grad

Implementation and pedagogy are heavily inspired by Andrej Karpathy's micrograd / autograd lecture material. Thanks to those resources for the clear, educational explanation.