autograd — tiny autodiff (Karpathy-style)

This is a small, educational implementation of reverse-mode automatic differentiation (autograd) inspired by Andrej Karpathy's "micrograd" / lecture material. The project lives in the Jupyter notebook autograd.ipynb: it implements a minimal Value class that builds a computation graph and computes gradients via backpropagation, visualizes that graph with Graphviz, and uses the Value primitives to implement a tiny MLP and a simple training loop.

Features

  • Minimal Value scalar class with data, grad and backward capability (a condensed sketch follows this list)
  • Elementary operators (+, -, *, /, pow), tanh and exp, with backward propagation for each
  • Graph tracing and SVG visualization using Graphviz
  • Simple Neuron, Layer, and MLP classes that use Value objects
  • Example training loop showing gradient descent on a tiny dataset
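
For orientation, the core of the Value class looks roughly like the condensed sketch below. This is not the notebook's exact code; the real implementation also covers -, /, pow and exp, and records labels for the graph drawing.

import math

class Value:
    # a scalar that remembers how it was produced, so gradients can flow back
    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other), '+')
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other), '*')
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,), 'tanh')
        def _backward():
            self.grad += (1 - t ** 2) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order of the graph, then chain rule node by node
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()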

Requirements

  • Python 3.8 or higher
  • pip packages: numpy, matplotlib, graphviz, jupyter
  • Graphviz system binary (required to render the computation graph images)

On Windows you can install the Python packages and (optionally) the Graphviz binary as follows:

PowerShell (recommended):

# create and activate a virtual environment
python -m venv .venv
.\.venv\Scripts\Activate.ps1

# install required Python packages
pip install numpy matplotlib graphviz jupyter

# (optional) install Graphviz system binary if you don't have it
# If you use Chocolatey:
choco install graphviz
# Otherwise download the installer from https://graphviz.org/download/ and add Graphviz's bin/ folder to your PATH
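
To confirm the setup, a quick check you can run in Python from the activated venv (this is just a suggestion, not part of the notebook):

import shutil
import graphviz            # the Python bindings installed via pip

# the notebook also needs the Graphviz `dot` executable on PATH to render images
print("dot executable:", shutil.which("dot"))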

If you prefer using a requirements.txt, create one with:

numpy
matplotlib
graphviz
jupyter

and then run pip install -r requirements.txt after activating your venv.

Quick start

  1. Activate your virtual environment (see above).
  2. From the repository folder, start Jupyter:
jupyter notebook autograd.ipynb
  3. Run the notebook cells in order. The notebook contains:
    • Small examples of functions and plotting
    • The Value class implementation (autodiff primitives)
    • Graph tracing and drawing (draw_dot) using the graphviz Python package (a usage sketch follows this list)
    • A small MLP example and a short training loop showing gradient updates
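
Once the Value and draw_dot cells have run, visualizing a small expression looks roughly like this (illustrative values; the notebook's own example is a bit larger):

# assumes Value and draw_dot are already defined by earlier notebook cells
a = Value(2.0)
b = Value(-3.0)
c = (a * b).tanh()
c.backward()   # fills in .grad on every node of the graph
draw_dot(c)    # returns a graphviz Digraph that Jupyter renders inline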

Notebook overview

Key notebook sections (top-to-bottom):

  • Basic function example and plotting using NumPy and Matplotlib
  • Value class implementation: data, grad, ops and backward()
  • Graph tracing and draw_dot function (Graphviz based visualization)
  • Simple forward example showcasing tanh and exp variants
  • Definitions of Neuron, Layer, and MLP (a condensed sketch follows this list)
  • Tiny dataset and a short training loop demonstrating loss, backward, and parameter updates
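
The network classes are thin wrappers around Value. A condensed, micrograd-style sketch of what they look like (the notebook's versions may differ in details such as initialization):

import random

class Neuron:
    def __init__(self, nin):
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(random.uniform(-1, 1))

    def __call__(self, x):
        # weighted sum of inputs plus bias, squashed through tanh
        act = sum((wi * xi for wi, xi in zip(self.w, x)), self.b)
        return act.tanh()

    def parameters(self):
        return self.w + [self.b]

class Layer:
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]

    def __call__(self, x):
        outs = [neuron(x) for neuron in self.neurons]
        return outs[0] if len(outs) == 1 else outs

    def parameters(self):
        return [p for neuron in self.neurons for p in neuron.parameters()]

class MLP:
    def __init__(self, nin, nouts):
        sizes = [nin] + nouts
        self.layers = [Layer(sizes[i], sizes[i + 1]) for i in range(len(nouts))]

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

    def parameters(self):
        return [p for layer in self.layers for p in layer.parameters()]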

Example (from the notebook)

The notebook shows a training loop similar to the following (run inside the notebook):

# forward pass
ypred = [n(x) for x in xs]
loss = sum((yout - ygt)**2 for ygt, yout in zip(ys, ypred))

# backward pass (zero grads first)
for p in n.parameters():
    p.grad = 0.0
loss.backward()

# SGD update
for p in n.parameters():
    p.data += -0.1 * p.grad
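
The loop assumes an MLP n and a tiny dataset xs/ys created in earlier cells, along the lines of the lecture's example (values here are illustrative):

n = MLP(3, [4, 4, 1])        # 3 inputs, two hidden layers of 4 neurons, 1 output

xs = [
    [2.0, 3.0, -1.0],
    [3.0, -1.0, 0.5],
    [0.5, 1.0, 1.0],
    [1.0, 1.0, -1.0],
]
ys = [1.0, -1.0, -1.0, 1.0]  # desired target for each input row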

Credits

Implementation and pedagogy are heavily inspired by Andrej Karpathy's micrograd / autograd lecture material. Thanks to those resources for their clear, educational explanations.
