Micro-Grad

Micro-Grad is an automatic differentiation (autograd) engine written from scratch in Python. It provides the fundamental building blocks for creating and training simple neural networks, offering a clear view into the mechanics of backpropagation. This project is inspired by Andrej Karpathy's micrograd.

Features

  • Scalar Autograd Engine: Implements a Value object that tracks gradients for scalar values (see the usage sketch after this list).
  • Dynamic Computation Graph: Automatically builds a graph of operations for backpropagation.
  • Neural Network Primitives: Includes Neuron, Layer, and MLP classes to construct simple feed-forward networks.
  • Optimization: Provides an SGD optimizer and a StepLRScheduler for learning rate decay.
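
The snippet below is a minimal usage sketch of the scalar engine. It assumes a micrograd-style interface (a Value constructor taking a float, a .data attribute, a .grad attribute, and a backward() method, as described under Core Components); the exact names in engine.py may differ slightly.

```python
# Minimal sketch of scalar autograd usage, assuming a micrograd-style API:
# Value(data), .data, .grad, .backward(). Exact names may differ in engine.py.
from engine import Value

a = Value(2.0)
b = Value(-3.0)

# Arithmetic operators are overloaded, so this expression builds a
# computation graph behind the scenes.
c = a * b + a / b
d = (c - Value(1.0)) * (c + Value(1.0))

# backward() topologically sorts the graph and applies the chain rule,
# filling in .grad for every node that contributed to d.
d.backward()

print(d.data)           # forward value of the expression
print(a.grad, b.grad)   # dd/da and dd/db
```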

Core Components

  • engine.py: The heart of the library, containing the Value class. It overloads standard arithmetic operations (+, *, -, /) to build a computation graph. The backward() method performs a topological sort on this graph to compute gradients for all nodes using the chain rule.
  • neuron.py, layer.py, MLP.py: These files define the neural network architecture.
    • A Neuron has weights and a bias.
    • A Layer is a collection of neurons.
    • An MLP (Multi-Layer Perceptron) stacks multiple layers to form a network.
  • optimizer.py & LRScheduler.py: These handle the model's weight updates. The SGD optimizer adjusts parameters based on their gradients, and StepLRScheduler can be used to adapt the learning rate during training (see the training sketch after this list).
  • loss_functions.py: Contains loss functions like MSELoss (Mean Squared Error) to evaluate model performance.
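
The sketch below ties these components into a small training loop. The constructor signatures (layer sizes, lr, step_size, gamma) and the zero_grad()/step()/parameters() helpers are assumptions based on common micrograd-style conventions, not the repository's documented API.

```python
# Hedged training-loop sketch; signatures and helper names are assumptions.
from MLP import MLP
from loss_functions import MSELoss
from optimizer import SGD
from LRScheduler import StepLRScheduler

# Tiny toy dataset: 3 inputs -> 1 scalar target per sample.
xs = [[2.0, 3.0, -1.0], [3.0, -1.0, 0.5], [0.5, 1.0, 1.0], [1.0, 1.0, -1.0]]
ys = [1.0, -1.0, -1.0, 1.0]

model = MLP(3, [4, 4, 1])                              # assumed: 3 inputs, two hidden layers, 1 output
loss_fn = MSELoss()
opt = SGD(model.parameters(), lr=0.05)                 # assumed signature
sched = StepLRScheduler(opt, step_size=20, gamma=0.5)  # assumed: halve lr every 20 epochs

for epoch in range(60):
    preds = [model(x) for x in xs]    # forward pass builds the graph
    loss = loss_fn(preds, ys)         # mean squared error over the batch

    opt.zero_grad()                   # reset gradients before backprop (assumed helper)
    loss.backward()                   # backpropagate through the dynamic graph
    opt.step()                        # SGD parameter update
    sched.step()                      # decay the learning rate on schedule

    print(epoch, loss.data)
```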

References

  • Andrej Karpathy, micrograd: a tiny autograd engine in Python (YouTube lecture).
