This project implements a fully connected neural network from scratch in Python using NumPy, trained to classify handwritten digits from the MNIST dataset.
- Data preprocessing: Normalizes pixel values and one-hot encodes labels.
- Custom neural network framework: Implements `Linear` layers, activation functions (`ReLU`, `Sigmoid`, `SoftMax`), and loss functions (`MSE`, `NLL`).
- Forward and backward propagation: Manual computation of gradients and weight updates.
- Training with SGD: Stochastic Gradient Descent with mini-batch support.
- Evaluation: Computes accuracy on the development set during training.
- Modular design: All layers and losses are implemented as reusable modules in a `Sequential` container.
- Flexible loss selection: Choose `MSE` for regression or `NLL` (cross-entropy) for classification tasks.
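The preprocessing step can be sketched as follows. This is a minimal illustration, not the project's actual code; the function name `preprocess` is chosen here for the example.

```python
import numpy as np

def preprocess(images, labels, num_classes=10):
    """Scale pixel values to [0, 1] and one-hot encode integer labels."""
    # Normalize: MNIST pixels are uint8 values in 0-255.
    x = images.reshape(len(images), -1).astype(np.float32) / 255.0
    # One-hot encode: row i gets a 1 at column labels[i], zeros elsewhere.
    y = np.zeros((len(labels), num_classes), dtype=np.float32)
    y[np.arange(len(labels)), labels] = 1.0
    return x, y
```

Each 28x28 image is flattened to a 784-dimensional vector, which matches the input size of the first fully connected layer.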
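The manual forward/backward pass and the SGD update can be sketched roughly as below. This is a simplified single-layer version under assumed names (`Linear`, `softmax`, `train_step` are illustrative, not the repository's API); the key identity it shows is that for softmax followed by `NLL`, the gradient with respect to the logits reduces to `probs - targets`.

```python
import numpy as np

class Linear:
    """Fully connected layer with manually derived gradients."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                       # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out    # dL/dW
        self.db = grad_out.sum(axis=0)   # dL/db
        return grad_out @ self.W.T       # dL/dx, passed to earlier layers

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))   # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

def train_step(layer, x_batch, y_batch, lr=0.1):
    """One mini-batch SGD step with softmax + NLL (cross-entropy) loss."""
    probs = softmax(layer.forward(x_batch))
    loss = -np.mean(np.sum(y_batch * np.log(probs + 1e-12), axis=1))
    # Softmax + NLL combined gradient: dL/dlogits = (probs - targets) / batch size.
    grad = (probs - y_batch) / len(x_batch)
    layer.backward(grad)
    layer.W -= lr * layer.dW             # SGD parameter update
    layer.b -= lr * layer.db
    return loss
```

In the full framework the same `forward`/`backward` interface is shared by every layer, which is what lets a `Sequential` container chain them and propagate gradients from the loss back to the input.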