jacobdell/Nueral-Network-Library
MNIST Neural Network from Scratch

This project implements a fully connected neural network from scratch in Python using NumPy, trained to classify handwritten digits from the MNIST dataset.


Features

  • Data preprocessing: Normalizes pixel values and one-hot encodes labels.
  • Custom neural network framework: Implements Linear layers, activation functions (ReLU, Sigmoid, SoftMax), and loss functions (MSE, NLL).
  • Forward and backward propagation: Manual computation of gradients and weight updates.
  • Training with SGD: Stochastic Gradient Descent with mini-batch support.
  • Evaluation: Computes accuracy on the development set during training.
  • Modular design: All layers and losses are implemented as reusable modules in a Sequential container.
  • Flexible loss selection: Choose MSE for regression or NLL (cross-entropy) for classification tasks.
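The repository's own source isn't reproduced here, but the preprocessing step described above is standard for MNIST. A minimal sketch, assuming images arrive as uint8 arrays with pixel values in 0-255 and labels as integer class indices (the function name `preprocess` is illustrative, not taken from the repo):

```python
import numpy as np

def preprocess(images, labels, num_classes=10):
    # Scale raw pixel values from [0, 255] into [0, 1]
    X = images.astype(np.float64) / 255.0
    # One-hot encode integer labels into rows of length num_classes
    Y = np.zeros((labels.size, num_classes))
    Y[np.arange(labels.size), labels] = 1.0
    return X, Y
```

Normalizing inputs keeps early-layer activations in a range where gradients behave well, and one-hot targets are what both the MSE and cross-entropy losses below expect.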
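To make the "modular design" and manual backpropagation concrete, here is one common way such a framework is structured: each module exposes `forward`, `backward`, and `step`, and a `Sequential` container chains them. This is a sketch of the general pattern, not the repository's exact API (class and method names are assumptions):

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad):
        self.dW = self.x.T @ grad       # gradient w.r.t. weights
        self.db = grad.sum(axis=0)      # gradient w.r.t. bias
        return grad @ self.W.T          # gradient w.r.t. input

    def step(self, lr):
        self.W -= lr * self.dW          # plain SGD update
        self.b -= lr * self.db

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad):
        return grad * self.mask         # pass gradient only where input was positive

    def step(self, lr):
        pass                            # no parameters to update

class Sequential:
    """Chains modules; forward runs left to right, backward right to left."""
    def __init__(self, *modules):
        self.modules = modules

    def forward(self, x):
        for m in self.modules:
            x = m.forward(x)
        return x

    def backward(self, grad):
        for m in reversed(self.modules):
            grad = m.backward(grad)
        return grad

    def step(self, lr):
        for m in self.modules:
            m.step(lr)
```

With this shape, a training iteration is just `net.forward(x)`, a loss gradient, `net.backward(grad)`, and `net.step(lr)`; mini-batching falls out of the fact that every operation is written over a leading batch dimension.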
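For the classification path (SoftMax + NLL), the usual trick is to differentiate the two together: the gradient of cross-entropy with respect to the pre-softmax logits collapses to `probs - targets`. A sketch under that convention (function names are illustrative; targets are assumed one-hot as produced above):

```python
import numpy as np

def softmax(z):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def nll_loss(probs, targets):
    # Mean negative log-likelihood over the batch (targets are one-hot)
    return -np.mean(np.sum(targets * np.log(probs + 1e-12), axis=1))

def nll_grad(probs, targets):
    # Combined softmax + NLL gradient w.r.t. the pre-softmax logits
    return (probs - targets) / targets.shape[0]
```

Feeding `nll_grad` straight into the network's backward pass avoids ever computing the softmax Jacobian explicitly, which is why cross-entropy is the natural pairing for a softmax output layer (MSE remains available for regression-style targets).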
