This project implements a simple neural network from scratch using NumPy. The network is trained on a spiral dataset to classify points into three classes. It includes fundamental components such as dense layers, activation functions, an optimizer (Adam), and a loss function (categorical cross-entropy).
⬇️ Check out how I used what I learned in this project to build something more complex and meaningful here: https://github.com/bryson32/DeepLearning-LipReader
- Fully connected (dense) layers
- ReLU and Softmax activation functions (a short sketch of these follows this list)
- Adam optimizer for weight updates
- Categorical cross-entropy loss
- Backpropagation for gradient calculation
- Simple training loop with accuracy tracking
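The activations are standard: ReLU clips negative values to zero, and Softmax turns the final layer's scores into per-class probabilities. Here is a minimal, function-style sketch of that math; the project implements these as classes, so the `relu`/`softmax` names below are illustrative only:

```python
import numpy as np

def relu(inputs):
    # ReLU zeroes out negative values element-wise.
    return np.maximum(0, inputs)

def softmax(inputs):
    # Subtract the row-wise max for numerical stability before exponentiating.
    exp_values = np.exp(inputs - np.max(inputs, axis=1, keepdims=True))
    # Normalize each row so the outputs form a probability distribution.
    return exp_values / np.sum(exp_values, axis=1, keepdims=True)

# Example: a batch of two samples with three class scores each.
scores = np.array([[1.0, -2.0, 0.5],
                   [0.0,  3.0, 1.0]])
print(relu(scores))
print(softmax(scores))  # each row sums to 1
```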
Ensure you have the following dependencies installed:
```bash
pip install numpy matplotlib nnfs
```

Run the script to train the neural network:

```bash
python core.py
```

The model trains for 10,000 epochs and prints the loss and accuracy every 1,000 epochs.
`core.py` defines the following components:

- `Layer_Dense`: Implements a fully connected layer.
- `Activation_ReLU`: Implements the ReLU activation function.
- `Activation_Softmax`: Implements the Softmax activation function.
- `Loss_crossentropy`: Computes categorical cross-entropy loss.
- `Optimizer_Adam`: Implements the Adam optimization algorithm.
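As an illustration of the first of these, a dense layer in this style typically stores its weights and biases and exposes `forward`/`backward` methods. The sketch below uses a hypothetical class name and standard NumPy conventions; the actual `Layer_Dense` in `core.py` may differ in its details.

```python
import numpy as np

# A minimal sketch of a fully connected layer (hypothetical name; the
# attributes and method names in core.py's Layer_Dense may differ).
class DenseLayerSketch:
    def __init__(self, n_inputs, n_neurons):
        # Small random weights and zero biases.
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # Keep the inputs; they are needed for the weight gradient.
        self.inputs = inputs
        self.output = inputs @ self.weights + self.biases

    def backward(self, dvalues):
        # Gradients of the loss w.r.t. weights, biases, and inputs.
        self.dweights = self.inputs.T @ dvalues
        self.dbiases = np.sum(dvalues, axis=0, keepdims=True)
        self.dinputs = dvalues @ self.weights.T
```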
During training, the script performs the following steps:

- Generate spiral data using `nnfs.datasets.spiral_data()`.
- Pass the data through two hidden layers with ReLU activations.
- Use Softmax for the output layer to produce class probabilities.
- Compute loss using categorical cross-entropy.
- Perform backpropagation and update the weights with the Adam optimizer (a simplified end-to-end sketch follows this list).
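To make the sequence above concrete, here is a self-contained sketch of the same training loop written with raw NumPy arrays instead of the project's classes. It uses a single hidden layer and a plain gradient-descent update in place of Adam purely to keep the example short, so it approximates rather than reproduces what `core.py` does:

```python
import numpy as np
import nnfs
from nnfs.datasets import spiral_data

nnfs.init()

# 300 points in 3 spiral arms: X has shape (300, 2), y holds integer class labels.
X, y = spiral_data(samples=100, classes=3)

# Tiny network: one hidden ReLU layer and a softmax output layer
# (the real core.py uses two hidden layers and the Adam optimizer).
rng = np.random.default_rng(0)
W1, b1 = 0.01 * rng.standard_normal((2, 64)), np.zeros((1, 64))
W2, b2 = 0.01 * rng.standard_normal((64, 3)), np.zeros((1, 3))
lr = 1.0

for epoch in range(10001):
    # Forward pass.
    z1 = X @ W1 + b1
    a1 = np.maximum(0, z1)                            # ReLU
    z2 = a1 @ W2 + b2
    exp = np.exp(z2 - z2.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)      # softmax probabilities

    # Categorical cross-entropy loss and accuracy.
    correct = np.clip(probs[np.arange(len(y)), y], 1e-7, 1 - 1e-7)
    loss = -np.mean(np.log(correct))
    acc = np.mean(np.argmax(probs, axis=1) == y)
    if epoch % 1000 == 0:
        print(f"epoch {epoch}: loss {loss:.3f}, acc {acc:.3f}")

    # Backward pass: combined softmax + cross-entropy gradient.
    dz2 = probs.copy()
    dz2[np.arange(len(y)), y] -= 1
    dz2 /= len(y)
    dW2, db2 = a1.T @ dz2, dz2.sum(axis=0, keepdims=True)
    da1 = dz2 @ W2.T
    dz1 = da1 * (z1 > 0)                              # ReLU gradient
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

    # Plain gradient-descent update (the project itself uses Adam).
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2
```

The real script follows the same forward → loss → backward → update cycle, but with the class-based layers listed above and Adam's momentum and adaptive learning rates for the weight updates.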