A simple implementation of a neural network for the MNIST handwritten digit classification task using PyTorch.
This project implements a basic neural network architecture to classify handwritten digits from the MNIST dataset. The network features:
- Two linear layers with ReLU activation in between
- Cross-entropy loss (softmax is applied internally by the loss function)
- PyTorch implementation
The neural network consists of:
- Input layer (784 neurons - flattened 28x28 MNIST images)
- First linear layer with ReLU activation
- Second linear layer
- Softmax layer (part of cross-entropy loss)
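The architecture above can be sketched as a small PyTorch module. The hidden-layer width (128 here) is an assumption, since the description does not specify it; the notebook may use a different size:

```python
import torch
import torch.nn as nn

class SimpleNN(nn.Module):
    """Two linear layers with ReLU in between, as described above.
    hidden_size=128 is an illustrative choice, not taken from the notebook."""

    def __init__(self, hidden_size=128):
        super().__init__()
        self.fc1 = nn.Linear(28 * 28, hidden_size)  # input: flattened 28x28 image
        self.fc2 = nn.Linear(hidden_size, 10)       # output: one logit per digit class

    def forward(self, x):
        x = x.view(x.size(0), -1)        # flatten to (batch, 784)
        x = torch.relu(self.fc1(x))      # first linear layer + ReLU
        return self.fc2(x)               # raw logits; CrossEntropyLoss applies softmax

model = SimpleNN()
logits = model(torch.randn(4, 1, 28, 28))  # batch of 4 random stand-in images
print(logits.shape)  # torch.Size([4, 10])
```

Note that the network returns raw logits rather than probabilities: `nn.CrossEntropyLoss` combines log-softmax and negative log-likelihood, which is why no explicit softmax layer appears in the module.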
Requirements:
- Python 3.x
- PyTorch
- torchvision
- numpy
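Assuming a standard Python environment, the dependencies above can typically be installed with pip (this project does not pin specific versions):

```shell
# Install PyTorch, torchvision, and NumPy from PyPI
pip install torch torchvision numpy
```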
The project is implemented in a Jupyter notebook (MNIST_Simple_NN.ipynb). To run the project:
- Install the required dependencies
- Open the Jupyter notebook
- Run the cells sequentially to:
  - Load and preprocess the MNIST dataset
  - Create and train the neural network
  - Evaluate the model's performance
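The train-and-evaluate steps above can be sketched as follows. To keep the sketch self-contained, random tensors stand in for real MNIST batches (the notebook itself would load data via `torchvision.datasets.MNIST`); the hidden size, learning rate, and batch size are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Two-layer network matching the architecture described above
# (hidden size 128 is an assumption, not taken from the notebook).
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
criterion = nn.CrossEntropyLoss()  # applies log-softmax internally
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Random stand-ins for one MNIST batch of 64 images and labels.
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

# One training step: forward pass, loss, backward pass, parameter update.
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()

# Evaluation: accuracy of predicted classes on the (fake) batch.
with torch.no_grad():
    preds = model(images).argmax(dim=1)
    accuracy = (preds == labels).float().mean().item()
print(f"loss={loss.item():.3f}, accuracy={accuracy:.2f}")
```

In the notebook, this step would run inside a loop over a `DataLoader` for multiple epochs, with evaluation performed on the held-out MNIST test split rather than the training batch.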
The model demonstrates the basics of deep learning by achieving reasonable accuracy on the MNIST dataset, and serves as a good starting point for understanding neural networks and their implementation in PyTorch.
This project is open source and available under the MIT License.