This project explores optimization techniques in deep learning, using a Multi-Layer Perceptron (MLP) for image classification. The following techniques are covered:
- Backpropagation
- Learning Rate Scheduling
- Reduce on Plateau
- Adam Optimizer
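Since the stack listed below is plain NumPy (no deep learning framework), the Adam update can be sketched from scratch. This is an illustrative sketch, not the project's actual code; the function name `adam_update` and its state dictionary layout are assumptions.

```python
import numpy as np

def adam_update(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array (illustrative sketch).

    state holds the step count `t` and the running moment estimates
    `m` (mean of gradients) and `v` (mean of squared gradients).
    """
    state["t"] += 1
    # Exponential moving averages of the gradient and its square
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    # Bias correction compensates for the zero-initialized moments
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return param - lr * m_hat / (np.sqrt(v_hat) + eps)
```

The per-parameter scaling by `sqrt(v_hat)` is what lets Adam make progress where plain gradient descent with a single global learning rate stalls.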
Intel Image Classification Dataset: https://www.kaggle.com/datasets/puneet6060/intel-image-classification
    VisionOptim/
    ├── dataset/    (ignored)
    ├── visuals/
    └── notebook/
All training graphs are saved in the visuals/ folder.
- Optimization techniques significantly impact both convergence speed and final accuracy
- Adam converges faster and more reliably than plain gradient descent in these experiments
- Careful learning rate tuning is critical in deep learning
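One way the learning rate was tuned here is the Reduce on Plateau strategy listed above: cut the rate when validation loss stops improving. A minimal from-scratch sketch (the class name, `factor`, and `patience` parameters mirror common scheduler APIs and are assumptions, not the project's actual code):

```python
class ReduceLROnPlateau:
    """Multiply the learning rate by `factor` after `patience`
    consecutive epochs without validation-loss improvement (sketch)."""

    def __init__(self, lr=0.1, factor=0.5, patience=3, min_lr=1e-6):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.min_lr = min_lr
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Call once per epoch with the validation loss; returns the lr to use."""
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs >= self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.bad_epochs = 0
        return self.lr
```

Keeping the rate high while the loss is still falling, and shrinking it only on a plateau, avoids the manual trial-and-error that a fixed schedule requires.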
- Python
- NumPy
- OpenCV
- Matplotlib
MIT License