This repository presents a complete end-to-end deep learning pipeline for high-accuracy flower species classification.
The project leverages transfer learning and fine-tuning with the MobileNetV2 architecture to build a robust image classifier, even with a relatively limited dataset.
The implementation demonstrates best practices in data pipeline optimization, model training strategy, and performance evaluation.
The model is trained on a multi-class image dataset containing five flower categories:
- Daisy
- Roses
- Sunflowers
- Tulips
- Dandelion
- Built using `tf.data.Dataset` for scalable, high-performance data loading
- Includes image decoding, preprocessing, batching, and prefetching
- Eliminates I/O bottlenecks during training
- Interactive visualizations using Plotly
- Class distribution and dataset balance analysis
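A minimal sketch of such an input pipeline, assuming the 224×224 MobileNetV2 input size used by the project. In-memory stand-in images replace the on-disk decoding step (in the notebook, images are read from Google Drive and decoded with `tf.io`), so the exact file paths are not reproduced here:

```python
import numpy as np
import tensorflow as tf

IMG_SIZE = (224, 224)
BATCH_SIZE = 32
AUTOTUNE = tf.data.AUTOTUNE

def preprocess(image, label):
    # Resize to the model's input size and scale pixels to [0, 1]
    image = tf.image.resize(image, IMG_SIZE)
    image = tf.cast(image, tf.float32) / 255.0
    return image, label

# Stand-in for images decoded from disk (e.g. via tf.io.decode_jpeg)
images = np.random.randint(0, 256, size=(8, 300, 300, 3), dtype=np.uint8)
labels = np.random.randint(0, 5, size=(8,))

dataset = (
    tf.data.Dataset.from_tensor_slices((images, labels))
    .map(preprocess, num_parallel_calls=AUTOTUNE)  # parallel preprocessing
    .batch(BATCH_SIZE)
    .prefetch(AUTOTUNE)  # overlap preprocessing with training to hide I/O latency
)
```

`prefetch(AUTOTUNE)` is what removes the I/O bottleneck: the CPU prepares the next batch while the accelerator trains on the current one.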
- Feature Extractor: MobileNetV2 (pretrained on ImageNet)
- Custom classification head:
  - GlobalAveragePooling2D
  - Dense output layer (Softmax)
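This architecture can be sketched in Keras as follows. Note `weights=None` is used here only so the example runs offline; the project loads the pretrained ImageNet weights. The 128-unit hidden layer matches the Architecture Summary below:

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import MobileNetV2

# weights=None keeps this sketch offline; the project uses weights="imagenet".
base = MobileNetV2(input_shape=(224, 224, 3), include_top=False, weights=None)
base.trainable = False  # frozen during the feature-extraction phase

model = models.Sequential([
    base,                                   # MobileNetV2 feature extractor
    layers.GlobalAveragePooling2D(),        # collapse spatial dims to a vector
    layers.Dense(128, activation="relu"),
    layers.Dense(5, activation="softmax"),  # one unit per flower class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```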
Phase 1 — Feature Extraction
- Base model frozen
- Training classification head only
Phase 2 — Fine-Tuning
- Top layers of MobileNetV2 unfrozen
- Very low learning rate
- Improved domain-specific feature learning
The two-stage training strategy significantly improved generalization and predictive performance.
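The two phases can be outlined as below. This is a hedged sketch, not the notebook's exact code: the layer cutoff, learning rates, and the dataset names `train_ds`/`val_ds` are assumptions, and `weights=None` keeps the example offline:

```python
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers
from tensorflow.keras.applications import MobileNetV2

base = MobileNetV2(input_shape=(224, 224, 3), include_top=False, weights=None)
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(5, activation="softmax"),
])

# Phase 1 -- feature extraction: base frozen, only the head is trained.
base.trainable = False
model.compile(optimizer=optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Phase 2 -- fine-tuning: unfreeze only the top layers, use a very low LR.
base.trainable = True
for layer in base.layers[:-30]:  # assumed cutoff; tune how many layers to unfreeze
    layer.trainable = False
model.compile(optimizer=optimizers.Adam(learning_rate=1e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Recompiling after changing `trainable` is required for the change to take effect; the very low learning rate in phase 2 prevents the pretrained features from being destroyed.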
| Metric | Training | Validation |
|---|---|---|
| Final Accuracy | ~99% | ~91% |
| Final Loss | ~0.05 | ~0.25 |
✔ Smooth convergence of training and validation loss
✔ Stable accuracy improvement across epochs
✔ No severe overfitting after fine-tuning
✔ Strong generalization on unseen images
# Architecture Summary
- Input (224, 224, 3)
- MobileNetV2 (Base)
- GlobalAveragePooling2D
- Dense (128 units, ReLU)
- Dense (5 units, Softmax)

# Project Goals
- Demonstrate a practical transfer learning workflow
- Implement optimized TensorFlow input pipelines
- Apply a structured fine-tuning strategy
- Provide transparent evaluation and visualization
- Environment: Open the `.ipynb` file in Google Colab or any Jupyter environment.
- Dependencies: Ensure `tensorflow`, `plotly`, `pandas`, and `matplotlib` are installed.
- Data: The notebook expects images stored in Google Drive; update the `drive.mount` path to match your setup.
Samir Mohamed, AI and Computer Vision Engineer.