Trash Optimizer is a two-week project developed during Le Wagon AI & Data Science Bootcamp (Nantes 2025 Q4 batch) by a team of four students:
- Paul Baudry
- Charles Poisson
- Simon Hingant
- Daria Serbichenko
Trash Optimizer addresses two key challenges in waste management: identifying the correct waste category and locating the nearest appropriate disposal facility. This application provides an end-to-end solution for waste classification and route optimization.
- Image-Based Classification: Classifies waste items into 18 categories aligned with Nantes Metropole's waste management taxonomy
- Optimized Route Planning: Generates the shortest path to relevant collection points based on user location and waste type
- Geospatial Integration: Consolidates collection point data from multiple sources into a unified database
Machine Learning Model
- Fine-tuned EfficientNetB0 architecture for waste classification
- Trained on 3 labeled datasets comprising 18 waste categories
- Achieves 90% classification accuracy
- Extended from the base model's 5 relevant categories to the full 18-category taxonomy
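The fine-tuning setup above can be sketched with Keras transfer learning. This is a minimal illustration, not the team's actual training code: the frozen-backbone strategy, optimizer, and head layers are assumptions. `weights=None` keeps the sketch offline; actual fine-tuning would use `weights="imagenet"`.

```python
import tensorflow as tf

NUM_CLASSES = 18  # Nantes Metropole waste taxonomy

def build_classifier(num_classes: int = NUM_CLASSES) -> tf.keras.Model:
    # weights=None avoids a weight download in this sketch; use
    # weights="imagenet" for real transfer learning.
    base = tf.keras.applications.EfficientNetB0(
        include_top=False, weights=None, input_shape=(224, 224, 3)
    )
    base.trainable = False  # freeze the backbone for the first training phase
    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(base.input, outputs)
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

A second phase would typically unfreeze the top backbone blocks and continue training at a lower learning rate.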
Data Infrastructure
- Aggregated data from multiple sources: CSV files, Open Data APIs, web scraping, and geoservice APIs
- Consolidated and transformed data pipeline feeding into BigQuery
- Centralized data warehouse serving as single source of truth for collection points
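Consolidating heterogeneous sources into one schema might look like the sketch below. The field names, the unified schema, and the helper are illustrative assumptions, not the project's actual pipeline code.

```python
def normalize_point(record: dict, source: str) -> dict:
    """Map a raw collection-point record onto a unified schema (assumed fields)."""
    return {
        "name": record.get("name") or record.get("nom") or "unknown",
        "lat": float(record["lat"]),
        "lon": float(record["lon"]),
        "waste_types": record.get("waste_types", []),
        "source": source,  # provenance, e.g. "open_data_api" or "scrape"
    }

# Loading the unified rows into the warehouse would then be a single call,
# e.g. with google-cloud-bigquery (requires GCP credentials):
#   client = bigquery.Client()
#   client.load_table_from_json(rows, "project.dataset.collection_points")
```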
Route Optimization
- Batch queries collection points from database based on waste classification
- Calculates inter-point distances using geospatial algorithms
- Generates optimized routes and provides turn-by-turn navigation
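The distance step can be sketched with a haversine great-circle calculation. The record fields (`"lat"`, `"lon"`) and the nearest-point helper are assumptions for illustration; the real app batch-queries BigQuery and delegates routing to OpenRouteService.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two WGS84 points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearest_point(user: dict, points: list) -> dict:
    """Return the candidate collection point closest to the user."""
    return min(points, key=lambda p: haversine_km(
        user["lat"], user["lon"], p["lat"], p["lon"]))
```

Haversine gives straight-line distance only; the actual walking or driving route returned by OpenRouteService may rank nearby candidates differently.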
Training Datasets
- RealWaste: joebeachcapital/realwaste - UCI Machine Learning Repository (DOI)
- Recyclable and Household Waste Classification: alistairking/recyclable-and-household-waste-classification
- Custom dataset for specialized categories (batteries, light bulbs, electronics, etc.)
Collection Points Data
- Python 3.12+
- Docker (for deployment)
- API keys: Hugging Face, OpenRouteService, Google Cloud Platform
git clone https://github.com/yourusername/trash-optimizer.git
cd trash-optimizer
# Setup individual components (see component READMEs for details)
cd training # Model training
cd inference # FastAPI backend
cd webapp    # Streamlit frontend
Each component has its own .env.template file: copy it to .env and configure it with your API keys.
cd deployment
./setup.sh
docker-compose up --build
See component READMEs for detailed setup instructions.
graph TB
subgraph "User Interface"
A[Streamlit Frontend]
end
subgraph "Backend Services"
B[FastAPI Inference Service]
C[BigQuery Database]
end
subgraph "External Services"
D[OpenRouteService API]
E[Hugging Face Hub]
end
subgraph "Training Pipeline"
F[Dataset Builder]
G[Model Training]
H[Kaggle Datasets]
end
A -->|Image Upload| B
B -->|Download Model| E
A -->|Query Collection Points| C
A -->|Route Optimization| D
B -->|Classification Result| A
H -->|Download| F
F -->|Processed Dataset| G
G -->|Upload Trained Model| E
style A fill:#e1f5ff
style B fill:#ffe1e1
style C fill:#e1ffe1
style F fill:#fff4e1
style G fill:#fff4e1
Data Flow:
- User uploads waste image → Inference API classifies using model from Hugging Face Hub
- Classification result + user location → Query BigQuery for relevant collection points
- Collection points → OpenRouteService calculates optimal route → Display to user
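The routing call in the last step can be sketched as below. The OpenRouteService v2 directions endpoint and its `[lon, lat]` coordinate order match the public API; the `foot-walking` profile and the helper name are assumptions.

```python
def build_ors_request(api_key: str, start: tuple, end: tuple,
                      profile: str = "foot-walking"):
    """Build URL, headers, and JSON body for an ORS directions request.

    start/end are (lat, lon) pairs; ORS expects [lon, lat].
    """
    url = f"https://api.openrouteservice.org/v2/directions/{profile}/geojson"
    headers = {"Authorization": api_key, "Content-Type": "application/json"}
    body = {"coordinates": [[start[1], start[0]], [end[1], end[0]]]}
    return url, headers, body

# Sending it would be: requests.post(url, json=body, headers=headers)
# The GeoJSON response contains the route geometry and step-by-step directions.
```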
- Training Module: Fine-tune classification models on custom waste datasets
- Inference Backend: FastAPI service for waste classification
- Webapp Frontend: Streamlit interface for classification and route optimization
- Deployment: Docker containerization and cloud deployment guides
Models are hosted on Hugging Face Hub: https://huggingface.co/cpoisson/trash-optimizer-models
For a local or production deployment, use Docker to run both the inference backend and webapp in a single container:
cd deployment
./setup.sh # Initial setup
docker-compose up --build  # Build and run
See deployment/README.md for complete deployment documentation.
- Collection point data is currently specific to Nantes Metropole area
- Model trained primarily on household waste items
This project was developed as an educational project during the Le Wagon bootcamp. We welcome feedback, bug reports, and suggestions via GitHub issues.
- Le Wagon - AI & Data Science Bootcamp (November 2025)
- Dataset Contributors: RealWaste (UCI), Kaggle community datasets
- Nantes Metropole - Open data initiative for collection points
- External Services: Hugging Face Hub, OpenRouteService, Google BigQuery
This project is licensed under the MIT License.
