An intelligent waste classification and sorting system using machine learning, computer vision, and robotic automation. This project automatically identifies different types of waste materials and sorts them into appropriate bins using stepper motors and servo mechanisms.
This system uses AI-powered image classification to identify waste materials (paper, plastic, metal, glass, cardboard, biodegradable) and automatically sorts them into different compartments using mechanical actuators. The project is designed to run on a Raspberry Pi with various sensors and motors.
- Raspberry Pi (Main controller)
- Camera Module (Image capture for classification)
- Touch Sensor (User interaction trigger)
- Ultrasonic Sensor (Object detection)
- Stepper Motors (Rotation mechanism for sorting)
- Servo Motors (Gate/door control)
- LCD Display (Status and feedback display)
- LEDs (Status indicators)
- TensorFlow Lite Model (best_float32.tflite) - Optimized for edge deployment
- YOLO Model (best.pt) - Object detection and classification
- Quantized Model (quantized_model.tflite) - Further optimized for performance
ML Training Repository: For model training and development details, see Farz ML Repository
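To sanity-check the trained weights outside the full pipeline, the YOLO model can be loaded directly with the ultralytics API. The snippet below is only a minimal sketch (the image path is illustrative); yolocheck.py is the project's own test script.

# Minimal YOLO check (illustrative; yolocheck.py is the project's own test script)
from ultralytics import YOLO

model = YOLO("best.pt")                                  # trained detection weights
results = model.predict("sample_waste.jpg", conf=0.3)    # image path is illustrative
for box in results[0].boxes:
    cls_id = int(box.cls[0])
    print(results[0].names[cls_id], float(box.conf[0]))  # predicted class and confidence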
- Paper (Quadrant 1)
- Metal (Quadrant 2)
- Plastic (Quadrant 3)
- Glass
- Cardboard
- Biodegradable
- General/Other (Quadrant 4)
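Sorting reduces to a lookup from predicted class to bin quadrant. The dictionary below is only a sketch of that idea: the class names and the grouping of glass, cardboard, and biodegradable into existing quadrants are assumptions, not values taken from the prediction code.

# Illustrative class-to-quadrant lookup (actual mapping lives in the prediction functions)
quadrant_map = {
    "paper": 1,
    "metal": 2,
    "plastic": 3,
    "cardboard": 1,       # assumption: routed with paper
    "glass": 4,           # assumption: routed with general/other
    "biodegradable": 4,   # assumption: routed with general/other
}

def get_quadrant(label):
    # Unknown classes fall back to the general/other bin
    return quadrant_map.get(label.lower(), 4)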
- iquit.py/iquit2.py - Main application with touch sensor integration
- mlbins_lcd.py - Version with LCD display support
- mlbins2.py - Alternative main application
- liteAi.py - Standalone TensorFlow Lite inference
- stepper.py/stepper2.py/stepperNew.py - Stepper motor control
- stepperAI.py/stepperAI2.py - AI-integrated stepper control
- motor.py - General motor control functions
- sensor.py - Sensor input handling
- ultrasonic.py - Ultrasonic sensor for object detection
- button.py - Button/touch interface
- demo_lcd.py - LCD display demonstration
- yolocheck.py - YOLO model testing
- scripts.py - General utility scripts
- check.py/checker2.py/Checker3.py - System testing utilities
# Install Python dependencies
pip install -r requirements.txt # Note: Create from farzpackagesvenv.txt
# Required Python packages include:
# - opencv-python
# - tensorflow
# - numpy
# - RPi.GPIO
# - gpiozero
# - pillow
# - ultralytics
- Connect camera module to Raspberry Pi
- Wire stepper motors to GPIO pins (17, 18, 27, 22)
- Connect servo motors to designated GPIO pins
- Set up touch sensor on GPIO pin 12
- Connect ultrasonic sensor (Trigger: GPIO 19, Echo: GPIO 4)
- Wire LCD display (if using LCD version)
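In code, that wiring corresponds to a GPIO setup along the following lines (BCM numbering assumed; the actual setup calls live in stepper.py, sensor.py, and ultrasonic.py):

# GPIO setup sketch (BCM numbering assumed; see stepper.py, sensor.py, ultrasonic.py)
import RPi.GPIO as GPIO

STEPPER_PINS = [17, 18, 27, 22]   # stepper coil outputs
TOUCH_PIN = 12                    # touch sensor input
ULTRASONIC_TRIG = 19              # ultrasonic trigger output
ULTRASONIC_ECHO = 4               # ultrasonic echo input

GPIO.setmode(GPIO.BCM)
for pin in STEPPER_PINS:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(TOUCH_PIN, GPIO.IN)
GPIO.setup(ULTRASONIC_TRIG, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(ULTRASONIC_ECHO, GPIO.IN)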
# Clone the repository
git clone https://github.com/imrun10/farz.git
cd farz
# Set up virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate # On Linux/Mac
# or
venv\Scripts\activate # On Windows
# Install dependencies
pip install -r requirements.txt
# Run the main application with touch sensor
python iquit2.py
# Run with LCD display
python mlbins_lcd.py
# Test individual components
python yolocheck.py # Test YOLO detection
python demo_lcd.py # Test LCD display
python ultrasonic.py # Test ultrasonic sensor
- Detection: Touch sensor or ultrasonic sensor detects object presence
- Capture: Camera captures image of the waste item
- Classification: AI model processes image and identifies waste type
- Sorting: System calculates appropriate bin quadrant
- Mechanical Action:
- Stepper motor rotates to correct position
- Servo motor opens gate/door
- Item is sorted into appropriate bin
- System returns to ready state
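Put together, the control flow looks roughly like the loop below. The function names are placeholders for logic spread across iquit2.py and the stepper/servo scripts, not the repository's actual API.

# Illustrative control loop (function names are placeholders, not the real API)
import time

def run_sorting_loop():
    while True:
        wait_for_object()               # block on the touch / ultrasonic trigger
        image = capture_image()         # grab a frame from the camera module
        label = classify_waste(image)   # run the TFLite / YOLO model on the frame
        quadrant = get_quadrant(label)  # map waste category to bin quadrant
        rotate_to_quadrant(quadrant)    # stepper motor turns to the target position
        open_gate()                     # servo opens the drop gate
        time.sleep(1)                   # give the item time to fall through
        close_gate()
        rotate_home()                   # return to the ready state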
Each waste category has specific rotation durations:
# Duration values for initial rotation (seconds)
duration_values = {
1: 0.043, # Paper
2: 0.0650, # Metal
3: 0.08, # Plastic
4: 0.0098 # General
}
# Final positioning durations
final_duration_values = {
1: 0.059, # Paper
2: 0.03, # Metal
3: 0.018, # Plastic
4: 0.088 # General
}
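One way these durations can drive the motor is to cycle the coil sequence continuously until the requested number of seconds has elapsed. The sketch below shows that idea only; the half-step sequence and step delay are assumptions rather than values taken from stepper.py.

# Timed rotation sketch (half-step sequence and step delay are assumptions)
import time
import RPi.GPIO as GPIO

PINS = [17, 18, 27, 22]
HALF_STEP_SEQ = [
    [1, 0, 0, 0], [1, 1, 0, 0], [0, 1, 0, 0], [0, 1, 1, 0],
    [0, 0, 1, 0], [0, 0, 1, 1], [0, 0, 0, 1], [1, 0, 0, 1],
]

def rotate_for(duration, step_delay=0.001):
    # Step through the coil sequence until `duration` seconds have elapsed
    end_time = time.time() + duration
    while time.time() < end_time:
        for pattern in HALF_STEP_SEQ:
            for pin, value in zip(PINS, pattern):
                GPIO.output(pin, value)
            time.sleep(step_delay)

# e.g. rotate_for(duration_values[2]) to move toward the metal quadrant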
# Stepper Motor Pins
out1 = 17
out2 = 18
out3 = 27
out4 = 22
# Servo Control
servo_pin = 12
# Sensors
touch_pin = 12
ultrasonic_trigger = 19
ultrasonic_echo = 4
- Input Size: 640x640 pixels
- Confidence Threshold: 0.3
- Supported Formats: JPG, PNG
- Inference Time: ~0.2-0.5 seconds on Raspberry Pi
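A standalone inference pass over best_float32.tflite follows the standard TFLite interpreter pattern. The sketch below assumes a single float32 input resized to 640x640 and leaves the YOLO-style output decoding as a placeholder, since that depends on how the model was exported; liteAi.py contains the project's actual post-processing.

# TFLite inference sketch (output decoding is model-specific; see liteAi.py)
import cv2
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="best_float32.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

image = cv2.imread("sample_waste.jpg")                  # illustrative test image
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
image = cv2.resize(image, (640, 640))
input_data = np.expand_dims(image, 0).astype(np.float32) / 255.0

interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()
raw_output = interpreter.get_tensor(output_details[0]["index"])
# raw_output still needs YOLO-style decoding and the 0.3 confidence filter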
# Test stepper motor
python stepper.py
# Test camera and AI inference
python liteAi.py
# Check system sensors
python check.py
- Update the classes dictionary in prediction functions
- Add corresponding quadrant mappings
- Calibrate motor rotation values
- Update LCD display messages if applicable
- GPIO already in use: Run GPIO.cleanup() or restart the Pi
- Camera not found: Check camera module connection and enable the camera in raspi-config
- Import errors: Ensure all dependencies are installed correctly
- Motor not moving: Verify GPIO pin connections and power supply
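For the GPIO already in use case, releasing the pins on exit avoids having to restart the Pi. A common pattern (not specific to this repository) is to wrap the script's entry point in try/finally:

# Release GPIO pins on exit so the next run starts cleanly
import RPi.GPIO as GPIO

try:
    main()              # placeholder for the script's entry point
finally:
    GPIO.cleanup()      # resets every channel configured by this process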
Most scripts include debug functions and verbose output for troubleshooting.
We welcome contributions! See the CONTRIBUTING file for details.
This project is licensed under the MIT License. See the LICENSE file for details.
- TensorFlow team for TensorFlow Lite
- Ultralytics for YOLO implementation
- Raspberry Pi Foundation
- Contributors to the RPi.GPIO and gpiozero libraries
For issues and questions:
- Check the troubleshooting section
- Review hardware connections
- Test individual components
- Create an issue in the repository
Note: This system requires proper hardware setup and calibration. Always test individual components before running the full system.