Kontaka is a mobile-based assistive application that recognizes Bangladeshi banknotes using deep learning and announces the denomination through voice output.
The system is designed to help visually impaired individuals independently identify paper currency using a smartphone camera.
- Real-time banknote recognition using device camera
- Lightweight MobileNetV2 deep learning model
- Offline inference using TensorFlow Lite
- Bangla voice feedback for visually impaired users
- No login or internet connection required
The system works through the following pipeline:
Camera Input → Image Preprocessing → Deep Learning Model (TFLite) → Prediction → Bangla Voice Output
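The preprocessing and prediction stages of this pipeline can be sketched as follows. This is a minimal, numpy-only illustration, assuming the model takes 224×224 RGB input scaled to [-1, 1] (MobileNetV2's usual convention); the `LABELS` list and `predict_stub` function are hypothetical stand-ins for the on-device TFLite interpreter, not the app's actual code.

```python
import numpy as np

# Hypothetical label list; the real app maps 15 model classes to 9 denominations.
LABELS = ["2", "5", "10", "20", "50", "100", "200", "500", "1000"]

def preprocess(frame: np.ndarray, size: int = 224) -> np.ndarray:
    """Resize a camera frame (H, W, 3, uint8) with nearest-neighbour
    sampling and scale pixels to [-1, 1], as MobileNetV2 expects."""
    h, w, _ = frame.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = frame[rows][:, cols]
    scaled = resized.astype(np.float32) / 127.5 - 1.0
    return scaled[np.newaxis, ...]  # add batch dimension: (1, 224, 224, 3)

def predict_stub(batch: np.ndarray) -> str:
    """Stand-in for TFLite inference: on-device this would copy the batch
    into the input tensor and call interpreter.invoke()."""
    logits = np.random.default_rng(0).normal(size=len(LABELS))
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over classes
    return LABELS[int(np.argmax(probs))]

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # dummy camera frame
batch = preprocess(frame)
print(batch.shape)  # (1, 224, 224, 3)
print(predict_stub(batch))
```

The predicted label would then be handed to the Bangla text-to-speech stage.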
A custom dataset was collected for training the model.
- Total images: 17,836
- Currency classes: 15
- Denominations covered: 9 Bangladeshi banknotes
The dataset contains images captured under a variety of:
- Lighting conditions
- Orientations
- Backgrounds
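Variation along these axes is what the model must be invariant to, and the same kinds of variation are often synthesized during training as augmentation. The sketch below is illustrative only (not the project's actual pipeline): it applies random brightness/contrast jitter for lighting and random rotations/flips for orientation to a uint8 RGB image.

```python
import numpy as np

rng = np.random.default_rng(42)

def augment(img: np.ndarray) -> np.ndarray:
    """Apply random lighting and orientation changes to a square
    uint8 RGB image, mimicking the variation in the dataset."""
    # Lighting: random brightness/contrast jitter, clipped to valid range.
    gain = rng.uniform(0.7, 1.3)
    bias = rng.uniform(-20.0, 20.0)
    out = np.clip(img.astype(np.float32) * gain + bias, 0, 255)
    # Orientation: random 90-degree rotation and optional horizontal flip.
    out = np.rot90(out, k=int(rng.integers(0, 4)))
    if rng.random() < 0.5:
        out = out[:, ::-1]
    return out.astype(np.uint8)

img = rng.integers(0, 256, size=(224, 224, 3), dtype=np.uint8)
aug = augment(img)
print(aug.shape, aug.dtype)  # (224, 224, 3) uint8
```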
The recognition model is based on MobileNetV2 with transfer learning.
Key characteristics:
- Lightweight architecture optimized for mobile devices
- High accuracy with low computational cost
- Converted to TensorFlow Lite for mobile deployment
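The core idea of transfer learning here is to keep the pretrained MobileNetV2 backbone frozen and train only a small classification head. The sketch below shows that idea in pure numpy with a random projection standing in for the backbone and toy data in place of the banknote images; it is a conceptual illustration, not the project's actual TensorFlow training code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "backbone": a fixed random projection standing in for the
# pretrained MobileNetV2 feature extractor (1280-d features in the real model).
W_frozen = rng.normal(size=(64, 16))

def features(x):
    return np.maximum(x @ W_frozen, 0.0)  # frozen weights, ReLU

# Trainable classification head: the only part updated during fine-tuning.
n_classes = 15
W_head = np.zeros((16, n_classes))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy training data: 150 random samples across 15 classes.
X = rng.normal(size=(150, 64))
y = rng.integers(0, n_classes, size=150)
onehot = np.eye(n_classes)[y]

# Gradient descent on cross-entropy, updating only the head weights.
F = features(X)
for _ in range(200):
    probs = softmax(F @ W_head)
    grad = F.T @ (probs - onehot) / len(X)
    W_head -= 0.5 * grad

acc = (softmax(F @ W_head).argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Freezing the backbone keeps the trainable parameter count small, which is what makes fine-tuning cheap enough for a dataset of this size.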
Performance:
- Validation Accuracy: 99.89%
- Average processing time: ~627 ms per frame (≈1.6 frames per second)
The trained model is integrated into a Flutter Android application.
Workflow:
- User opens the application
- Camera captures the banknote image
- Image is processed and passed to the TFLite model
- Predicted denomination is announced through Bangla voice output
You can download and test the application from the Releases section.
APK Download:
https://github.com/omarfaruk-k/kontaka/releases/tag/v1.0
This project is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
You are free to:
- Use
- Share
- Modify
Under the following conditions:
- Proper attribution must be given.
- Commercial use is not permitted without explicit permission from the author.