A cost-effective visual assistance solution for visually impaired individuals
Himba Vision is an Android-based assistive application designed to enhance contextual awareness for visually impaired individuals, enabling independent navigation. It leverages computer vision technologies to identify objects, read text, estimate distances, and provide spatial awareness—all in real time.
✅ Offline Object Detection using NanoDet
✅ Online Panoptic Segmentation with Detectron2’s Panoptic FPN
✅ Text Recognition & Speech Interaction 🗣️
✅ 3x3 Grid Localization for structured spatial awareness 📌
✅ Monocular Distance Estimation for object proximity detection 📏
✅ Google Maps Integration for geographical context 🗺️
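The grid localization and distance estimation features above can be sketched roughly as follows. The function names, the 3×3 cell labels, and the focal-length value are illustrative assumptions, not the app's actual code:

```python
# Sketch of two spatial-awareness features: 3x3 grid localization and
# pin-hole monocular distance estimation. All names and constants here
# are assumptions for illustration only.

def grid_cell(cx: float, cy: float, frame_w: int, frame_h: int) -> str:
    """Map a detection's center point to one of nine named grid cells."""
    cols = ["left", "center", "right"]
    rows = ["top", "middle", "bottom"]
    col = min(int(cx / frame_w * 3), 2)   # clamp so the right edge stays in-grid
    row = min(int(cy / frame_h * 3), 2)   # clamp so the bottom edge stays in-grid
    return f"{rows[row]} {cols[col]}"

def estimate_distance_m(known_width_m: float, focal_px: float,
                        box_width_px: float) -> float:
    """Pin-hole model: distance = (real width * focal length) / apparent width."""
    return known_width_m * focal_px / box_width_px

# Example: an object ~0.5 m wide whose bounding box is 100 px wide, with an
# assumed focal length of 800 px, detected in a 1080x1920 portrait frame.
print(grid_cell(540, 300, 1080, 1920))     # -> top center
print(estimate_distance_m(0.5, 800, 100))  # -> 4.0 (metres)
```

In practice the focal length in pixels would come from camera calibration rather than a hard-coded constant.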
1️⃣ Load the project in Android Studio.
2️⃣ Connect an actual Android device for deployment (❌ No emulators).
3️⃣ Run the application.
- If you encounter issues running the app, locate the APK file in the `release` directory and install it manually.
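The manual install can also be done from the command line with `adb` (part of Android platform-tools). The APK filename below is an assumption, so check the `release` directory for the actual name:

```shell
# Assumed APK filename; adjust to whatever is in the release directory.
APK="release/HimbaVision.apk"

if command -v adb >/dev/null 2>&1; then
    # -r reinstalls the app, keeping its data if a version is already present
    adb install -r "$APK"
else
    echo "adb not found; install Android platform-tools first."
fi
```

USB debugging must be enabled on the device for `adb install` to succeed.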
To enable panoptic segmentation, follow these steps:
🔹 Create a premium Ngrok account.
🔹 Open the Jupyter Notebook that contains the server-side logic.
🔹 Replace the existing Ngrok key with your premium Ngrok key.
🔹 This exposes the Kaggle/Colab notebook's hardware resources to the app for processing.
🔹 Ensure you have the necessary dependencies installed in your environment.
🔹 A stable internet connection is required for seamless communication between the app and cloud resources for panoptic segmentation.
🔹 If all else fails, contact me at success_fred@yahoo.com to test panoptic segmentation.
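As a rough illustration of the app-to-notebook exchange, a captured frame could be packaged into a JSON request body as below. The field name and base64 encoding are assumptions, since the actual protocol lives in the server-side notebook:

```python
# Sketch of serializing a camera frame for the segmentation endpoint
# exposed through ngrok. Field names are assumptions for illustration.
import base64
import json

def build_request(jpeg_bytes: bytes) -> bytes:
    """Client side: package a captured frame as a JSON body."""
    payload = {"image": base64.b64encode(jpeg_bytes).decode("ascii")}
    return json.dumps(payload).encode("utf-8")

def decode_request(body: bytes) -> bytes:
    """Server side: recover the raw JPEG bytes from the JSON body."""
    payload = json.loads(body.decode("utf-8"))
    return base64.b64decode(payload["image"])

# Round-trip check with placeholder bytes standing in for a real JPEG.
frame = b"\xff\xd8\xff\xe0 fake jpeg bytes"
assert decode_request(build_request(frame)) == frame
```

On the Android side, a body like this would typically be POSTed to the ngrok URL with `HttpURLConnection` or OkHttp; the Detectron2 Panoptic FPN inference then runs on the notebook's GPU and returns the segmentation result.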
🔹 Shadow Detection: Objects' shadows may sometimes be detected as actual objects.
🔹 Performance Variability: Latency and local processing speed depend on your device's hardware.
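One common mitigation for shadow false positives (an illustration, not necessarily what the app implements) is to drop low-confidence detections before announcing them, since shadows tend to score lower than real objects:

```python
# Hypothetical post-processing step: filter detections by confidence score.
# The threshold value and detection dictionary format are assumptions.

def filter_detections(detections, min_score=0.45):
    """Keep only detections whose confidence meets the threshold."""
    return [d for d in detections if d["score"] >= min_score]

detections = [
    {"label": "person", "score": 0.91},
    {"label": "person", "score": 0.30},  # likely a shadow misdetection
]
print(filter_detections(detections))  # only the 0.91 detection remains
```

The threshold trades missed objects against false alarms, so it would need tuning against real footage.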
