This project demonstrates an end-to-end pipeline that detects hand gestures (flute finger positions), maps them to sargam notes (Sa, Re, Ga, Ma, ...), and streams the detected notes through Kafka into a PostgreSQL database.
Camera → Mediapipe (ML Model) → Producer → Kafka → Consumer → PostgreSQL (Data Lake)
- Video Cam: Captures live hand gestures.
- Mediapipe (ML Model): Detects finger up/down positions and converts them into flute notes.
- Producer: Sends notes into Kafka topic.
- Kafka + Zookeeper: Message broker layer.
- Consumer: Listens to Kafka topic and stores the notes into PostgreSQL.
- PostgreSQL: Acts as a storage/data lake for all detected notations.
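The heart of the Mediapipe stage is converting finger up/down states into a note. A minimal sketch of that mapping is below; the specific finger patterns and note assignments are illustrative assumptions, not the project's actual mapping:

```python
# Map a pattern of raised fingers to a sargam note.
# In the real pipeline, Mediapipe hand landmarks are reduced to a tuple of
# finger up/down states; here that tuple is assumed directly.
# The pattern -> note assignments below are placeholders.
NOTE_MAP = {
    (1, 1, 1, 1): "Sa",
    (0, 1, 1, 1): "Re",
    (0, 0, 1, 1): "Ga",
    (0, 0, 0, 1): "Ma",
    (0, 0, 0, 0): "Pa",
}

def fingers_to_note(fingers):
    """Return the note for a tuple of finger states (1 = up, 0 = down),
    or None if the pattern is not a known note."""
    return NOTE_MAP.get(tuple(fingers))
```

For example, `fingers_to_note((0, 1, 1, 1))` returns `"Re"`, and an unrecognized pattern returns `None` so the producer can skip it.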
- Docker & Docker Compose (for Kafka, Zookeeper, PostgreSQL)
- Python 3.8+
- Webcam
git clone https://github.com/PRATHAMdeshkar/FluteNotation
cd HandRec

Install Python dependencies:
pip install -r requirements.txt
$ docker compose -f docker-compose.yml up -d
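For reference, a minimal `docker-compose.yml` for this stack could look like the following sketch; the service names, image tags, ports, and credentials here are assumptions, and the repository's own compose file may differ:

```yaml
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  postgres:
    image: postgres:15
    ports: ["5432:5432"]
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: flute
```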
$ docker ps

Run the consumer:

python consumer.py
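A consumer along these lines might look like the sketch below. It assumes the `kafka-python` and `psycopg2` packages, a topic named `flute-notes`, JSON messages like `{"note": "Sa"}`, and a `notes` table with a single `note` column; the real `consumer.py` may use different names and schema:

```python
import json

def decode_note(raw: bytes) -> str:
    """Decode a Kafka message payload into a note string.
    Messages are assumed to be JSON objects like {"note": "Sa"}."""
    return json.loads(raw.decode("utf-8"))["note"]

def run_consumer():
    # Imports kept local so the pure helper above stays dependency-free.
    from kafka import KafkaConsumer  # pip install kafka-python
    import psycopg2                  # pip install psycopg2-binary

    consumer = KafkaConsumer("flute-notes", bootstrap_servers="localhost:9092")
    conn = psycopg2.connect(dbname="flute", user="postgres",
                            password="postgres", host="localhost")
    with conn, conn.cursor() as cur:
        for msg in consumer:
            # Store each detected note; the table schema is an assumption.
            cur.execute("INSERT INTO notes (note) VALUES (%s)",
                        (decode_note(msg.value),))
            conn.commit()

if __name__ == "__main__":
    run_consumer()
```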
python handGesture.py
Opens webcam. Detects hand gestures → maps to notes → pushes to Kafka.
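The producer side of that step can be sketched as follows; the topic name `flute-notes`, the JSON payload shape, and the broker address are assumptions about what `handGesture.py` does, not confirmed details:

```python
import json

def encode_note(note: str) -> bytes:
    """Serialize a detected note as a JSON payload for Kafka."""
    return json.dumps({"note": note}).encode("utf-8")

def publish_note(producer, note: str, topic: str = "flute-notes"):
    """Send one detected note to Kafka; `producer` is a KafkaProducer."""
    producer.send(topic, encode_note(note))

if __name__ == "__main__":
    # Requires a running Kafka broker (see docker compose above).
    from kafka import KafkaProducer  # pip install kafka-python
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    publish_note(producer, "Sa")
    producer.flush()  # ensure the message is actually sent before exiting
```

In the real script, `publish_note` would be called once per gesture detected in the webcam loop.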
Enter Postgres container:
$ docker exec -it <postgres-container-id> psql -U postgres -d flute

(substitute the Postgres container ID shown by `docker ps`)

Inside psql, list the stored notes:

flute=# SELECT * FROM notes;