πŸ–οΈ Gesture Vocalizer – Bridging Communication Through Technology

Arduino Python TensorFlow License: MIT

📘 Overview

The Gesture Vocalizer project empowers individuals with speech and hearing impairments by translating hand gestures into speech or text output.
By combining flex sensors, an accelerometer, and a deep learning model, this wearable system bridges the communication gap, enabling real-time gesture-to-voice translation.

Fig 1. Glove hardware

🧠 Proposed Design

The design phase focuses on translating the identified problem into a functional, wearable prototype that captures, processes, and vocalizes gestures.

Fig 2. Proposed Flow

🧩 Hardware Design

1. Arduino Mega Microcontroller

  • Acts as the computational hub with multiple I/O pins.
  • Handles real-time sensor integration and data transmission.

2. Flex Sensors

  • Five sensors attached to a glove detect finger bending through resistance changes (see the conversion sketch after this list).
  • Provide continuous analog data reflecting hand posture.

3. MPU-6050 Accelerometer + Gyroscope

  • Measures acceleration and angular velocity to capture spatial orientation.
  • Enhances motion precision and dynamic gesture recognition.
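The README doesn't show the analog front end, but flex sensors are typically read through a voltage divider against a fixed resistor. As a worked example under that assumption (the 47 kΩ fixed resistor and 10-bit ADC full scale are illustrative values, not the project's documented circuit), the bend-dependent resistance behind each analog reading can be recovered like this:

```python
def flex_resistance(adc_count, r_fixed=47_000, full_scale=1023):
    """Recover a flex sensor's resistance from a 10-bit ADC reading,
    assuming the sensor is the top leg of a divider over r_fixed.
    Vout = Vcc * r_fixed / (r_flex + r_fixed)
      =>  r_flex = r_fixed * (full_scale / adc_count - 1)
    """
    return r_fixed * (full_scale / adc_count - 1)

print(round(flex_resistance(612)))  # ~31.6 kOhm for a mid-range reading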

βš™οΈ Hardware Design Workflow

  1. Sensor Integration: Connect flex sensors to Arduino analog pins.
  2. MPU-6050 Integration: Interface via I2C to capture acceleration and rotation.
  3. Power Supply: Powered through USB to ensure stable operation.
  4. Encapsulation: Mounted on a glove to maintain sensor stability and comfort.
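On the PC side, the tab-separated stream the firmware prints (see the sketch under Implementation) can be captured with pyserial for logging or live inference. A minimal sketch: the port name is machine-specific and the 11-values-per-line format follows the firmware below, so treat both as assumptions.

```python
# Capture one tab-separated sample per line from the Arduino.
# Port names vary by machine ("COM3" on Windows, "/dev/ttyUSB0" on Linux);
# 9600 baud matches the firmware's Serial.begin(9600).
import serial  # pip install pyserial

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        try:
            values = [int(v) for v in line.split("\t")]
        except ValueError:
            continue  # skip partial or garbled lines
        if len(values) == 11:  # 5 flex readings + 6 MPU-6050 axes
            print(values)
```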

💻 Software Design

1. Neural Network

  • Implements a Bi-directional LSTM for gesture recognition.
  • Processes sequences of sensor readings to identify dynamic and static gestures.
  • Trained using a custom dataset for enhanced accuracy (see the training sketch after this list).

2. Custom Dataset

  • Captures 3-second intervals of sensor data per gesture.
  • Combines flex sensor and MPU-6050 outputs into a single input vector.
  • Stored in CSV format for efficient model training and testing.

3. User Interface

  • A simple Python-based UI displays recognized gestures in real time.
  • Enables recording of new gesture data and live model inference.
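Since the repository's training code isn't reproduced here, the following is a minimal Python/TensorFlow sketch of what items 1 and 2 describe: load the CSV recordings, group them into 3-second windows, and fit a Bi-directional LSTM. The column names (`label`, `sample_id`), the file name, the window length (30 samples at the firmware's ~10 Hz sample rate), and the layer sizes are all assumptions, not the repo's actual configuration.

```python
# Minimal Bi-LSTM training sketch. Assumed CSV layout: one row per time
# step with columns flex1..flex5, ax, ay, az, gx, gy, gz, plus a gesture
# "label" and a per-recording "sample_id" (both column names assumed).
import numpy as np
import pandas as pd
from tensorflow.keras import layers, models

WINDOW = 30    # 3-second window at the firmware's ~10 Hz sample rate
CHANNELS = 11  # 5 flex sensors + 6 MPU-6050 axes

def load_windows(csv_path):
    """Group rows into one fixed-length window per recorded gesture."""
    df = pd.read_csv(csv_path)
    classes = sorted(df["label"].unique())
    X, y = [], []
    for _, rec in df.groupby("sample_id"):
        window = rec.drop(columns=["label", "sample_id"]).to_numpy()[:WINDOW]
        if len(window) == WINDOW:
            X.append(window)
            y.append(classes.index(rec["label"].iloc[0]))
    return np.array(X, dtype="float32"), np.array(y), classes

X, y, classes = load_windows("gestures.csv")  # hypothetical file name

model = models.Sequential([
    layers.Input(shape=(WINDOW, CHANNELS)),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(64, activation="relu"),
    layers.Dense(len(classes), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=30, validation_split=0.2)
```

The bidirectional layers let the network see each window both forwards and backwards, which helps when a gesture's distinguishing motion happens late in the window.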

Figs 3–5. User interface screens

βš™οΈ Implementation

πŸ”§ Algorithm Overview

Initialization

  • Define analog pins for flex sensors (A0–A4).
  • Initialize MPU6050 and set up I2C communication.

Setup Function

  • Begin serial communication and initialize all sensors.
  • Configure pin modes for flex sensors.

Loop Function

  • Read analog flex sensor data.
  • Capture accelerometer and gyroscope readings.
  • Print all data to the serial monitor for real-time observation.
```cpp
#include <Wire.h>
#include <MPU6050.h>

MPU6050 mpu;
int flexPins[5] = {A0, A1, A2, A3, A4};

void setup() {
  Serial.begin(9600);   // serial link to the host PC
  Wire.begin();         // I2C bus for the MPU-6050
  mpu.initialize();
  for (int i = 0; i < 5; i++) pinMode(flexPins[i], INPUT);
}

void loop() {
  // Five flex-sensor readings, tab-separated
  for (int i = 0; i < 5; i++) {
    Serial.print(analogRead(flexPins[i]));
    Serial.print("\t");
  }
  // Six MPU-6050 axes: acceleration, then angular velocity
  int16_t ax, ay, az, gx, gy, gz;
  mpu.getMotion6(&ax, &ay, &az, &gx, &gy, &gz);
  Serial.print(ax); Serial.print("\t");
  Serial.print(ay); Serial.print("\t");
  Serial.print(az); Serial.print("\t");
  Serial.print(gx); Serial.print("\t");
  Serial.print(gy); Serial.print("\t");
  Serial.print(gz);
  Serial.println();
  delay(100);           // ~10 samples per second
}
```

🧾 Results and Readings

| Gesture | Flex1 | Flex2 | Flex3 | Flex4 | Flex5 | Ax | Ay | Az | Predicted Output |
|---------|-------|-------|-------|-------|-------|----|----|----|------------------|
| Hello | 612 | 589 | 578 | 560 | 533 | 112 | 85 | 75 | Hello |
| Thank You | 604 | 591 | 576 | 569 | 540 | 120 | 90 | 80 | Thank You |
| Help | 620 | 580 | 570 | 562 | 530 | 118 | 86 | 78 | Help |
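For context, the Predicted Output column comes from running a window of readings through the trained network. Reusing the illustrative `model`, `classes`, `WINDOW`, and `CHANNELS` names from the training sketch above (again, not the repo's actual code):

```python
import numpy as np

# window: one (WINDOW, CHANNELS) array of flex + MPU-6050 readings;
# a zero array stands in for a real recording here.
window = np.zeros((WINDOW, CHANNELS), dtype="float32")
probs = model.predict(window[np.newaxis, ...])  # shape (1, num_classes)
print(classes[int(np.argmax(probs))])           # e.g. "Hello"
```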

Model Performance Comparison Across Algorithms


πŸ› οΈ Tech Stack

| Category | Technologies |
|----------|--------------|
| Hardware | Arduino Mega, Flex Sensors, MPU-6050 |
| Programming | Python, C++ |
| Machine Learning | TensorFlow, Bi-LSTM |
| Data Handling | Pandas, NumPy |
| UI / Visualization | HTML/CSS |

⚡ Installation & Setup

1. Clone the Repository

```bash
git clone https://github.com/mahekupadhye31/Deep-learning-enabled-smart-glove.git
cd Deep-learning-enabled-smart-glove
```

2. Install Python Dependencies

```bash
pip install -r requirements.txt
```

3. Upload Arduino Script

Use the Arduino IDE to upload the .ino file to your Arduino Mega board.

4. Run the Application

```bash
python app.py
```

5. Interact

Perform gestures while wearing the glove; recognized gestures will appear in the UI and be vocalized.
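The README doesn't name the speech engine behind the vocalization. As one illustrative offline option, pyttsx3 can speak the predicted label:

```python
# Speak a recognized label offline. pyttsx3 is an assumed stand-in here,
# not necessarily the engine app.py actually uses.
import pyttsx3  # pip install pyttsx3

engine = pyttsx3.init()
engine.say("Hello")   # e.g., the gesture predicted by the Bi-LSTM
engine.runAndWait()
```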

📊 Performance Summary

| Metric | Value |
|--------|-------|
| Model accuracy | 84.59% |
| Detection latency | < 0.5 seconds |
| Dataset scale | 3-second windows × 5 sensors × N gestures |

πŸ† Achievements

  • Our research was published in the Journal of Electrical Systems, Vol. 20, No. 10s (2024).
  • We obtained a copyright for our novel dataset.

🤝 Contributors
