KrisCowie/Sign-Language-Translation
Sign-Language-Translation

Convolutional Neural Networks for real-time translation of American Sign Language, in Python

  • The first script constructs the CNN model from the Sign Language MNIST dataset
  • The second script starts your webcam, takes in each frame as an individual input, preprocesses it, and runs the model against it to predict the letter you are signing
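The second script's per-frame preprocessing can be sketched roughly like this. This is a minimal stand-in, assuming the model expects 28x28 normalized grayscale input (the Sign Language MNIST format); the function name and the block-averaging downsample (in place of a library resize such as OpenCV's) are illustrative, not the repository's actual code.

```python
import numpy as np

def preprocess_frame(frame_bgr: np.ndarray) -> np.ndarray:
    """Turn a webcam frame (H x W x 3, uint8, BGR) into a 28x28
    normalized grayscale array shaped for a Sign Language MNIST model."""
    # Grayscale via the usual luminance weights (note OpenCV's BGR order).
    gray = (0.114 * frame_bgr[..., 0]
            + 0.587 * frame_bgr[..., 1]
            + 0.299 * frame_bgr[..., 2])

    # Center-crop to a square so the hand region isn't stretched.
    h, w = gray.shape
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    square = gray[top:top + side, left:left + side]

    # Downsample to 28x28 by block averaging (a stand-in for a real resize).
    block = side // 28
    square = square[:block * 28, :block * 28]
    small = square.reshape(28, block, 28, block).mean(axis=(1, 3))

    # Scale to [0, 1] and add batch/channel axes: (1, 28, 28, 1).
    return (small / 255.0).reshape(1, 28, 28, 1).astype(np.float32)

# Example with a dummy frame standing in for a real webcam capture:
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
x = preprocess_frame(frame)
print(x.shape)  # (1, 28, 28, 1)
```

In the real pipeline, `x` would then be passed to the trained model's predict call once per captured frame.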

This was a quick, 24-hour project. We had been working on NLP for the week, and the project prompt was to come up with a translator from, say, French to English.

That sounded way too easy, so I thought about what else we could translate - Sign Language!

Dataset Available here: https://www.kaggle.com/datamunge/sign-language-mnist

So here's a quick proof of concept to translate sign language in real time using your webcam and a CNN
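One detail of this dataset worth noting: its labels map one-to-one onto A-Z, but J and Z never occur, because signing those letters involves motion that a single still frame can't capture. Mapping a predicted class index back to a letter could therefore look like this (a hypothetical helper, not the repository's code):

```python
def index_to_letter(idx: int) -> str:
    """Map a predicted Sign Language MNIST class index to its letter.

    Labels correspond one-to-one with A-Z, but 9 (J) and 25 (Z) never
    occur in the dataset, since those signs require motion.
    """
    if idx in (9, 25) or not 0 <= idx <= 25:
        raise ValueError(f"no static-letter class for index {idx}")
    return chr(ord("A") + idx)

print(index_to_letter(0))   # A
print(index_to_letter(24))  # Y
```

The webcam script would overlay the decoded letter on the live video feed after each prediction.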

Next steps - using Labelbox's hand keyframing to both label and box more videos to expand the dataset, as well as recording which signs appear in each video.

You could potentially use something similar to "translate" body language and human emotion, which could be useful for people on the autism spectrum.

I managed to label four different videos and write the code within the 24 hours. If you'd like access to the datasets the labeling created, let me know!
