This project classifies hand gestures following the American Sign Language (ASL) standard using MediaPipe (Python version). The application recognizes hand gestures (sign language) with a simple MLP applied to the detected hand keypoints.
❗️ Modified version of the original repository. ❗️
❗️ English-translated version available at the translated repository. ❗️
This application is deployed locally using Flask.
To run the application:
python app.py
- Insert: Save the predicted letters to form sentences.
- Space: Add a space between words.
- Delete: Remove incorrect inputs.
- Export: Export the saved sentence to speech using a text-to-speech library.
The web application lets users save predictions as text and export them as speech output, improving accessibility and supporting practical, real-time use.
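The controls above can be sketched as a small sentence buffer. The class and method names below are illustrative, not the app's actual API, and the text-to-speech call is shown only as a comment since the README does not name the library used.

```python
class SentenceBuffer:
    """Accumulates predicted letters into a sentence (Insert/Space/Delete/Export)."""

    def __init__(self):
        self.chars = []

    def insert(self, letter):   # "Insert": save the predicted letter
        self.chars.append(letter)

    def space(self):            # "Space": add a word separator
        self.chars.append(" ")

    def delete(self):           # "Delete": remove the last input
        if self.chars:
            self.chars.pop()

    def export(self):           # "Export": hand the sentence to a TTS engine
        sentence = "".join(self.chars)
        # e.g. with pyttsx3 (an assumption; any TTS library would work):
        #   import pyttsx3
        #   engine = pyttsx3.init()
        #   engine.say(sentence)
        #   engine.runAndWait()
        return sentence

buf = SentenceBuffer()
for ch in "HI":
    buf.insert(ch)
buf.space()
buf.insert("X")
buf.delete()          # remove the wrong letter
buf.insert("U")
print(buf.export())   # → HI U
```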
A training script for hand sign recognition.
A training script for finger gesture recognition.
Contains files related to hand sign recognition:
- Training data:
keypoint.csv
- Trained model:
keypoint_classifier.tflite
- Label data:
keypoint_classifier_label.csv
- Inference module:
keypoint_classifier.py
Contains files related to finger movement recognition:
- Training data:
point_history.csv
- Trained model:
point_history_classifier.tflite
- Label data:
point_history_classifier_label.csv
- Inference module:
point_history_classifier.py
A utility module for measuring frames per second (FPS).
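A minimal FPS calculator in the spirit of that utility module (the actual module may differ; this sketch uses `time.perf_counter` and a moving average over the last `buffer_len` frame times):

```python
import time
from collections import deque

class CvFpsCalc:
    """Smoothed FPS estimate: call get() once per processed frame."""

    def __init__(self, buffer_len=10):
        self._start = time.perf_counter()
        self._diffs = deque(maxlen=buffer_len)  # recent frame durations

    def get(self):
        now = time.perf_counter()
        self._diffs.append(now - self._start)   # duration of the last frame
        self._start = now
        avg = sum(self._diffs) / len(self._diffs)
        return round(1.0 / avg, 2) if avg > 0 else 0.0

# Usage inside a capture loop:
#   fps_calc = CvFpsCalc(buffer_len=10)
#   while capturing:
#       fps = fps_calc.get()  # display on the frame
```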
Key point coordinates are derived through a 4-step preprocessing pipeline: landmark detection, conversion to coordinates relative to the wrist, flattening to a one-dimensional list, and normalization by the maximum absolute value.
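The steps above can be sketched as follows, assuming 21 (x, y) hand landmarks in image coordinates as produced by MediaPipe Hands (the function name is illustrative):

```python
import itertools

def pre_process_landmarks(landmarks):
    """Convert raw (x, y) landmarks to a normalized 1-D feature vector."""
    base_x, base_y = landmarks[0]  # landmark 0 is the wrist
    # Step 2: coordinates relative to the wrist
    relative = [(x - base_x, y - base_y) for x, y in landmarks]
    # Step 3: flatten to a 1-D list
    flat = list(itertools.chain.from_iterable(relative))
    # Step 4: normalize to [-1, 1] by the max absolute value
    max_value = max(map(abs, flat)) or 1.0
    return [v / max_value for v in flat]

sample = [(100, 200), (110, 190), (120, 180)]  # toy 3-point example
print(pre_process_landmarks(sample))  # → [0.0, 0.0, 0.5, -0.5, 1.0, -1.0]
```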
The model architecture is defined in [keypoint_classification.ipynb](keypoint_classification.ipynb).
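A small Keras MLP of this kind could look like the sketch below; the layer sizes and dropout rates here are assumptions, so check the notebook for the exact architecture. The input is the 42-value normalized keypoint vector.

```python
import tensorflow as tf

def build_keypoint_classifier(num_classes):
    """Illustrative MLP over the 42-value keypoint feature vector."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(21 * 2,)),          # 21 landmarks × (x, y)
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(20, activation="relu"),
        tf.keras.layers.Dropout(0.4),
        tf.keras.layers.Dense(10, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_keypoint_classifier(num_classes=26)        # e.g. A–Z
model.summary()
```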
Point history coordinates are derived through a similar 4-step preprocessing pipeline applied to the fingertip trajectory, with coordinates taken relative to the base point and normalized by the image dimensions.
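A sketch of that trajectory preprocessing, assuming a fixed-length history of (x, y) pixel coordinates made relative to the oldest point and scaled by the image size (these steps are assumptions; see the notebook for the exact code):

```python
import itertools

def pre_process_point_history(point_history, image_width, image_height):
    """Convert a fingertip trajectory to a normalized 1-D feature vector."""
    base_x, base_y = point_history[0]  # oldest point in the history
    relative = [((x - base_x) / image_width, (y - base_y) / image_height)
                for x, y in point_history]
    return list(itertools.chain.from_iterable(relative))

history = [(320, 240), (330, 250), (340, 260)]  # toy trajectory
features = pre_process_point_history(history, 640, 480)
print(features[:2])  # → [0.0, 0.0]
```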
- Kazuhito Takahashi (Twitter)
- Nikita Kiselov (GitHub)
This project is licensed under the Apache 2.0 License.