Sign Language Translator for Alphabetical Characters 1.0


Hand-Gesture-Reader

What does this code do?

  • Reads static (non-moving) hand gestures from Sign Language, such as alphabetical letters, using a pre-trained model.
    (All letters are covered except Z and J, since those require motion.)
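Since Z and J are excluded, the static model distinguishes 24 letters. A quick sketch of that label set (the actual class ordering lives inside the trained model, so this list is illustrative only):

```python
import string

# Hypothetical label set: every uppercase letter except J and Z,
# which require motion and are not covered by the static model.
STATIC_LETTERS = [c for c in string.ascii_uppercase if c not in ("J", "Z")]

print(len(STATIC_LETTERS))  # 24 static letters
```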


How does it work?

  • Fairly simple: all input data was already preprocessed on my machine, so there is nothing extra to set up.
  • Just show your hand to the camera. (Remember: one hand only.)
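Under the hood, pipelines like this typically use MediaPipe to detect 21 hand landmarks per frame and flatten their (x, y) coordinates into a feature vector for the classifier. The helper below is a sketch of that feature step, not the repo's actual code; the shift-to-origin normalization is a common convention I am assuming here:

```python
def landmarks_to_features(landmarks):
    """Flatten 21 (x, y) hand landmarks into a 42-value feature vector,
    shifted so the minimum x and y become 0 (a common way to make the
    features translation-invariant)."""
    xs = [x for x, _ in landmarks]
    ys = [y for _, y in landmarks]
    min_x, min_y = min(xs), min(ys)
    features = []
    for x, y in landmarks:
        features.append(x - min_x)
        features.append(y - min_y)
    return features

# Example with 21 dummy landmarks in normalized image coordinates.
pts = [(0.1 * i, 0.05 * i) for i in range(21)]
vec = landmarks_to_features(pts)
print(len(vec))  # 42 features
```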

How to run the code:

Install the libraries and dependencies:

pip3 install opencv-python tensorflow mediapipe numpy
  • The model has already been trained and the images converted into a readable form in model.p, so you do not have to train it yourself.
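The .p extension suggests model.p is a Python pickle. Loading it presumably looks like `pickle.load(open("model.p", "rb"))`; whether the classifier sits under a "model" key is an assumption, so inspect the file to confirm. A self-contained round-trip with a stand-in object:

```python
import pickle

# Stand-in for the kind of object a trained classifier might be
# stored as in model.p (the "model" key is a guess, not confirmed).
payload = {"model": {"classes": ["A", "B", "C"]}}
blob = pickle.dumps(payload)

# Loading works the same way whether the bytes come from memory
# or from open("model.p", "rb").
loaded = pickle.loads(blob)
model = loaded["model"]
print(model["classes"])
```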

Open a terminal (zsh, bash, or Command Prompt) and run:

python3 inference.py


A window like the one in the demonstration below should pop up:
Note: The model is still occasionally inaccurate and can exit under input overload, so just run it again if it crashes unexpectedly (it will happen).
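Because per-frame predictions can flicker between letters, a simple majority vote over the last few frames often stabilizes the displayed output. This smoother is an illustration of that idea, not part of the repo:

```python
from collections import Counter, deque

class PredictionSmoother:
    """Report the most common label among the last `window` frames,
    which suppresses one-frame misclassifications."""

    def __init__(self, window=15):
        self.recent = deque(maxlen=window)

    def update(self, label):
        self.recent.append(label)
        return Counter(self.recent).most_common(1)[0][0]

# Feed it a noisy sequence: the stray "B" is outvoted.
smoother = PredictionSmoother(window=5)
for label in ["A", "A", "B", "A", "A"]:
    current = smoother.update(label)
print(current)  # 'A' wins the 5-frame vote
```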
