- A complete system to control a drone using hand gestures. The following video shows the result of this research:
- The proposed system consists of three modules:
- Hand Detector: an SSD deep neural network detector is used to recognize and localize the hands. The dataset was collected and labelled for this project. It contains ~3200 samples acquired in indoor and outdoor environments with one and two hands. The dataset is available in this repository: HandsDataset
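SSD detectors typically emit many candidate boxes with confidence scores, which must be filtered down to the one or two hands actually in frame. The repository does not document its post-processing, so the following is a minimal sketch under that assumption; the function name and output format are hypothetical.

```python
# Hypothetical post-processing for raw SSD hand detections: keep at most the
# two highest-scoring boxes above a confidence threshold. Boxes are assumed
# to be (x1, y1, x2, y2) tuples; the threshold value is an assumption.

def filter_hand_detections(boxes, scores, score_threshold=0.5, max_hands=2):
    """Return up to max_hands (box, score) pairs above the threshold,
    ordered by descending confidence."""
    ranked = sorted(zip(scores, boxes), key=lambda p: p[0], reverse=True)
    kept = [(box, score) for score, box in ranked if score >= score_threshold]
    return kept[:max_hands]
```

For example, with three detections scored 0.9, 0.7, and 0.2, only the first two survive the default 0.5 threshold.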
- Gestures Recognizer: an image processing algorithm is developed to recognize the gestures. The user controls the drone much like steering a car, using a virtual wheel. The arm-and-takeoff gesture is shown in the following figure.
Some samples of the movement gestures:
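The virtual-wheel idea can be illustrated by deriving a steering angle from the line connecting the two detected hand centres, as if the user were gripping a wheel. The project's exact mapping is not documented, so this is an assumed sketch; the function name and coordinate convention are hypothetical.

```python
import math

# Hypothetical virtual-wheel computation: the roll/steering command is taken
# from the angle of the line between the two hand centres (x, y) in image
# coordinates. Tilting the "wheel" changes the sign and magnitude of the angle.

def wheel_roll_angle(left_hand, right_hand):
    """Angle in degrees of the line from the left hand centre to the right;
    0 means hands level, positive means the right hand is lower (tilt)."""
    dx = right_hand[0] - left_hand[0]
    dy = right_hand[1] - left_hand[1]
    return math.degrees(math.atan2(dy, dx))
```

With both hands level the angle is 0; raising one hand produces a signed angle that can be scaled into a roll or yaw command.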
- Drone Controller: the ArduPilot system is used in this module. A simple controller is built using the DroneKit library and MAVLink messages.
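One way such a controller can translate a recognized gesture into a MAVLink velocity setpoint is a simple lookup table in the NED frame (x forward, y right, z down). The gesture names and the default speed below are assumptions for illustration, not the project's actual command set.

```python
# Hypothetical mapping from a recognized gesture to a (vx, vy, vz) velocity
# tuple in m/s, NED frame. With DroneKit, such a tuple would typically be
# packed into a SET_POSITION_TARGET_LOCAL_NED message via
# vehicle.message_factory and sent with vehicle.send_mavlink.

SPEED = 1.0  # m/s, assumed default speed

GESTURE_TO_VELOCITY = {
    "forward":  (SPEED, 0.0, 0.0),
    "backward": (-SPEED, 0.0, 0.0),
    "right":    (0.0, SPEED, 0.0),
    "left":     (0.0, -SPEED, 0.0),
    "up":       (0.0, 0.0, -SPEED),  # NED: negative z is upward
    "down":     (0.0, 0.0, SPEED),
    "hover":    (0.0, 0.0, 0.0),
}

def velocity_for_gesture(gesture):
    """Return the (vx, vy, vz) command for a gesture; hover if unrecognized."""
    return GESTURE_TO_VELOCITY.get(gesture, (0.0, 0.0, 0.0))
```

Keeping the mapping in a table makes it easy to retune speeds or add gestures without touching the MAVLink send path.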
- A research paper on this work is under review.
- The thesis of this work will be submitted to the University of Oklahoma at the end of November and will be available online.
- The system was tested on a CPU and runs in real time.
- Everyone is welcome to contribute 🍬 🍩 🍨 .
- Star the repository ⭐ 😉.
- Installation instructions.
- Upload dataset.
- Upload trained model.
- Autopilot repo.
- GUI.
- Thesis link 📘 .
- Paper link 📄.
If you want to cite this work, use the following:
Soubhi Hadri, Hand Gestures for Drone Control Using Deep Learning, GitHub repository, https://github.com/SubhiH/HandGesturesDroneController
@misc{Soubhi2018,
author = {Soubhi, Hadri},
title = {Hand Gestures for Drone Control Using Deep Learning},
year = {2018},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/SubhiH/hand_gesture_controller}}
}
Special thanks to:
You can find the detailed list of resources at the end of the thesis.
HandGesturesDroneController is released under the Apache license. See LICENSE for more information.