MetaSynAI is an AI-powered accessibility framework that combines voice commands, hand gestures, and eye-tracking for futuristic and inclusive user interaction.
- 🗣️ Voice Assistant – Control applications via voice commands (a minimal capture sketch follows this list).
- ✋ Hand Gesture Recognition – Perform tasks using hand movements.
- 👁️ Eye-Tracking Navigation – Interact using eye gaze.
- 🌐 Modern Web Interface – Sleek, responsive, and interactive UI.
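To give a feel for the voice-assistant piece, here is a minimal sketch of capturing a spoken command with the SpeechRecognition library. The function name and the Google recognizer backend are illustrative assumptions, not MetaSynAI's actual implementation.

```python
# Minimal voice-capture sketch using the SpeechRecognition library.
# listen_for_command() and the Google recognizer are illustrative
# assumptions, not this project's actual code.
import speech_recognition as sr

def listen_for_command() -> str:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        # Calibrate for background noise before listening
        recognizer.adjust_for_ambient_noise(source, duration=0.5)
        audio = recognizer.listen(source)
    # Uses Google's free web recognizer; raises sr.UnknownValueError
    # if the speech cannot be transcribed
    return recognizer.recognize_google(audio)

if __name__ == "__main__":
    print("Heard:", listen_for_command())
```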
- Frontend: HTML, CSS, JavaScript
- Backend: Python (Flask)
- ML/AI: TensorFlow, OpenCV
- Voice: SpeechRecognition API
- Eye-Tracking: Dlib, OpenCV (see the sketch below)
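To make the eye-tracking layer concrete, here is a rough sketch of locating the eye region with Dlib's 68-point face landmark model and OpenCV. The model file path and the drawing loop are assumptions for illustration, not code from this repository.

```python
# Rough eye-region sketch: Dlib's 68-point landmarks (indices 36-47
# cover the eyes) drawn onto a webcam feed with OpenCV.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
# Pre-trained model file, downloaded separately from dlib.net (assumed path)
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        landmarks = predictor(gray, face)
        # Points 36-47 outline the right and left eyes
        for i in range(36, 48):
            p = landmarks.part(i)
            cv2.circle(frame, (p.x, p.y), 2, (0, 255, 0), -1)
    cv2.imshow("Eye tracking (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```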
Project structure:

```
MetaSynAI/
├── assets/
├── css/
├── js/
├── templates/
│   ├── index.html
│   ├── voice-assistant.html
│   ├── hand-gestures.html
│   └── eye-gaze.html
├── app.py
├── gesture_zoom.py
├── voice-assistant-server.js
└── README.md
```
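app.py is the Flask entry point. As a minimal sketch of how it might serve the templates listed above (the route names are assumptions, not the project's actual routes):

```python
# Minimal Flask routing sketch for the templates above.
# Route paths are assumptions for illustration only.
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def index():
    return render_template("index.html")

@app.route("/voice-assistant")
def voice_assistant():
    return render_template("voice-assistant.html")

@app.route("/hand-gestures")
def hand_gestures():
    return render_template("hand-gestures.html")

@app.route("/eye-gaze")
def eye_gaze():
    return render_template("eye-gaze.html")

if __name__ == "__main__":
    app.run(debug=True, port=5000)
```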
## ⚙️ Getting Started
### 1. Clone the repo

```bash
git clone https://github.com/dj-ayush/MetaSynAI.git
cd MetaSynAI
```

### 2. Create and activate a virtual environment

```bash
python -m venv venv

# On Windows
venv\Scripts\activate

# On Mac/Linux
source venv/bin/activate
```

### 3. Install dependencies

```bash
pip install -r requirements.txt
```

### 4. Run the app

```bash
python app.py
```

Then open http://localhost:5000 in your browser.
We welcome contributions!
- Fork the repo
- Create a new branch: `git checkout -b feature-name`
- Commit your changes: `git commit -m "Added feature"`
- Push to your branch: `git push origin feature-name`
- Create a pull request 🚀
This project is licensed under the MIT License.
Built with ❤️ by @dj-ayush