🎛️ MetaSynAI


MetaSynAI is an AI-powered accessibility framework that combines voice commands, hand gestures, and eye-tracking for futuristic and inclusive user interaction.


✨ Features

  • 🗣️ Voice Assistant – Control applications via voice commands (a minimal sketch follows this list).
  • ✋ Hand Gesture Recognition – Perform tasks using hand movements.
  • 👁️ Eye-Tracking Navigation – Interact using eye gaze.
  • 🌐 Modern Web Interface – Sleek, responsive, and interactive UI.
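
To illustrate the voice-assistant idea, here is a minimal sketch using the Python SpeechRecognition package. The command phrases and the choice of Google's free recognizer are assumptions for illustration only; the project may instead use the browser's Web Speech API.

```python
# Minimal voice-command sketch (illustrative, not the MetaSynAI implementation).
# Assumes the SpeechRecognition package, PyAudio, and a working microphone.
import speech_recognition as sr

recognizer = sr.Recognizer()

def listen_for_command():
    """Capture one utterance from the default microphone and return lowercase text."""
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source, duration=0.5)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio).lower()
    except (sr.UnknownValueError, sr.RequestError):
        return ""  # unintelligible speech or no network

command = listen_for_command()
if "scroll down" in command:      # hypothetical command phrase
    print("Scrolling down...")
elif "open menu" in command:      # hypothetical command phrase
    print("Opening menu...")
```

A real assistant would wrap this in a loop and forward recognized commands to the web UI.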

🛠️ Tech Stack

  • Frontend: HTML, CSS, JavaScript
  • Backend: Python (Flask) – see the route sketch after this list
  • ML/AI: TensorFlow, OpenCV
  • Voice: SpeechRecognition API
  • Eye-Tracking: Dlib, OpenCV
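
As a sketch of how the Flask backend might glue these pieces together, the snippet below serves the four pages listed in the folder structure further down. The route paths are assumptions; only the template names and port 5000 come from this README.

```python
# Hypothetical app.py sketch: one Flask route per feature page.
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def index():
    return render_template("index.html")

@app.route("/voice-assistant")
def voice_assistant():
    return render_template("voice-assistant.html")

@app.route("/hand-gestures")
def hand_gestures():
    return render_template("hand-gestures.html")

@app.route("/eye-gaze")
def eye_gaze():
    return render_template("eye-gaze.html")

if __name__ == "__main__":
    app.run(debug=True, port=5000)
```

Each route simply renders one feature page; the gesture and gaze processing would hook in behind these routes.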

📁 Folder Structure


```
MetaSynAI/
├── assets/
├── css/
├── js/
├── templates/
│   ├── index.html
│   ├── voice-assistant.html
│   ├── hand-gestures.html
│   └── eye-gaze.html
├── app.py
├── gesture_zoom.py
├── voice-assistant-server.js
└── README.md
```
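
gesture_zoom.py suggests a gesture-driven zoom. Its actual implementation isn't shown in this README, so the sketch below is a hypothetical stand-in using only OpenCV skin-color segmentation; the real script may rely on TensorFlow or a different detection method.

```python
# Hypothetical gesture-zoom loop: larger (closer) hand contour => zoom in.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Segment skin-colored pixels in HSV space (rough heuristic thresholds).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        hand = max(contours, key=cv2.contourArea)
        area = cv2.contourArea(hand)
        # Map hand area to a zoom factor and crop/resize around the center.
        zoom = 1.0 + min(area / 50000.0, 1.5)
        h, w = frame.shape[:2]
        cw, ch = int(w / zoom), int(h / zoom)
        x0, y0 = (w - cw) // 2, (h - ch) // 2
        frame = cv2.resize(frame[y0:y0 + ch, x0:x0 + cw], (w, h))

    cv2.imshow("gesture zoom (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```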

⚙️ Getting Started


1. Clone the repo

```bash
git clone https://github.com/dj-ayush/MetaSynAI.git
cd MetaSynAI
```

2. Create & activate a virtual environment

```bash
python -m venv venv
# On Windows
venv\Scripts\activate
# On Mac/Linux
source venv/bin/activate
```

3. Install dependencies

```bash
pip install -r requirements.txt
```
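
This README doesn't list the exact dependencies, so defer to the repository's requirements.txt; based on the tech stack above, it would plausibly contain something like:

```text
# Assumed package list (illustrative only) – use the repo's requirements.txt.
flask
opencv-python
numpy
dlib
SpeechRecognition
pyaudio          # microphone access for SpeechRecognition
tensorflow
```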

4. Run the app

```bash
python app.py
```

5. Open in browser

http://localhost:5000
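
The eye-gaze page is backed by Dlib and OpenCV (per the tech stack). The repository's actual gaze logic isn't shown in this README, so the following is only a hypothetical sketch of landmark-based pupil tracking; it assumes the standard shape_predictor_68_face_landmarks.dat model has been downloaded next to the script.

```python
# Hypothetical gaze-direction sketch using Dlib's 68-point facial landmarks.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

LEFT_EYE = list(range(36, 42))  # landmark indices of the left eye

def gaze_direction(gray, face):
    """Return 'left', 'right', or 'center' from the pupil's position in the left eye."""
    shape = predictor(gray, face)
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in LEFT_EYE], dtype=np.int32)
    x, y, w, h = cv2.boundingRect(pts)
    eye = gray[y:y + h, x:x + w]
    # The pupil is the darkest region; threshold and locate its centroid.
    _, thresh = cv2.threshold(eye, 55, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(thresh)
    if m["m00"] == 0:
        return "center"
    ratio = (m["m10"] / m["m00"]) / w
    if ratio < 0.40:
        return "left"
    if ratio > 0.60:
        return "right"
    return "center"

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        print(gaze_direction(gray, face))
cap.release()
```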

🤝 Contributing

We welcome contributions!

  1. Fork the repo
  2. Create a new branch: `git checkout -b feature-name`
  3. Commit your changes: `git commit -m "Added feature"`
  4. Push to your branch: `git push origin feature-name`
  5. Create a pull request 🚀

📄 License

This project is licensed under the MIT License.


Built with ❤️ by @dj-ayush
