
NativeSensors/EyeGesturesLite


EYEGESTURES

made in PL

EyeGestures is an open source eye-tracking software library that uses native webcams and phone cameras. The aim of the library is to make eye tracking and eye-driven interfaces accessible without requiring expensive hardware.

Our Mission!

EyeGesturesLite

EyeGesturesLite is the JavaScript implementation of the EyeGestures algorithm. If you need the Python version, check the original repository.

How does it work?

It is a gaze tracker that uses machine learning and built-in cameras (such as a webcam) to provide gaze tracking for the user. It includes a built-in calibration mode that displays 20 red circles for the user to focus on, along with a blue cursor that follows the user's gaze. During calibration, the cursor gradually follows the user's gaze more and more closely; by the end of the 20 points, it should be able to follow the gaze independently.
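Even after calibration, raw gaze estimates tend to jitter. A common post-processing step (our illustration, not part of EyeGesturesLite) is to smooth the incoming points with a small moving average before positioning anything on screen. The `makeGazeSmoother` helper below is a hypothetical sketch:

```javascript
// Minimal sketch: smooth raw [x, y] gaze points with a fixed-size
// moving average. A generic idea, not an EyeGesturesLite API.
function makeGazeSmoother(windowSize = 5) {
  const buffer = []; // most recent points, oldest first
  return function smooth(point) {
    buffer.push(point);
    if (buffer.length > windowSize) buffer.shift();
    const sum = buffer.reduce(
      (acc, p) => [acc[0] + p[0], acc[1] + p[1]],
      [0, 0]
    );
    return [sum[0] / buffer.length, sum[1] / buffer.length];
  };
}
```

Each point delivered by the library's gaze callback could be passed through `smooth` before being used, trading a little latency for a steadier cursor.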

⚙️ Try:

EyeGesturesLite

🔧 Build your own:

CDN:

  1. Include the external dependency CDNs:
<script src="https://www.lactame.com/lib/ml/6.0.0/ml.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/mathjs/11.8.0/math.min.js"></script>
  2. Add the two EyeGestures CDN links:
<link rel="stylesheet" href="https://eyegestures.com/eyegestures.css">
<script src="https://eyegestures.com/eyegestures.js"></script>
  3. Place a video element (which can be hidden) somewhere in the page, together with status and error divs (these can also stay hidden):
<video id="video" width="640" height="480" autoplay style="display: none;"></video>
<div id="status" style="display: none;">Initializing...</div>
<div id="error" style="display: none;"></div>
  4. Then add the JavaScript interface code:
<script>

function onPoint(point, calibration) {
    const x = point[0]; // horizontal gaze coordinate
    const y = point[1]; // vertical gaze coordinate
    // calibration: true for calibrated data, false while calibration is ongoing
}

const gestures = new EyeGestures('video', onPoint);
// gestures.invisible(); // disables the blue tracker
gestures.start();
</script>
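Once `onPoint` is receiving coordinates, a natural next step is to hit-test the gaze against the page. The sketch below is our illustration, not part of the library, and it assumes the delivered coordinates are viewport pixels (verify this for your setup); `clampToViewport` is a hypothetical helper:

```javascript
// Sketch: clamp a gaze point into the viewport so hit-testing stays in bounds.
// Assumes point coordinates are viewport pixels — an assumption, not a
// documented guarantee of EyeGesturesLite.
function clampToViewport(point, width, height) {
  const clamp = (v, max) => Math.min(Math.max(v, 0), max);
  return [clamp(point[0], width - 1), clamp(point[1], height - 1)];
}

// Hypothetical browser usage inside onPoint:
// const [x, y] = clampToViewport(point, window.innerWidth, window.innerHeight);
// const target = document.elementFromPoint(x, y);
// if (calibration && target) target.classList.add('gazed');
```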

NPM package [WiP]:

  1. Install the npm package:
npm install eyegestures
  2. Place a video element (which can be hidden) in your DOM, together with status and error divs (these can also stay hidden):
<video id="video_element_id" width="640" height="480" autoplay style="display: none;"></video>
<div id="status" style="display: none;">Initializing...</div>
<div id="error" style="display: none;"></div>
  3. Then import and construct:
import EyeGestures from 'eyegestures';

const gestures = new EyeGestures('video_element_id', (point, calibration) => {
    /* point and calibration behave as in the CDN example */
});
// then call: gestures.start();
console.log(gestures);
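Eye-driven interfaces usually need more than a cursor; a common pattern is dwell-based selection, where looking at one spot long enough triggers a click. The class below is a generic sketch of that pattern, not an EyeGestures API; timestamps are passed in explicitly so it can be exercised outside the browser:

```javascript
// Sketch of dwell-based selection: report a "click" when the gaze stays
// within `radius` pixels of where the dwell started for `dwellMs` ms.
class DwellDetector {
  constructor(radius = 50, dwellMs = 800) {
    this.radius = radius;
    this.dwellMs = dwellMs;
    this.anchor = null; // [x, y] where the current dwell started
    this.startMs = 0;
  }
  // Feed each gaze point plus a timestamp; returns true once per dwell.
  update(point, nowMs) {
    if (
      this.anchor &&
      Math.hypot(point[0] - this.anchor[0], point[1] - this.anchor[1]) <= this.radius
    ) {
      if (nowMs - this.startMs >= this.dwellMs) {
        this.anchor = null; // fire once, then re-arm on the next point
        return true;
      }
      return false;
    }
    this.anchor = point;
    this.startMs = nowMs;
    return false;
  }
}
```

In the gaze callback you might call `detector.update(point, performance.now())` and dispatch a click on the element under the gaze when it returns true.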

Warning

EyeGestures needs the DOM to operate, and its constructor expects the id string of the video element providing the camera feed.
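Because the constructor needs that element to already exist, one defensive pattern (our sketch, using the element ids from the markup above) is to check for the required ids on DOM ready before constructing; `requiredIdsPresent` is a hypothetical helper with the lookup injected so it can be tested outside the browser:

```javascript
// Sketch: verify the required element ids exist before constructing.
// `getById` is injected so the check does not itself require a DOM.
function requiredIdsPresent(getById, ids) {
  return ids.every((id) => getById(id) !== null);
}

// Hypothetical browser usage (ids taken from the markup in the steps above):
// document.addEventListener('DOMContentLoaded', () => {
//   const lookup = (id) => document.getElementById(id);
//   if (requiredIdsPresent(lookup, ['video', 'status', 'error'])) {
//     const gestures = new EyeGestures('video', onPoint);
//     gestures.start();
//   } else {
//     console.error('EyeGestures: missing video/status/error elements');
//   }
// });
```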

Rules of use

You can use it free of charge as long as you keep our logo. If you want to remove the logo, contact: contact@eyegestures.com.

📇 Find us:

Troubleshooting:

💻 Contributors

💵 Support the project

We will be extremely grateful for your support: it helps keep the server running and fuels our brains with coffee.

Support the project on Polar (in exchange we provide access to alpha versions!):

Subscribe on Polar

