Get Started
Create the object:
GestureVariationFollower* myGvf = new GestureVariationFollower(ns, sigs, icov, resThresh, nu);
For each new template to be added to the vocabulary, first call:
myGvf->addTemplate();
To fill the currently learned template N with its gesture data, call:
myGvf->fillTemplate(N, data);
Once at least one gesture has been added to the vocabulary, you can start following. At the beginning of each new gesture to be recognized, restart the GVF by spreading the particles:
myGvf->spreadParticles(means, ranges);
For each new gesture sample:
myGvf->infer(sample);
Once the inference has been performed for the current sample, the recognized gesture together with the adapted variations can be obtained with:
M = myGvf->getEstimatedStatus();
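Putting the calls above together, a minimal end-to-end sketch might look as follows. The header name, argument types, and the helper data (`recordedGesture`, `liveGesture`, and the concrete `sigs`, `means`, and `ranges` values) are assumptions for illustration; adapt them to your build and data:

```cpp
#include <vector>
#include "GestureVariationFollower.h" // assumed header name

// recordedGesture: samples of a pre-recorded template
// liveGesture: samples of the incoming gesture to follow
void runGvf(const std::vector<std::vector<float>>& recordedGesture,
            const std::vector<std::vector<float>>& liveGesture)
{
    int ns = 200;                        // ~200 particles per template
    std::vector<float> sigs(4, 1e-5f);   // step lengths for phase, speed, scale, angle
    float icov = 1.0f / 0.04f;           // data assumed between 0 and 1
    GestureVariationFollower* myGvf =
        new GestureVariationFollower(ns, sigs, icov, ns / 5, 0.0f);

    // Learning: add template 0 and fill it sample by sample
    myGvf->addTemplate();
    for (const auto& data : recordedGesture)
        myGvf->fillTemplate(0, data);

    // Following: restart by spreading particles, then infer per sample
    std::vector<float> means  = {0.0f, 1.0f, 1.0f, 0.0f}; // start at original speed/scale
    std::vector<float> ranges = {0.1f, 0.2f, 0.2f, 0.1f}; // illustrative spread widths
    myGvf->spreadParticles(means, ranges);
    for (const auto& sample : liveGesture) {
        myGvf->infer(sample);
        auto M = myGvf->getEstimatedStatus(); // recognized gesture + adapted variations
    }
    delete myGvf;
}
```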
Parameters
GestureVariationFollower* myGvf = new GestureVariationFollower(ns, sigs, icov, resThresh, nu);
- `ns`: number of particles (~200 per template)
- `sigs`: incremental step length for each feature, given as a vector the size of the number of variation features to be estimated (typically 4). For instance, allowing large variation in scale would mean setting its sig value to 0.01; conversely, constraining the evolving scaling would mean a sig value of 0.0000001.
- `icov`: e.g. a value of 1/(0.04) for data between 0 and 1
- `resThresh`: set to ns/5
- `nu`: set to 0
myGvf->spreadParticles(means, ranges);
(particles are spread uniformly over each feature: phase, speed, scale, and angle)
- `means`: mean values for spreading. For instance, if the live gesture is expected to start at the original speed before varying, the mean value for the speed feature will be 1.
- `ranges`: range of spreading around each mean value.