Interactive Objects Learning
Dependencies:
- YARP (with LUA bindings compiled)
- iCub
- icub-contrib-common
- OpenCV (3.0.0 or higher, with tracking features enabled); see the command sketch after this list:
  - Download OpenCV: `git clone https://github.com/Itseez/opencv.git`.
  - Check out the correct branch: e.g. `git checkout 3.0.0`.
  - Download the external modules: `git clone https://github.com/Itseez/opencv_contrib.git`.
  - Check out the same branch: e.g. `git checkout 3.0.0`.
  - Configure OpenCV by filling in `OPENCV_EXTRA_MODULES_PATH` with the path to `opencv_contrib/modules` and then toggling on `BUILD_opencv_tracking`.
  - Compile OpenCV.
- LUA
- rFSM (just clone it; we don't need to compile it)
- segmentation
- Hierarchical Image Representation
- stereo-vision
- speech
- Multiple Instance Boosting (optional, required to compile `milClassifier`)
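For convenience, the OpenCV steps above can be condensed into the following sketch; the version tag, the build directory layout, the `make -j4` invocation, and the `/path/to/...` placeholder are assumptions to adapt to your own setup:

```
# clone OpenCV and the extra (contrib) modules, then check out matching versions
git clone https://github.com/Itseez/opencv.git
git clone https://github.com/Itseez/opencv_contrib.git
(cd opencv_contrib && git checkout 3.0.0)
cd opencv && git checkout 3.0.0

# configure with the contrib modules path set and the tracking module enabled
mkdir build && cd build
cmake -D OPENCV_EXTRA_MODULES_PATH=/path/to/opencv_contrib/modules \
      -D BUILD_opencv_tracking=ON ..

# compile
make -j4
```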
Remember to export the environment variable `LUA_PATH` with the paths to the LUA scripts located in the `rFSM` and `iol` directories, and add those directories to the `PATH` as well.
Example:
```
export LUA_PATH=";;;/path_to/rFSM/?.lua;$ICUBcontrib_DIR/share/ICUBcontrib/contexts/iol/lua/?.lua"
export PATH=$PATH:/path_to/rFSM/tools:$ICUBcontrib_DIR/share/ICUBcontrib/contexts/iol/lua
```
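To make these settings persistent across sessions, you would typically append the two exports to your shell startup file (e.g. `~/.bashrc`).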
Online documentation is available here: http://robotology.github.com/iol.
A video showing the recognition and interaction capabilities achieved by means of IOL components can be seen here.
Material included here is Copyright of iCub Facility - Istituto Italiano di Tecnologia and is released under the terms of the GPL v2.0 or later. See the file LICENSE for details.