The Humanoid Sensing and Perception group is part of the Istituto Italiano di Tecnologia.
Our group studies algorithms and technologies that allow robots to sense the environment and react appropriately. Our strategy is to exploit the capability of robots to learn under human guidance or from interaction with the environment, combining multiple sources of information (e.g. proprioception, vision, touch, and audition). We work on visual and tactile perception for robot navigation and object manipulation. We also develop software tools and study software integration methodologies for the development of complex behaviors.
Our platforms are the iCub and R1 humanoid robots; we focus on applications in the domain of service robotics.
Check out our page on IIT's website.
Whenever possible we make code related to our research available to the community with open-source licenses.
- ⚙️ On-line detection and segmentation
- On-line Object Detection and Instance Segmentation project.
- ⚙️ ROFT
- Real-time Optical Flow-aided 6D Object Pose and Velocity Tracking.
- ⚙️ MASK-UKF
- Instance Segmentation Aided 6D Object Pose and Velocity Tracking using an Unscented Kalman Filter.
- ⚙️ Fast-YCB Dataset
- An annotated dataset for 6D object tracking, with fast-moving YCB objects.
- ⚙️ Digit simulator for Gazebo
- A tentative C++ wrapper for the Python-based Digit tactile sensor simulation.
- ⚙️ Tracking sliding objects with tactile feedback
- A differentiable Extended Kalman Filter for object tracking under a sliding regime.
- 📚 Visualization for grasp candidates
- A barebones library to visualize simple manipulation environments.
- ⚙️ Robot environment for pybullet
- A Python package that collects robotic environments based on the PyBullet simulator.
- ⚙️ GRASPA Benchmark
- A grasping benchmark for comparing grasp planners across different robot platforms.
- ⚙️ YARP
- Yet Another Robot Platform, our middleware; check out the official documentation page.
- ⚙️ YCM
- Extra CMake Modules for YARP and Friends; check out the official documentation page.
- ⚙️ visual-tracking-control
- A suite of cross-platform applications for visual tracking and visual servoing on the iCub humanoid robot.
- ⚙️ navigation
- A collection of modules to perform 2D navigation with a YARP-based robot.
- ⚙️ Cardinal points grasping
- A simple superquadric-based grasping pose generator for the iCub.
- ⚙️ Superquadric fitting
- Solves an optimization problem to find the superquadric that best fits a given partial point cloud.
- ⚙️ On the Fly recognition
- This demo lets you teach the iCub to visually recognize new objects "on the fly".
- ⚙️ himrep
- A collection of modules to extract features from images or perform classification tasks on feature vectors.
- ⚙️ r1-grasping
- Grasping on the R1 robot.
- ⚙️ point-cloud-read
- Acquire point clouds of specific objects in the scene in order to save or stream them.
- ⚙️ superquadric-grasp-demo
- Object modeling and grasping with superquadrics and visual servoing.
- ⚙️ tactile-control
- Improve grasp stability using tactile feedback.
- 📚 superquadric-lib
- A YARP-free library for computing and visualizing the superquadric representation of an object and the corresponding grasping candidates for a generic robot.
- 📚 superimpose-mesh-lib
- An augmented-reality library to superimpose 3D objects on images.
- 📚 bayes-filters-lib
- A recursive Bayesian estimation library.
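
Several of the projects above (superquadric fitting, superquadric-lib, the grasping demos) build on the classic superquadric inside-outside function. As a flavor of the underlying math, here is a minimal, self-contained sketch in Python; the function names and the exact cost form are illustrative only, not the API of any of the repositories listed.

```python
import math

def inside_outside(point, size, shape):
    """Superquadric inside-outside function F(x, y, z).

    F < 1 inside the surface, F == 1 on it, F > 1 outside.
    size  = (a1, a2, a3): semi-axes along x, y, z.
    shape = (e1, e2): roundness exponents (1, 1 gives an ellipsoid).
    """
    x, y, z = point
    a1, a2, a3 = size
    e1, e2 = shape
    xy = (abs(x / a1) ** (2.0 / e2) + abs(y / a2) ** (2.0 / e2)) ** (e2 / e1)
    return xy + abs(z / a3) ** (2.0 / e1)

def fitting_cost(points, size, shape):
    """Least-squares fitting cost over a point cloud:
    sum of (sqrt(a1*a2*a3) * (F^e1 - 1))^2.

    The volume factor sqrt(a1*a2*a3) biases the optimizer toward
    the smallest superquadric enclosing the points.
    """
    a1, a2, a3 = size
    e1, _ = shape
    scale = math.sqrt(a1 * a2 * a3)
    return sum((scale * (inside_outside(p, size, shape) ** e1 - 1.0)) ** 2
               for p in points)
```

A fitting procedure would minimize `fitting_cost` over the size, shape, and pose parameters (pose is omitted here for brevity); points lying exactly on the surface contribute zero cost.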
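
Recursive Bayesian estimation, the theme behind bayes-filters-lib, MASK-UKF, and ROFT, alternates a prediction step (propagate the belief through a motion model) with an update step (correct it with a measurement). A minimal 1D Kalman filter illustrates the idea; this is a textbook sketch, not the interface of bayes-filters-lib.

```python
def kalman_step(mean, var, z, process_var, meas_var):
    """One predict-update cycle of a 1D Kalman filter for a
    random-walk state: x_k = x_{k-1} + w,  z_k = x_k + v."""
    # Predict: the random-walk model leaves the mean unchanged
    # and inflates the variance by the process noise.
    pred_mean = mean
    pred_var = var + process_var
    # Update: blend prediction and measurement via the Kalman gain.
    gain = pred_var / (pred_var + meas_var)
    new_mean = pred_mean + gain * (z - pred_mean)
    new_var = (1.0 - gain) * pred_var
    return new_mean, new_var
```

Feeding repeated measurements of a constant value pulls the mean toward that value while the posterior variance shrinks; the same predict-update skeleton generalizes to the multivariate and unscented variants used in the trackers above.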