
ViSP: Open source Visual Servoing Platform


Platform                                    Build Status
Ubuntu 20.04, 22.04 (amd64)                 ubuntu dep apt workflow, ubuntu dep src workflow
macOS 13 and 14                             macos workflow
iOS on macOS 11.0                           ios workflow
Windows 10                                  build status
Other arch: Ubuntu 22.04 (aarch64, s390x)   other arch workflow
ROS1 Noetic, Ubuntu 20.04 Focal             build status
ROS2 Humble, Ubuntu 22.04 Jammy             build status
ROS2 Iron, Ubuntu 22.04 Jammy               build status
ROS2 Rolling, Ubuntu 22.04 Jammy            build status
Valgrind                                    valgrind workflow
Sanitizer                                   sanitizers workflow
Code coverage                               code coverage

Other Projects                              Build Status
UsTK                                        macOS, Ubuntu
visp_contrib                                Ubuntu
visp_sample                                 macos workflow, ubuntu dep apt workflow
camera_localization                         ubuntu 3rdparty workflow
visp_started                                ubuntu 3rdparty workflow

ViSP is a cross-platform library (Linux, Windows, macOS, iOS, Android) for prototyping and developing applications based on visual tracking and visual servoing techniques, the core research topics of the Inria Rainbow team (known until 2018 as the Lagadic team). ViSP can compute control laws that can be applied to robotic systems. It provides a set of visual features that can be tracked using real-time image processing or computer vision algorithms, and it also offers simulation capabilities. ViSP is useful in robotics, computer vision, augmented reality, and computer animation. Our YouTube channel gives an overview of the applications it can tackle.

Citing ViSP

Please cite ViSP in your publications if it helps your research:

@article{Marchand05b,
   Author = {Marchand, E. and Spindler, F. and Chaumette, F.},
   Title = {ViSP for visual servoing: a generic software platform with a wide class of robot control skills},
   Journal = {IEEE Robotics and Automation Magazine},
   Volume = {12},
   Number = {4},
   Pages = {40--52},
   Publisher = {IEEE},
   Month = {December},
   Year = {2005}
}

To cite the generic model-based tracker:

@InProceedings{Trinh18a,
   Author = {Trinh, S. and Spindler, F. and Marchand, E. and Chaumette, F.},
   Title = {A modular framework for model-based visual tracking using edge, texture and depth features},
   BookTitle = {{IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'18}},
   Address = {Madrid, Spain},
   Month = {October},
   Year = {2018}
}

To cite the pose estimation algorithms and the hands-on survey illustrated with ViSP examples:

@article{Marchand16a,
   Author = {Marchand, E. and Uchiyama, H. and Spindler, F.},
   Title = {Pose estimation for augmented reality: a hands-on survey},
   Journal = {IEEE Trans. on Visualization and Computer Graphics},
   Volume = {22},
   Number = {12},
   Pages = {2633--2651},
   Month = {December},
   Year = {2016}
}

Resources

Contributing

Please read before starting work on a pull request: https://visp.inria.fr/contributing-code/