Welcome to the Nao Learning From Demonstrations repository! This project integrates robotic control, machine learning, and simulation environments for real-world and virtual robotic tasks. From motion prediction to grasping and pushing simulations, this repository provides all the necessary tools and scripts.
This repository is organized into several modules and folders, each designed to handle specific aspects of robotic control and simulation:
- KinectSkeleton: Processes skeleton data captured from Kinect sensors for motion mapping.
- Notebooks: Jupyter notebooks for training machine learning models.
- PostProcessing: Prepares and cleans datasets for training and evaluation.
- RealWorldScripts: Integrates models and robotic control for real-world tasks.
- Simulation: Scripts and scene files for simulating robotic interactions in CoppeliaSim.
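As an illustration of the motion mapping that `KinectSkeleton` performs, joint angles can be derived from the raw 3-D joint positions a Kinect reports. This is a minimal sketch; the function name and the coordinates are hypothetical and not taken from the repository's code:

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle (radians) at `joint` between the segments toward `parent` and `child`."""
    u = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to guard against floating-point values just outside [-1, 1].
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Elbow angle from made-up shoulder/elbow/wrist positions (metres).
angle = joint_angle([0.0, 1.4, 0.0], [0.0, 1.1, 0.1], [0.2, 1.1, 0.3])
```

Angles like these, rather than raw positions, are what a robot's joint controllers can consume directly.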
- Train machine learning models to predict robotic arm movements for tasks such as grasping and pushing.
- CoppeliaSim scenes for realistic simulations of grasping and pushing tasks, with detailed configurations.
- Scripts for cleaning, interpolating, and labeling data.
- Jupyter notebooks for model development and evaluation.
- Integration with robotic arms and Kinect sensors, including endpoints for model predictions and robotic control.
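The cleaning and interpolation step can be pictured with a small sketch: sensor dropouts leave NaN samples in a trajectory channel, which are filled by linear interpolation between the valid neighbours. The names and data below are illustrative, not the repository's actual pipeline:

```python
import numpy as np

def fill_gaps(t, values):
    """Linearly interpolate NaN samples in a 1-D trajectory channel."""
    values = np.asarray(values, dtype=float)
    mask = np.isnan(values)
    filled = values.copy()
    # np.interp evaluates the piecewise-linear fit through the valid samples.
    filled[mask] = np.interp(t[mask], t[~mask], values[~mask])
    return filled

t = np.arange(5.0)
channel = np.array([0.0, np.nan, 2.0, np.nan, 4.0])
# fill_gaps(t, channel) -> [0., 1., 2., 3., 4.]
```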
1. Clone the repository:

   ```bash
   git clone https://github.com/TaarLab/NaoLfD.git
   cd NaoLfD
   ```

2. Set up dependencies:

   - Install the Python requirements:

     ```bash
     pip install -r requirements.txt
     ```

   - Install CoppeliaSim.

3. Explore the modules:

   - Train models in `Notebooks` or process data in `PostProcessing`.

4. Real-world integration:

   - Set up Kinect sensors or robotic arms and run the scripts in `RealWorldScripts`.