Within this project our focus is on teaching human-like interactions to humanoid robots. We aim to learn complex interactions and have a virtual human repeat them with a person.
This project focuses on a physics-based simulator that allows kinesthetic interactions between a human and a robot to be recorded and later used for imitation learning.
In many cooperative tasks between a human and a robotic assistant, the human guides the robot by exerting forces, either through direct physical interaction or indirectly via a jointly manipulated object.
This project develops a universal remote control for a Nao robot. The framework uses the iPad’s internal accelerometer to compute the robot’s walking, turning, and gaze direction.
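The core of such a remote control is the mapping from accelerometer readings to robot velocity commands. The following is a minimal sketch of one plausible mapping, not the project's actual implementation: it assumes readings in g with the device flat giving (0, 0, 1), derives pitch and roll from the gravity vector, and normalises them to the fractional velocities expected by NAOqi's `ALMotion.moveToward(x, y, theta)`. The ±45° tilt range and the dead-zone threshold are illustrative choices.

```python
import math

def tilt_to_walk_command(ax, ay, az, dead_zone=0.1):
    """Map iPad accelerometer readings (in g) to a Nao walk command.

    Returns (x, y, theta) velocity fractions in [-1, 1], the format
    expected by NAOqi's ALMotion.moveToward. Illustrative mapping:
    pitch controls forward speed, roll controls turning.
    Assumes the device held flat reads (ax, ay, az) = (0, 0, 1).
    """
    # Pitch and roll angles recovered from the gravity vector.
    pitch = math.atan2(-ax, math.sqrt(ay ** 2 + az ** 2))
    roll = math.atan2(ay, az)

    # Normalise to [-1, 1], assuming a usable tilt range of +/-45 degrees.
    x = max(-1.0, min(1.0, pitch / (math.pi / 4)))
    theta = max(-1.0, min(1.0, roll / (math.pi / 4)))

    # Dead zone so small hand tremors do not move the robot.
    if abs(x) < dead_zone:
        x = 0.0
    if abs(theta) < dead_zone:
        theta = 0.0
    return (x, 0.0, theta)
```

On the robot side, the resulting triple would be passed to `ALMotion.moveToward(x, y, theta)` on each update; a dead zone and clamping are standard choices to keep noisy sensor input from producing jittery motion.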