Real-time avoidance for Human-Robot Interaction


Physical-HRI-Controller

Reference paper: Compact real-time avoidance on a Humanoid Robot for Human-Robot Interaction [PDF] [BIB]

Authors: Phuong D. H. Nguyen, Matej Hoffmann, Alessandro Roncone, Ugo Pattacini, and Giorgio Metta

Submission: ACM/IEEE International Conference on Human-Robot Interaction (HRI2018), Chicago, IL, USA, March 5-8, 2018

We have developed a new framework for safe interaction of a robot with a human composed of the following main components (cf. Fig 1):

  1. a human 2D keypoint estimation pipeline employing a deep-learning-based algorithm, extended here into 3D using disparity;
  2. a distributed peripersonal space representation around the robot’s body parts;
  3. a new reaching controller that incorporates all obstacles entering the robot’s protective safety zone on the fly into the task.
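To illustrate the first component, here is a minimal sketch of how a 2D keypoint can be lifted into 3D using stereo disparity. The camera parameters below (focal length, baseline, principal point) are illustrative placeholders, not the iCub's actual calibration, and the function name is ours:

```python
import numpy as np

# Hypothetical camera parameters -- illustrative values only,
# not the robot's actual stereo calibration.
FOCAL_PX = 400.0        # focal length in pixels
BASELINE_M = 0.068      # stereo baseline in meters
CX, CY = 160.0, 120.0   # principal point in pixels

def keypoint_to_3d(u, v, disparity):
    """Back-project a 2D keypoint (u, v) into the camera frame
    using the stereo disparity (in pixels) at that location."""
    if disparity <= 0:
        return None  # no valid depth estimate at this pixel
    z = FOCAL_PX * BASELINE_M / disparity   # depth from disparity
    x = (u - CX) * z / FOCAL_PX             # back-project along x
    y = (v - CY) * z / FOCAL_PX             # back-project along y
    return np.array([x, y, z])
```

A keypoint at the principal point with a 10-pixel disparity would, under these assumed parameters, map to a point 2.72 m straight ahead of the camera.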
[Image: portfolio/PhysicalHRIController_architecture.png]

Figure 1. Software architecture for physical human-robot interaction. In this work, we develop a framework composed of: i) a human pose estimation algorithm, ii) a 2D to 3D disparity mapping, iii) a peripersonal space collision predictor, iv) a pHRI robot controller for distributed collision avoidance.

Here’s a video of the controller in action:

Controller in action. The controller can seamlessly process both tactile (physical contact) and visual (prior-to-contact) information in order to avoid obstacles while still optimizing its trajectory to accomplish the primary reaching task.
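One simple way to picture how prior-to-contact information can be folded into the reaching task is as a blend between the primary task velocity and an avoidance velocity, weighted by how strongly a threat activates the robot's protective zone. This is only an illustrative sketch of the idea, not the paper's actual controller (which solves the reaching task and avoidance jointly); the function name and blending rule are ours:

```python
import numpy as np

def blended_command(v_task, v_avoid, activation):
    """Blend the primary reaching velocity with an avoidance velocity.

    activation in [0, 1]: 0 means no threat (pure reaching),
    1 means imminent collision (pure avoidance). Tactile contact
    can be treated as activation = 1 at the touched body part.
    """
    activation = float(np.clip(activation, 0.0, 1.0))
    return (1.0 - activation) * v_task + activation * v_avoid
```

With no threat the command reduces to the reaching velocity; as an obstacle closes in, the avoidance direction smoothly takes over.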

The main novelty lies in the formation of the protective safety margin around the robot’s body parts—in a distributed fashion and adhering closely to the robot structure—and its use in a reaching controller that dynamically incorporates threats in its peripersonal space into the task. The framework is tested in real experiments that demonstrate the effectiveness of this approach in protecting both human and robot against collisions during the interaction. Our solution is compact, self-contained (the on-board stereo cameras in the robot’s head are the only sensor), and flexible, as different modulations of the defensive peripersonal space are possible—here we demonstrate stronger avoidance of the human head compared to the rest of the body.
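The per-body-part modulation of the defensive margin can be sketched as a distance-based activation around each skin taxel, with a larger gain for obstacles classified as the human head. The linear activation profile, the 0.45 m outer radius, the gain values, and all function names below are illustrative assumptions, not the paper's actual parameterization:

```python
import numpy as np

def ppspace_activation(dist, d_max=0.45, gain=1.0):
    """Activation in [0, 1] that grows as an obstacle approaches a taxel.

    d_max is the assumed outer radius of the protective zone (meters);
    gain > 1 inflates the effective margin for a given obstacle class.
    """
    a = max(0.0, 1.0 - dist / d_max) * gain
    return min(a, 1.0)

# Hypothetical per-obstacle-class modulation: stronger avoidance of the head.
GAINS = {"head": 2.0, "hand": 1.0}

def repulsive_velocity(taxel_pos, taxel_normal, obstacle_pos, part):
    """Avoidance velocity for one taxel: retreat along the (unit) outward
    surface normal, scaled by how strongly the obstacle activates it."""
    dist = np.linalg.norm(obstacle_pos - taxel_pos)
    a = ppspace_activation(dist, gain=GAINS[part])
    return -a * taxel_normal  # m/s, before any controller-side scaling
```

Under these assumed gains, a head at a given distance triggers twice the activation of a hand at the same distance, so the robot retreats from the head earlier and more strongly.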

About


Personal website of Alessandro Roncone, Ph.D. in robotics, computer scientist, team lead, father, and runner. In August 2018, he became Assistant Professor of Robotics in the CS Department at CU Boulder, where he directs the Human Interaction and RObotics [HIRO] Group. Since January 2022, Alessandro has also been Chief Technology Officer of Lab0.

Alessandro has extensive research experience in robotics. He worked with the iCub robot, one of the most advanced humanoid robots, on machine perception, robot control, and artificial intelligence. His research focus is human-robot interaction: with the goal of creating robots that work with and around people, he focuses on motion planning, artificial intelligence, and robot control. This website is no longer actively maintained! Please visit my lab's website [LINK]