Gaze stabilization is fundamental for humanoid robots. By stabilizing vision, it enhances perception of the environment and keeps regions of interest inside the field of view. In this contribution, a multimodal gaze stabilization method combining proprioceptive, inertial, and visual cues is introduced. It integrates classical inverse kinematics control with vestibulo-ocular and optokinetic reflexes. Inspired by neuroscience, our contribution implements a forward internal model that modulates the reflexes based on the reafference principle. This principle filters self-generated movements out of the reflexive feedback loop. The versatility and effectiveness of this method are experimentally validated on the ARMAR-III humanoid robot. We first demonstrate that all the stabilization mechanisms (inverse kinematics and reflexes) are complementary. Then, we show that our multimodal method, combining these three modalities with the reafference principle, provides a versatile gaze stabilizer able to handle a wide range of perturbations.
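The reafference principle described in the abstract can be sketched as follows: a forward model predicts the head motion caused by the robot's own commands, and only the unpredicted (exafferent) residual drives the reflexes. The sketch below is a minimal illustration under that assumption; the function names, the unit reflex gain, and the use of a plain angular-velocity subtraction are illustrative choices, not the paper's implementation.

```python
import numpy as np

def reafference_filter(measured_omega, predicted_omega):
    """Remove the forward model's prediction of self-generated head
    velocity from the inertial measurement, leaving only the
    externally caused (exafferent) component."""
    return measured_omega - predicted_omega

def vor_command(exafferent_omega, gain=1.0):
    """Vestibulo-ocular reflex sketch: counter-rotate the eyes
    against the residual (unpredicted) head velocity."""
    return -gain * exafferent_omega

# Voluntary head turn, fully predicted by the forward model:
# the residual is zero, so the reflex stays silent.
measured = np.array([0.0, 0.0, 0.5])     # rad/s from the gyroscope
predicted = np.array([0.0, 0.0, 0.5])    # rad/s from the forward model
eye_voluntary = vor_command(reafference_filter(measured, predicted))

# External perturbation (e.g. the base is pushed): nothing was
# predicted, so the full measurement drives the compensating reflex.
eye_perturbed = vor_command(reafference_filter(measured, np.zeros(3)))
```

With a perfect prediction the voluntary turn produces no reflexive eye command, while the unpredicted perturbation is fully compensated; this is exactly the gating behavior the abstract attributes to the reafference principle.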
Multimodal gaze stabilization of a humanoid robot based on reafferences
2017-01-01
Conference paper
Electronic Resource
English
DDC: 629
Adaptive gaze stabilization through cerebellar internal models in a humanoid robot
BASE | 2016
Head stabilization in a humanoid robot: models and implementations
British Library Online Contents | 2017