We present a method for bidirectional interaction between a human and a humanoid robot through emotional expressions. The robot detects continuous transitions of human emotion, ranging from very sad to very happy, using Active Appearance Models (AAMs) and a neural evolution algorithm to determine face shape and gestures. In response to the detected emotions, the robot performs postural reactions that adapt dynamically to the human expressions, producing body language whose intensity varies with the human emotions. Our method is implemented on the HOAP-3 humanoid robot.
Funding: The research leading to these results has received funding from the COMANDER project CCG10-UC3M/DPI-5350, funded by the Comunidad de Madrid and UC3M (University Carlos III of Madrid), and the ARCADIA project DPI2010-21047-C02-01, funded by a CICYT project grant on behalf of the Spanish Ministry of Economy and Competitiveness.
Published
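The abstract describes mapping a continuous emotion estimate (from very sad to very happy) to a postural reaction whose intensity varies accordingly. As a minimal sketch of that idea only, and not the authors' actual implementation, the mapping could look like the following, where the `valence` scale in [-1, 1] and the neutral dead band are assumptions for illustration:

```python
def postural_reaction(valence, neutral_band=0.1):
    """Map a continuous emotion valence in [-1, 1] (very sad .. very happy)
    to a (direction, intensity) postural command, intensity in [0, 1].

    This is a hypothetical sketch of the idea in the abstract, not the
    paper's method: values inside a small neutral band produce no reaction;
    outside it, intensity grows linearly with |valence|.
    """
    if abs(valence) < neutral_band:
        return ("neutral", 0.0)
    direction = "happy" if valence > 0 else "sad"
    intensity = min(1.0, (abs(valence) - neutral_band) / (1.0 - neutral_band))
    return (direction, intensity)
```

For example, a mildly positive estimate yields a low-intensity "happy" posture, while an extreme estimate saturates at full intensity, so the body language changes smoothly as the detected emotion varies.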
Facial emotion recognition and adaptative postural reaction by a humanoid based on neural evolution
2013-10-01
AR/0000015745
Article (Journal)
Electronic Resource
English
DDC: 629