Renner P, Pfeiffer T, Wachsmuth I. Spatial references with gaze and pointing in shared space of humans and robots. In: Freksa C, Nebel B, Hegarty M, Barkowsky T, eds. Spatial Cognition IX. Lecture Notes in Computer Science. Vol 8684. Berlin: Springer; 2014:121-136.

Abstract: To solve tasks cooperatively in close interaction with humans, robots need timely updated spatial representations. However, perceptual information about the current position of interaction partners often arrives late. If robots could anticipate the targets of upcoming manual actions, such as pointing gestures, they would have more time to physically react to human movements and could consider prospective space allocations in their planning. Many findings support a close eye-hand coordination in humans, which could be used to predict gestures by observing eye gaze. However, effects vary strongly with the context of the interaction. We collect evidence of eye-hand coordination in a natural route planning scenario in which two agents interact over a map on a table. In particular, we are interested in whether fixations can predict pointing targets and how target distances affect the interlocutor's pointing behavior. We present an automatic method combining marker tracking and 3D modeling that provides eye and gesture measurements in real time.
Spatial references with gaze and pointing in shared space of humans and robots
2014-01-01
Article/Chapter (Book)
Electronic resource
English