Brain-controlled robots are a promising new type of assistive device for severely impaired persons. Little is known, however, about how to optimize the interaction between humans and brain-controlled robots. Information about the human's perceived correctness of robot performance might provide a useful teaching signal for adaptive control algorithms and thus help enhance robot control. Here, we studied whether watching robots perform erroneous vs. correct actions elicits differential brain responses that can be decoded from single trials of electroencephalographic (EEG) recordings, and whether brain activity during human-robot interaction is modulated by the robot's visual similarity to a human. To address these questions, we designed two experiments. In experiment I, participants watched a robot arm pour liquid into a cup. The robot performed the action either erroneously or correctly, i.e., it either spilled some liquid or did not. In experiment II, participants observed two different types of robots, humanoid and non-humanoid, grabbing a ball. The robots either managed to grab the ball or not. In both experiments, we recorded high-resolution EEG during the observation tasks, trained a Filter Bank Common Spatial Pattern (FBCSP) pipeline on the multivariate EEG signal, and decoded the correctness of the observed action and the type of the observed robot. Our findings show that both correctness and robot type could be decoded significantly, although often only slightly, above chance level for the majority of participants. They suggest that non-invasive recordings of brain responses elicited while observing robots indeed contain decodable information about the correctness of the robot's action and the type of the observed robot. Our study also indicates that, given the relatively low decoding accuracies achieved so far, either further improvements in non-invasive recording and analysis techniques or the use of intracranial measurements of neuronal activity will be necessary for practical applications.
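
The abstract describes decoding the observed action's correctness from multi-band EEG with an FBCSP pipeline. The following is a minimal sketch of such a pipeline, assuming MNE-Python's CSP implementation and scikit-learn; the filter bands, sampling rate, classifier, and data shapes are illustrative assumptions, not the authors' actual configuration.

    # Minimal FBCSP decoding sketch (assumptions: MNE-Python's CSP and scikit-learn;
    # bands, sampling rate, and classifier are illustrative, not the authors' setup).
    import numpy as np
    from scipy.signal import butter, sosfiltfilt
    from mne.decoding import CSP
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline, FeatureUnion
    from sklearn.preprocessing import FunctionTransformer

    SFREQ = 250.0                         # assumed sampling rate (Hz)
    BANDS = [(4, 8), (8, 13), (13, 30)]   # assumed filter bank: theta, alpha, beta


    def bandpass(X, lo, hi, sfreq=SFREQ):
        """Zero-phase band-pass filter applied along the time axis of each trial."""
        sos = butter(4, [lo, hi], btype="bandpass", fs=sfreq, output="sos")
        return sosfiltfilt(sos, X, axis=-1)


    def build_fbcsp_pipeline(n_csp=4):
        """One CSP per frequency band; log-variance features are concatenated and classified."""
        band_branches = [
            (f"band_{lo}_{hi}", Pipeline([
                ("filter", FunctionTransformer(bandpass, kw_args={"lo": lo, "hi": hi})),
                ("csp", CSP(n_components=n_csp, log=True)),
            ]))
            for lo, hi in BANDS
        ]
        return Pipeline([
            ("fbcsp", FeatureUnion(band_branches)),
            ("clf", LinearDiscriminantAnalysis()),
        ])


    if __name__ == "__main__":
        # Toy data with the expected shape (trials x channels x samples);
        # in practice X would hold epoched EEG and y the correct/erroneous labels.
        rng = np.random.default_rng(0)
        X = rng.standard_normal((60, 32, 500))
        y = rng.integers(0, 2, size=60)

        scores = cross_val_score(build_fbcsp_pipeline(), X, y, cv=5)
        print(f"Mean cross-validated accuracy: {scores.mean():.2f}")

Because the band filtering and CSP fitting live inside the scikit-learn pipeline, the spatial filters are re-estimated within each cross-validation fold, avoiding label leakage into the features.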


    Title:

    Brain Responses During Robot-Error Observation


    Contributors:
    Welke, Dominik (author) / Behncke, Joos (author) / Hader, Marina (author) / Schirrmeister, Robin Tibor (author) / Schönau, Andreas (author) / Eßmann, Boris (author) / Müller, Oliver (author) / Burgard, Wolfram (author) / Ball, Tonio (author)


    Publication date:

    2017-09-26


    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English


    Classification:

    DDC: 620 / 629



    Similar items:

    Human behavioural responses to robot head gaze during robot-to-human handovers
    Zheng, Minhua / Moon, A Jung / Gleeson, Brian et al. | IEEE | 2014

    Underwater observation robot and observation method thereof
    LIU HONGTAO / CAO SONGPEI | European Patent Office | 2020 | Open access

    Ecological underwater observation robot
    LIANG XIONGWEI / SUN YUE / MENG BO et al. | European Patent Office | 2021 | Open access

    ROBOT FOR UNDERWATER OBSERVATION AND OBSERVATION METHOD THEREOF
    CHEN HUANRUO | European Patent Office | 2020 | Open access

    Mobile Robot Learning by Self-Observation
    Russell, R. A. | British Library Online Contents | 2004