Grasping and manipulation of objects is an integral part of a robot's physical interaction with its environment. To cope with real-world situations, sensor-based grasping and grasp stability estimation are important skills. This thesis addresses the problem of predicting the stability of a grasp from the percepts available to a robot once the fingers have closed around the object, before any attempt to lift it. If an unstable grasp is identified, a regrasping step can be triggered. The percepts considered consist of object features (visual), gripper configurations (proprioceptive) and tactile imprints (haptic) obtained when the fingers contact the object. The thesis studies tactile-based stability estimation by applying machine learning methods such as Hidden Markov Models. An approach that integrates visual and tactile feedback using Kernel Logistic Regression models is also introduced to further improve grasp stability predictions.

Like humans, robots are expected to grasp and manipulate objects in a goal-oriented manner. In other words, objects should be grasped so as to afford subsequent actions: if the goal is to hammer a nail, the hammer should be grasped so as to afford hammering. Most work on grasping addresses only the problem of finding a stable grasp, without considering the task or action the robot is supposed to perform with the object. This thesis therefore also studies grasp stability assessment in a task-oriented way, based on a generative approach using probabilistic graphical models, namely Bayesian Networks. High-level task information introduced by a teacher in a supervised setting is integrated with low-level stability requirements acquired through the robot's own exploration. The graphical model encodes probabilistic relationships between tasks and sensory data (visual, tactile and proprioceptive). The generative modeling approach enables inference of appropriate grasping configurations as well as prediction of grasp stability. Overall, results indicate that the idea of exploiting learning approaches for ...
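
As a concrete illustration of the tactile-based stability estimation idea, the sketch below trains one Hidden Markov Model per outcome class (stable vs. unstable) on sequences of tactile features and labels a new grasp by comparing log-likelihoods under the two models. This is only a minimal sketch, not the thesis' implementation: it assumes each grasp trial is represented as a fixed-dimension tactile feature vector per time step, it uses the hmmlearn library's GaussianHMM, and the helper names (fit_class_hmm, predict_stability) are hypothetical.

# Minimal sketch (not the thesis' implementation): HMM-based grasp stability
# classification from tactile sequences. Assumes each grasp trial is a
# (T, D) array of tactile features over T time steps.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_class_hmm(sequences, n_states=5):
    """Fit one HMM to all tactile sequences of a single class (stable or unstable)."""
    X = np.concatenate(sequences)           # stack trials along the time axis
    lengths = [len(s) for s in sequences]   # per-trial lengths, required by hmmlearn
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def predict_stability(stable_hmm, unstable_hmm, sequence):
    """Label a new tactile sequence by comparing per-class log-likelihoods."""
    ll_stable = stable_hmm.score(sequence)
    ll_unstable = unstable_hmm.score(sequence)
    return "stable" if ll_stable > ll_unstable else "unstable"

# Usage with hypothetical training data (lists of (T, D) arrays):
# stable_hmm   = fit_class_hmm(stable_sequences)
# unstable_hmm = fit_class_hmm(unstable_sequences)
# print(predict_stability(stable_hmm, unstable_hmm, new_tactile_sequence))

Training one generative model per class and deciding by likelihood comparison mirrors the generative flavour of the approach; the thesis' actual tactile features, model topology and training protocol may differ.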





    Title:

    Learning to Assess Grasp Stability from Vision, Touch and Proprioception


    Contributors:

    Publication date:

    2012-01-01


    Media type:

    Thesis


    Format:

    Electronic resource


    Language:

    English



    Classification:

    DDC: 629



    Similar titles:

    Contact Information from Proprioception

    Huber, M. / Grupen, R. A. | British Library Conference Proceedings | 1993


    Modeling peripersonal action space for virtual humans using touch and proprioception

    Nguyen, Nhung / Wachsmuth, Ipke / Ruttkay, Zsofia et al. | BASE | 2009




    The Roles of Vision and Proprioception in the Planning of Reaching Movements

    Sarlegna, F.R. / Sainburg, R.L. | British Library Conference Proceedings | 2009