Home service robots are an emerging trend in robotics. Nursing homes and assistance to elderly people are areas where robots can provide valuable help to improve the quality of life of those who need it most. Calling a robot can be a daunting task for an elderly person if their voice is failing and battery-operated devices fail to comply. Using a simple mechanical apparatus, such as a click trainer for dogs, a person can call a robot by pressing the button of a powerless device. The high-pitched sound produced by this device can be captured and tracked in order to estimate the person's location within a room. This paper describes a method that provides good accuracy and uses simple, low-cost technology in order to supply an efficient position estimate for an assistance robot to attend its caller. The robot does not need to search for the person in a room, as it can travel directly towards the click's sound source.
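The abstract does not detail the localization algorithm itself. One common approach for locating a sharp, high-pitched click with a small microphone array (not necessarily the one used in the paper) is time-difference-of-arrival (TDOA) estimation via cross-correlation. The sketch below is illustrative only; the function names, microphone spacing, and synthetic click signal are assumptions, not the paper's method:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def tdoa_cross_correlation(sig_a, sig_b, fs):
    """Estimate the time difference of arrival (seconds) of the same
    sound at two microphones. Negative means the sound reached
    microphone A before microphone B."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # lag in samples, A relative to B
    return lag / fs

def bearing_from_tdoa(tdoa, mic_spacing):
    """Convert a TDOA into a bearing angle (radians) relative to the
    broadside of a two-microphone array with the given spacing (m)."""
    ratio = np.clip(SPEED_OF_SOUND * tdoa / mic_spacing, -1.0, 1.0)
    return np.arcsin(ratio)

# Synthetic example: a short windowed 8 kHz click that reaches
# microphone A five samples before microphone B.
fs = 44_100
click = np.hanning(64) * np.sin(2 * np.pi * 8_000 * np.arange(64) / fs)
sig_a = np.zeros(1024); sig_a[100:164] = click
sig_b = np.zeros(1024); sig_b[105:169] = click

tdoa = tdoa_cross_correlation(sig_a, sig_b, fs)
print(bearing_from_tdoa(tdoa, mic_spacing=0.2))
```

With two such bearings from microphone pairs at known positions, the source can be triangulated to a point in the room; the clipping in `bearing_from_tdoa` guards against TDOAs that are physically impossible for the given spacing.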





    Title :

    Tracking sound source localization for a home robot application



    Publication date :

    2016-02-01



    Type of media :

    Conference paper


    Type of material :

    Electronic Resource


    Language :

    English



    Classification :

    DDC:    629




    Carry luggage tracking home robot

    LEE JIN HYUK | European Patent Office | 2019


    A mobile robot with active localization and discrimination of a sound source

    Wang, F. / Takeuchi, Y. / Ohnishi, N. et al. | British Library Online Contents | 1997


    Do We Need Sound for Sound Source Localization?

    Oya, Takashi / Iwase, Shohei / Natsume, Ryota et al. | British Library Conference Proceedings | 2021


    EXTERNAL MICROPHONE ARRAYS FOR SOUND SOURCE LOCALIZATION

    CHNG CHOON PING / WU CHENG-HAN / BALACHANDRAN GANESH et al. | European Patent Office | 2023


    EXTERNAL MICROPHONE ARRAYS FOR SOUND SOURCE LOCALIZATION

    CHNG CHOON / WU DENNIS / BALACHANDRAN GANESH et al. | European Patent Office | 2022
