This article offers a brief overview of multimodal input theory (speech, touch, gaze, etc.) as it pertains to common in-vehicle tasks and devices. After a short introduction, we walk through a sample multimodal interaction, detailing the steps involved and how the information the interaction requires can be obtained by combining input modes in various ways. We also discuss how contemporary in-vehicle systems take advantage of multimodality (or fail to do so), and how the capabilities of such systems might be broadened in the future via clever multimodal input mechanisms.
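The kind of combination the abstract describes can be illustrated with a minimal sketch: a deictic spoken command (e.g. "call him") is underspecified on its own, but fusing it with a near-simultaneous gaze or touch event resolves the referent. All names, the event structure, and the fusion window below are illustrative assumptions, not the paper's actual design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InputEvent:
    modality: str      # "speech", "touch", or "gaze"
    timestamp: float   # seconds since session start
    payload: dict      # modality-specific content

def fuse(speech: InputEvent, pointer: InputEvent,
         window: float = 1.5) -> Optional[dict]:
    """Combine a deictic speech command with the most recent gaze/touch
    target, provided the two events fall within a short time window."""
    if abs(speech.timestamp - pointer.timestamp) > window:
        return None  # too far apart to belong to one interaction
    command = speech.payload["intent"]    # e.g. "call"
    referent = pointer.payload["target"]  # e.g. "contact:Anna"
    return {"action": command, "object": referent}

# Driver says "call him" while glancing at a contact on the screen:
speech = InputEvent("speech", 10.2, {"intent": "call"})
gaze = InputEvent("gaze", 9.8, {"target": "contact:Anna"})
print(fuse(speech, gaze))  # {'action': 'call', 'object': 'contact:Anna'}
```

A fixed time window is only one of several fusion strategies; situation-aware systems of the kind the paper discusses would additionally weight the modes by driving context (e.g. preferring speech over touch at high speed).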





    Title: Situation-Aware, User-Centric Multimodality for Automotive

    Contributors:

    Conference: AmE 2011 - Automotive meets Electronics - Beiträge der 2. GMM-Fachtagung; 2011; Dortmund, Germany

    Publication date: 2011-01-01

    Size: 4 pages

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Situation-Aware, User-Centric Multimodality for Automotive

    Müller, Christian | Tema Archive | 2011


    Multimodality

    Nobis, Claudia | Transportation Research Record | 2007


    SITUATION AWARE PERSONAL ASSISTANT

    Joo, Tae Hong / Elabbady, Tarek Z. / Habib, Mona Soliman | European Patent Office | 2019


    Towards information centric automotive system architecture

    Teepe, G. / Remboski, D. / Baker, R. et al. | Automotive engineering | 2002


    Towards Information Centric Automotive System Architectures

    Teepe, G. / Remboski, D. / Baker, R. et al. | British Library Conference Proceedings | 2002