We present an experimental study comparing various Recurrent Neural Network (RNN) architectures for the task of Vulnerable Road User (VRU) motion trajectory prediction in the intelligent vehicle domain. Making use of temporal motion cues and visual appearance features, we design multi-cue RNN-based architectures with a dedicated optimization process to predict future motion trajectories from historical consecutive frames. Experiments are performed on image sequences recorded on-board a moving vehicle and on public tracking datasets. In particular, the Tsinghua-Daimler Cyclist Benchmark (TDCB) has been augmented with additional annotations (various VRU types) to support the evaluation of object tracking approaches and trajectory prediction methods. This newly introduced dataset is termed TDCB-Track. We demonstrate the effectiveness of the proposed RNN architectures on the public MOT16 dataset and the TDCB-Track dataset, and show that the proposed approaches outperform simpler baseline methods and are competitive with the state of the art.
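The abstract describes RNNs that encode a history of observed positions and then roll forward to predict future ones. The following is a minimal sketch of that general idea only, not the paper's multi-cue architecture: a vanilla RNN with untrained random weights (all names, sizes, and the displacement-based decoding are illustrative assumptions) that encodes past 2-D positions and free-runs to emit future ones.

```python
import numpy as np

def rnn_trajectory_forecast(history, W_xh, W_hh, W_hy, n_future):
    """Roll a vanilla RNN over past 2-D positions, then free-run it
    to emit n_future predicted positions (each step predicts a
    displacement added to the previous point)."""
    h = np.zeros(W_hh.shape[0])
    # Encode the observed history into the hidden state.
    for x in history:
        h = np.tanh(W_xh @ x + W_hh @ h)
    preds, last = [], history[-1]
    # Decode: feed each predicted position back in as the next input.
    for _ in range(n_future):
        offset = W_hy @ h                 # predicted displacement
        last = last + offset
        preds.append(last)
        h = np.tanh(W_xh @ last + W_hh @ h)
    return np.array(preds)

rng = np.random.default_rng(0)
d_h = 16                                  # hidden size (arbitrary)
W_xh = rng.normal(scale=0.1, size=(d_h, 2))
W_hh = rng.normal(scale=0.1, size=(d_h, d_h))
W_hy = rng.normal(scale=0.1, size=(2, d_h))

past = np.array([[0.0, 0.0], [0.5, 0.1], [1.0, 0.2], [1.5, 0.3]])
future = rnn_trajectory_forecast(past, W_xh, W_hh, W_hy, n_future=3)
print(future.shape)   # three predicted (x, y) positions
```

In practice the weights would be trained (e.g. with a displacement-error loss over annotated tracks), and the paper additionally fuses visual appearance cues, which this position-only sketch omits.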


Title: Recurrent Neural Network Architectures for Vulnerable Road User Trajectory Prediction

Contributors: Xiong, Hui (author) / Flohr, Fabian B. (author) / Wang, Sijia (author) / Wang, Baofeng (author) / Wang, Jianqiang (author) / Li, Keqiang (author)

Publication date: 2019-06-01

Size: 1667076 bytes

Type of media: Conference paper

Type of material: Electronic Resource

Language: English





    Clustering-Based Trajectory Prediction of Vehicles Interacting with Vulnerable Road Users

    Sonka, Adrian / Henze, Roman / Thal, Silvia | SAE Technical Papers | 2021


    Pose Based Trajectory Forecast of Vulnerable Road Users

    Kress, Viktor / Zernetsch, Stefan / Doll, Konrad et al. | IEEE | 2019


    COLLISION PREDICTION DETERMINATION DEVICE, AND VULNERABLE ROAD USER PROTECTION SYSTEM

    UMEZAWA MASAKI / OKAMURA KENYU / NAKAMURA HIDETOSHI et al. | European Patent Office | 2020
