Background modeling is an important component of many vision systems. Existing work in the area has mostly addressed scenes that consist of static or quasi-static structures. When the scene exhibits persistent dynamic behavior over time, this assumption is violated and detection performance deteriorates. In this paper, we propose a new method for the modeling and subtraction of such scenes. To model the dynamic characteristics, optical flow is computed and utilized as a feature in a higher-dimensional space. Inherent ambiguities in the computation of these features are addressed by using a data-dependent bandwidth for kernel density estimation. Extensive experiments demonstrate the utility and performance of the proposed approach.
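The abstract's core idea can be illustrated with a minimal sketch: per-pixel feature vectors (e.g., optical flow plus intensity) are collected as background samples, and a new observation is classified by a kernel density estimate whose bandwidth is data-dependent. This is not the paper's exact estimator; the per-dimension standard-deviation bandwidth and the `threshold` value are illustrative assumptions standing in for the paper's uncertainty-driven bandwidth selection.

```python
import numpy as np

def adaptive_kde_score(samples, x, min_bw=1e-2):
    """Gaussian kernel density estimate at feature vector `x` (shape (d,))
    given background samples (shape (n, d)) for one pixel.
    The bandwidth is data-dependent: the per-dimension sample standard
    deviation with a floor `min_bw` (a heuristic stand-in for the
    paper's uncertainty-based bandwidth)."""
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    bw = np.maximum(samples.std(axis=0), min_bw)          # adaptive bandwidth
    diff = (np.asarray(x, dtype=float) - samples) / bw    # (n, d) scaled residuals
    norm = np.prod(bw) * (2.0 * np.pi) ** (d / 2.0)       # Gaussian normalizer
    kernels = np.exp(-0.5 * np.sum(diff ** 2, axis=1)) / norm
    return kernels.mean()

def is_background(samples, x, threshold=1e-3):
    """Classify `x` as background if its estimated density is high enough.
    `threshold` is an assumed illustrative value, not from the paper."""
    return adaptive_kde_score(samples, x) >= threshold
```

A feature vector close to the cloud of background samples scores a high density and is labeled background; one far away (e.g., a moving object with inconsistent flow) scores near zero and is labeled foreground.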
Motion-based background subtraction using adaptive kernel density estimation
2004-01-01
706064 bytes
Conference paper
Electronic Resource
English
British Library Conference Proceedings | 2004
Object Tracking Using Background Subtraction and Motion Estimation in MPEG Videos
British Library Conference Proceedings | 2006
Object Tracking Using Background Subtraction and Motion Estimation in MPEG Videos
Springer Verlag | 2006
Adaptive maintenance scheme for codebook-based dynamic background subtraction
British Library Online Contents | 2016