A major drawback of many robotics projects is their dependence on a specific environment and the otherwise uncertain behavior of the hardware. Even a simple navigation task such as driving in a straight line can accumulate strong lateral drift over time in an unknown environment. In this paper we propose a fast and simple solution to the lateral drift problem for vision-guided robots based on real-time scene analysis. Without any environment-specific calibration of the robot's drive system, we balance the differential drive speeds on the fly. To this end, a feature detector is applied to consecutive images; the detected feature points determine the focus of expansion (FOE), which is used to locate and correct the robot's lateral drift. Results for an unmodified real-world indoor environment demonstrate that our method corrects most lateral drift, based solely on real-time vision processing.
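The core idea in the abstract, estimating the FOE from matched feature points and using its horizontal offset to rebalance the differential drive, can be sketched as follows. This is not the authors' implementation: the least-squares FOE solve, the function names, and the proportional `gain` are illustrative assumptions, and the sign convention depends on the camera mounting.

```python
import numpy as np

def estimate_foe(prev_pts, curr_pts):
    """Estimate the focus of expansion (FOE) as the least-squares
    intersection of the flow lines through matched feature points.
    prev_pts, curr_pts: (N, 2) arrays of pixel coordinates."""
    flow = curr_pts - prev_pts
    mag = np.linalg.norm(flow, axis=1)
    keep = mag > 1e-6                        # discard stationary features
    p, d = prev_pts[keep], flow[keep] / mag[keep, None]
    # Each flow line satisfies n . x = n . p, where n is the unit
    # normal of the flow direction; stack all lines and solve for x.
    n = np.stack([-d[:, 1], d[:, 0]], axis=1)
    b = np.sum(n * p, axis=1)
    foe, *_ = np.linalg.lstsq(n, b, rcond=None)
    return foe

def balance_speeds(foe_x, image_cx, base_speed, gain=0.002):
    """Hypothetical proportional correction: a FOE right of the image
    centre slows the right wheel so the heading swings back."""
    err = foe_x - image_cx
    return base_speed - gain * err, base_speed + gain * err  # (left, right)
```

With a consistent expansion field (every flow vector pointing away from one point), the least-squares system recovers that point exactly; with noisy real-world matches it returns the best-fit intersection, which is what makes the per-frame correction cheap enough for real time.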
Realtime vision-based lateral drift correction
2009-03-02
Hübner, T; Pajarola, R (2009). Realtime vision-based lateral drift correction. In: EUROGRAPHICS, Munich, Germany, 28 February 2009 - 2 March 2009, 13-16.
Conference paper
Electronic resource
English
DDC: 629
Related items:
Implementing Image Processing Algorithms on FPGA-based Realtime Vision System (British Library Conference Proceedings, 2003)
Realtime head and hands tracking by monocular vision (IEEE, 2005)
Realtime Head and Hands Tracking by Monocular Vision (British Library Conference Proceedings, 2005)
The realtime Vision System for small-sized target tracking (British Library Online Contents, 2007)