Proceedings of: 2010 IEEE Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), September 5-7, 2010, Salt Lake City, USA ; This paper proposes multi-sensor fusion based on an effective calibration method for a mobile-robot perception system intended for subsequent object recognition. The perception system consists of a camera and a three-dimensional laser range finder, the latter built from a two-dimensional laser scanner mounted on a pan-tilt unit. The calibration fuses the two most important sensors for three-dimensional environment perception, the laser scanner and the camera, combining color and depth information. A calibration process based on a specific calibration pattern determines the extrinsic parameters and computes the transformation between the laser range finder and the camera. This transformation assigns an exact position and color information to each point of the surroundings, so that the advantages of both sensors can be combined. The result is a colored, unorganized point cloud that can be visualized with OpenGL and used for surface reconstruction. In this way, typical robotic tasks such as object recognition, grasp calculation, or object handling can be realized. Experimental results are presented in this paper. ; European Community's Seventh Framework Programme
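The abstract describes using the extrinsic calibration (a rotation and translation between the laser and camera frames) to assign color to each laser point. A minimal sketch of that colorization step is shown below; the names `colorize_points`, `K`, `R`, and `t` are illustrative assumptions, not the paper's actual implementation, and a pinhole camera model is assumed.

```python
import numpy as np

def colorize_points(points_laser, image, K, R, t):
    """Project laser points into the camera image and attach RGB color.

    points_laser: (N, 3) array of points in the laser frame.
    image:        (H, W, 3) RGB image from the camera.
    K:            (3, 3) camera intrinsic matrix.
    R, t:         extrinsic rotation (3, 3) and translation (3,)
                  mapping laser coordinates into the camera frame.
    Returns an (M, 6) array [x, y, z, r, g, b] for points that
    project inside the image, i.e. a colored unorganized point cloud.
    """
    # Transform points from the laser frame into the camera frame.
    pts_cam = points_laser @ R.T + t
    # Keep only points in front of the camera.
    front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[front]
    # Perspective projection onto the image plane.
    uvw = pts_cam @ K.T
    u = uvw[:, 0] / uvw[:, 2]
    v = uvw[:, 1] / uvw[:, 2]
    # Keep only points whose projection lies inside the image.
    h, w = image.shape[:2]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = image[v[inside].astype(int), u[inside].astype(int)]
    kept = points_laser[front][inside]
    return np.hstack([kept, colors.astype(float)])
```

The returned array pairs each 3D position with its sampled color, matching the abstract's description of a colored unorganized point cloud suitable for visualization or surface reconstruction.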
Multi sensor fusion of camera and 3D laser range finder for object recognition
2010-10-01
Conference paper
Electronic resource
English
DDC: 629