This paper presents an implementation of an aircraft pose and motion estimator that uses visual systems as the principal sensor for controlling an Unmanned Aerial Vehicle (UAV), or as a redundant system for an Inertial Measurement Unit (IMU) and gyroscopes. First, we explore applications of the unified theory for central catadioptric cameras to attitude and heading estimation, explaining how the skyline is projected onto the catadioptric image and how it is segmented and used to calculate the UAV's attitude. Then we use appearance images to obtain a visual compass, from which we calculate the relative rotation and heading of the aerial vehicle. Additionally, we show the use of a stereo system to calculate the aircraft's height and to measure the UAV's motion. Finally, we present a visual tracking system based on fuzzy controllers acting on both the UAV and a camera pan-and-tilt platform. Each component is tested on the COLIBRI UAV platform to validate the different approaches, including comparison of the estimated data with the inertial values measured onboard the helicopter platform and validation of the tracking schemes in real flights.
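The stereo height estimation mentioned in the abstract rests on the standard rectified-stereo triangulation relation, depth = focal length × baseline / disparity. The following is a minimal illustrative sketch of that relation, not the authors' implementation; the function name and the example calibration values (focal length, baseline, disparity) are hypothetical.

```python
def height_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a rectified stereo pair.

    focal_px     -- focal length in pixels (assumed calibration value)
    baseline_m   -- distance between the two cameras, in metres
    disparity_px -- horizontal pixel shift of a ground feature
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# With a downward-looking stereo pair, the depth of ground features
# approximates the aircraft's height above the terrain, e.g.:
# f = 700 px, B = 0.3 m, d = 10.5 px  ->  Z = 20.0 m
print(height_from_disparity(700.0, 0.3, 10.5))
```

Tracking disparity of the same ground features across frames also yields the relative motion the abstract refers to, since their 3D positions can be re-triangulated at each time step.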
Unmanned aerial vehicles UAVs attitude, height, motion estimation and control using visual systems
Autonomous Robots; Vol. 29, No. 1; pp. 17-34
2010
18 pages, 42 references
Article (Journal)
English
British Library Online Contents | 2010