ICRA 2011 Paper Abstract


Paper TuP214.1

Hwangbo, Myung (Carnegie Mellon University), Kanade, Takeo (Carnegie Mellon University)

Visual-Inertial UAV Attitude Estimation Using Urban Scene Regularities

Scheduled for presentation during the Regular Sessions "Visual Navigation IV" (TuP214), Tuesday, May 10, 2011, 15:25−15:40, Room 5J

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China


Keywords: Visual Navigation, Aerial Robotics, Sensor Fusion

Abstract

We present a drift-free attitude estimation method that uses image line segments to correct accumulated errors in integrated gyro rates when an unmanned aerial vehicle (UAV) operates in urban areas. Since man-made environments generally exhibit strong structural regularity, a set of line segments that are either parallel or orthogonal to the gravitational direction provides visual measurements of the absolute attitude from a calibrated camera.
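To make the geometric cue concrete, the following is a minimal sketch, not taken from the paper, of how a vertical vanishing point seen by a calibrated camera fixes the absolute roll and pitch. The intrinsic matrix K, the camera-to-body rotation R_bc, the NED-style body frame (x forward, y right, z down) with ZYX Euler angles, and the function name are all assumptions introduced for illustration; yaw would additionally require the horizontal vanishing points.

import numpy as np

def roll_pitch_from_vertical_vp(vp_px, K, R_bc=np.eye(3)):
    """vp_px: vertical vanishing point in homogeneous pixel coordinates (3,).
    Returns (roll, pitch) in radians; yaw is unobservable from this cue alone."""
    # Back-project the vanishing point to a unit direction in the camera frame.
    d_c = np.linalg.solve(K, np.asarray(vp_px, dtype=float))
    d_c /= np.linalg.norm(d_c)
    # Rotate into the body frame and resolve the +/- sign ambiguity by
    # assuming the vehicle is closer to upright than upside down.
    g_b = R_bc @ d_c
    if g_b[2] < 0:
        g_b = -g_b
    # With R_nb = Rz(yaw) Ry(pitch) Rx(roll), gravity expressed in the body
    # frame is g_b = [-sin(pitch), cos(pitch) sin(roll), cos(pitch) cos(roll)].
    pitch = -np.arcsin(np.clip(g_b[0], -1.0, 1.0))
    roll = np.arctan2(g_b[1], g_b[2])
    return roll, pitch

Horizontal vanishing points constrain the heading in an analogous way, which is why the abstract distinguishes the two classes of line segments.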

Line segments are robustly classified under the assumption that a single vertical vanishing point or multiple horizontal vanishing points exist. For fusion with the gyro angles, we introduce a new Kalman update step that uses line segments directly rather than vanishing points. Simulations and experiments on urban images taken at distant views demonstrate that our method can serve as a robust visual attitude sensor for aerial robot navigation.
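The Kalman update on line segments described above is specific to the paper and is not reproduced here. As a generic stand-in, the sketch below only illustrates the overall fusion structure the abstract alludes to: integrating gyro rates through the Euler-angle kinematics and pulling the drifting roll and pitch toward a drift-free visual measurement with a complementary-filter-style gain. All names and the gain value are hypothetical.

import numpy as np

def propagate(att, gyro, dt):
    """Integrate body rates (p, q, r) into roll/pitch/yaw via ZYX Euler kinematics."""
    roll, pitch, yaw = att
    p, q, r = gyro
    roll  += dt * (p + np.tan(pitch) * (q * np.sin(roll) + r * np.cos(roll)))
    pitch += dt * (q * np.cos(roll) - r * np.sin(roll))
    yaw   += dt * (q * np.sin(roll) + r * np.cos(roll)) / np.cos(pitch)
    return np.array([roll, pitch, yaw])

def correct(att, vision_roll_pitch, gain=0.05):
    """Pull the drifting roll/pitch toward the drift-free visual measurement."""
    att = att.copy()
    att[0] += gain * (vision_roll_pitch[0] - att[0])
    att[1] += gain * (vision_roll_pitch[1] - att[1])
    return att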

