ICRA 2012 Paper Abstract

Paper TuC01.1

Melnyk, Igor (University of Minnesota), Hesch, Joel (University of Minnesota), Roumeliotis, Stergios (University of Minnesota)

Cooperative Vision-Aided Inertial Navigation Using Overlapping Views

Scheduled for presentation during the Regular Session "Autonomy and Vision for UAVs" (TuC01), Tuesday, May 15, 2012, 14:30−14:45, Meeting Room 1 (Mini-sota)

2012 IEEE International Conference on Robotics and Automation, May 14-18, 2012, RiverCentre, Saint Paul, Minnesota, USA

Keywords: Visual Navigation, Aerial Robotics, Sensor Fusion

Abstract

In this paper, we study the problem of Cooperative Localization (CL) for two robots, each equipped with an Inertial Measurement Unit (IMU) and a camera. We present an algorithm that enables the robots to exploit common features, observed over a sliding-window time horizon, in order to improve the localization accuracy of both robots. In contrast to existing CL methods, which require distance and/or bearing robot-to-robot observations, our algorithm infers the relative position and orientation (pose) of the robots using only the visual observations of common features in the scene. Moreover, we analyze the system observability properties to determine how many degrees of freedom (d.o.f.) of the relative transformation can be computed under different measurement scenarios. Lastly, we present simulation results to evaluate the performance of the proposed method.

Technical Content © IEEE Robotics & Automation Society

© 2002-2017 PaperCept, Inc. All rights reserved.