ICRA'09 Paper Abstract


Paper FrB10.2

Cherian, Anoop (University of Minnesota), Morellas, Vassilios (University of Minnesota), Papanikolopoulos, Nikos (University of Minnesota)

Accurate 3D Ground Plane Estimation from a Single Image

Scheduled for presentation during the Regular Sessions "Visual Navigation - I" (FrB10), Friday, May 15, 2009, 10:50−11:10, Room: 502

2009 IEEE International Conference on Robotics and Automation, May 12–17, 2009, Kobe, Japan


Keywords: Computer Vision for Robotics and Automation, Visual Navigation, Localization

Abstract

Localization of a robot with respect to the features in its environment is a first step towards solving the SLAM problem. In this work, we propose algorithms to accurately estimate the location of a robot from a single image taken by its onboard camera. Our approach differs from previous efforts in this domain in that it first reconstructs the 3D environment accurately from a single image, then defines a coordinate system over the environment, and finally performs the desired localization with respect to this coordinate system using the environment's features. Accurate estimation of the ground plane from the given image precedes segmentation of the image into ground and vertical regions. A Markov Random Field (MRF) based 3D reconstruction is performed to build an approximate depth map of the given image. This map is robust to texture variations caused by shadows, terrain differences, etc. A texture segmentation algorithm is also applied to determine the ground plane accurately. Once the ground plane is estimated, we use the camera's intrinsic and extrinsic calibration information to compute accurate 3D information about the features in the scene, thereby achieving localization.
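To illustrate the final step of the abstract, the sketch below back-projects an image pixel onto an estimated ground plane using the camera's intrinsic and extrinsic calibration, recovering a 3D world point. This is a minimal NumPy sketch of the standard ray-plane intersection, not the authors' implementation; the function name, the world-to-camera convention x_cam = R·X + t, and the plane parameterization n·X = d are our assumptions.

```python
import numpy as np

def backproject_to_ground(u, v, K, R, t, n, d):
    """Intersect the camera ray through pixel (u, v) with the ground
    plane {X : n . X = d}, all expressed in world coordinates.

    K    : 3x3 intrinsic matrix
    R, t : world-to-camera rotation and translation (x_cam = R @ X + t)
    n, d : ground-plane unit normal and offset (assumed parameterization)
    Returns the 3D world point where the pixel ray meets the plane.
    Assumes the ray is not parallel to the plane (n @ ray != 0).
    """
    # Camera center in world coordinates: C = -R^T t
    C = -R.T @ t
    # Ray direction in world coordinates for pixel (u, v)
    ray = R.T @ np.linalg.solve(K, np.array([u, v, 1.0]))
    # Solve n . (C + s * ray) = d for the ray scale s
    s = (d - n @ C) / (n @ ray)
    return C + s * ray

# Example: camera at the world origin with axes aligned to the image
# (x right, y down, z forward); ground plane 1 m below the camera.
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
n, d = np.array([0.0, 1.0, 0.0]), 1.0   # plane y = 1 (y points down)
print(backproject_to_ground(320.0, 400.0, K, R, t, n, d))
# -> approximately [0.0, 1.0, 3.28]: a ground point ~3.3 m ahead
```

Given a ground-plane estimate, the same intersection applied to each ground pixel yields the metric scene coordinates used for localization.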
