ICRA'09 Paper Abstract


Paper FrA3.4

Fairfield, Nathaniel (Carnegie Mellon University), Wettergreen, David (Carnegie Mellon University)

Evidence Grid-Based Methods for 3D Map Matching

Scheduled for presentation during the Regular Sessions "Mapping - I" (FrA3), Friday, May 15, 2009, 09:30−09:50, Room: 401

2009 IEEE International Conference on Robotics and Automation, May 12 - 17, 2009, Kobe, Japan


Keywords: Mapping, Mining Robotics, Range Sensing


Registering multiple sets of 3D range data is a crucial capability for robots. The standard method for matching two sets of range data is to convert the ranges to a point cloud representation, and then use one of the many variants of Iterative Closest Point (ICP).
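For reference, the point-to-point ICP baseline the abstract mentions can be sketched in a few lines. This is a generic textbook version, not the paper's implementation: the function name, brute-force nearest-neighbour search, and fixed iteration count are all illustrative choices.

```python
import numpy as np

def icp_point_to_point(source, target, iterations=30):
    """Minimal point-to-point ICP (brute-force nearest neighbours).

    Returns (R, t) such that `source @ R.T + t` approximately
    aligns with `target`. Illustrative sketch, not the paper's code.
    """
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        # 1. Correspondences: closest target point for each source point.
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(axis=2)
        matched = target[np.argmin(d2, axis=1)]
        # 2. Best rigid transform for these pairs (Kabsch / SVD).
        mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        # 3. Apply the increment and accumulate the total transform.
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

The brute-force correspondence step is O(n²); practical ICP variants replace it with a k-d tree and add outlier rejection, which is where much of the sensitivity to noise and point density arises.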

We present a set of alternative methods for matching 3D range scans based on a different data representation: evidence grid maps. Evidence grids are robust to noise and variations in point density, can incorporate an indefinite number of ranges, and explicitly encode empty as well as occupied space. While 3D evidence grids can be huge when naively implemented, we use an optimized octree data structure to efficiently store sparse volumetric maps. To register a series of range scans, we build an evidence grid map for each scan, and then register them together using several different methods.
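The core of an evidence grid is a per-voxel log-odds occupancy value updated along each range ray: traversed cells accumulate evidence for emptiness, the endpoint cell for occupancy. A minimal sketch follows; the class name and parameters are illustrative, and a hash map over voxel indices stands in for the paper's optimized octree (both exploit the sparsity of the volumetric map).

```python
import numpy as np

class SparseEvidenceGrid:
    """Sparse 3D evidence grid with log-odds occupancy updates.

    Illustrative sketch: a dict keyed by voxel index stands in for
    the octree used in the paper; l_occ/l_free are assumed values.
    """
    def __init__(self, resolution=0.1, l_occ=0.85, l_free=-0.4):
        self.res = resolution
        self.l_occ = l_occ      # log-odds increment for the hit voxel
        self.l_free = l_free    # log-odds decrement for traversed voxels
        self.cells = {}         # (i, j, k) -> accumulated log-odds

    def key(self, p):
        return tuple(np.floor(np.asarray(p, float) / self.res).astype(int))

    def update_ray(self, origin, endpoint):
        """March along one range ray: traversed voxels become more
        likely empty, the endpoint voxel more likely occupied."""
        origin = np.asarray(origin, float)
        endpoint = np.asarray(endpoint, float)
        n_steps = int(np.linalg.norm(endpoint - origin) / self.res) + 1
        for s in np.linspace(0.0, 1.0, n_steps, endpoint=False):
            k = self.key(origin + s * (endpoint - origin))
            self.cells[k] = self.cells.get(k, 0.0) + self.l_free
        k = self.key(endpoint)
        self.cells[k] = self.cells.get(k, 0.0) + self.l_occ

    def occupancy(self, p):
        l = self.cells.get(self.key(p), 0.0)
        return 1.0 / (1.0 + np.exp(-l))   # log-odds -> probability
```

Unvisited voxels stay at probability 0.5 (unknown), which is exactly the explicit empty/occupied/unknown distinction that point clouds lack.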

The first two methods are based on a 3D extension of the classic 2D Lucas-Kanade template matching method, and differ only in whether we match a single large region, or multiple small regions that are selected heuristically. Our third method involves extracting surfaces from the evidence grids, and then running ICP to register the surfaces.
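To make the Lucas-Kanade extension concrete, the sketch below estimates a pure 3D translation between two volumes by Gauss-Newton descent on the sum of squared differences. This is the translational core of Lucas-Kanade only; the function name is illustrative, and the paper's methods additionally handle region selection and full registration.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def lucas_kanade_3d(template, volume, iterations=50):
    """Estimate the 3D translation p with template(x) ~ volume(x + p)
    by Gauss-Newton on the sum of squared differences.

    Illustrative translational Lucas-Kanade, extended to volumes.
    """
    p = np.zeros(3)
    for _ in range(iterations):
        # Warp: nd_shift(v, -p)[x] == v[x + p] (trilinear interpolation).
        warped = nd_shift(volume, -p, order=1, mode='nearest')
        error = (template - warped).ravel()
        # Jacobian of the warp w.r.t. p: spatial gradients of the volume.
        J = np.stack([g.ravel() for g in np.gradient(warped)], axis=1)
        dp = np.linalg.solve(J.T @ J, J.T @ error)   # normal equations
        p += dp
        if np.linalg.norm(dp) < 1e-6:                # converged
            break
    return p
```

Matching many small, heuristically selected regions instead of one large one amounts to running this estimator on sub-volumes, which keeps each 3x3 normal system well conditioned where the local gradient structure is informative.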

We demonstrate our methods and compare them to ICP using two datasets collected by two different subterranean robots.


