ICRA 2011 Paper Abstract


Paper TuA110.4

Kakiuchi, Yohei (The University of Tokyo), Ueda, Ryohei (The University of Tokyo), Okada, Kei (The University of Tokyo), Inaba, Masayuki (The University of Tokyo)

Creating Household Environment Map for Environment Manipulation Using Color Range Sensors on Environment and Robot

Scheduled for presentation during the Regular Sessions "Localization and Mapping I" (TuA110), Tuesday, May 10, 2011, 09:05−09:20, Room 5E

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

This information is tentative and subject to change. Compiled on April 2, 2020

Keywords: Humanoid Robots, Localization, Sensor Fusion


A humanoid robot working in a household environment with people needs to localize itself and continuously update the locations of obstacles and manipulable objects. Achieving such a system requires a robust perception method that efficiently updates the frequently changing environment model. We propose a method for mapping a household environment using multiple stereo and depth cameras located on the humanoid's head and in the environment. The method relies on colored 3D point cloud data computed from the sensors. We achieve robot localization by directly matching the point clouds from the robot sensor data with the environment sensor data. Object detection is performed using Iterative Closest Point (ICP) matching against a database of known point cloud models. To guarantee accurate detection results, objects are detected only within the robot sensor data. Furthermore, we use the environment sensor data to map obstacles as bounding convex hulls. We show experimental results of creating a household environment map with known object labels and estimating the robot position within this map.
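The ICP matching step mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: function names (`icp_align`, `best_rigid_transform`) are illustrative, correspondences are found by brute-force nearest neighbors rather than a k-d tree, and color information is ignored; it only shows the alternation between matching points and re-fitting a rigid transform.

```python
# Minimal ICP sketch (NumPy only). Illustrative only -- the paper's system
# operates on colored point clouds with far more robust machinery.
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # correct an improper rotation
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp_align(src, dst, iters=30):
    """Alternate nearest-neighbor matching with rigid re-fitting."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbors (a k-d tree in practice)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    # composed transform taking the original src onto its aligned pose
    return best_rigid_transform(src, cur)
```

Usage: given a model cloud and a sensed cloud that is (approximately) a rigidly displaced copy of it, `R, t = icp_align(model, sensed)` recovers the displacement, which serves both for object pose estimation against the model database and for registering robot sensor data to environment sensor data.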



Technical Content © IEEE Robotics & Automation Society
