ICRA 2012 Paper Abstract


Paper WeB03.6

Hebert, Paul (California Institute of Technology), Hudson, Nicolas (Jet Propulsion Laboratory), Ma, Jeremy (Jet Propulsion Laboratory), Howard, Tom (Jet Propulsion Laboratory), Fuchs, Thomas (California Institute of Technology), Bajracharya, Max (Jet Propulsion Laboratory), Burdick, Joel (California Institute of Technology)

Combined Shape, Appearance and Silhouette for Simultaneous Manipulator and Object Tracking

Scheduled for presentation during the Regular Session "Grasping: Learning and Estimation" (WeB03), Wednesday, May 16, 2012, 11:45−12:00, Meeting Room 3 (Mak'to)

2012 IEEE International Conference on Robotics and Automation, May 14-18, 2012, RiverCentre, Saint Paul, Minnesota, USA

This information is tentative and subject to change. Compiled on June 18, 2018

Keywords: Grasping, Visual Tracking, Dexterous Manipulation


This paper develops an estimation framework for sensor-guided manipulation of a rigid object by a robot arm. Using an unscented Kalman filter (UKF), the method combines dense range information (from stereo cameras and 3D ranging sensors) with visual appearance features and silhouettes of the object and manipulator to track both an object-fixed frame and a manipulator tool or palm frame. Tactile data is also incorporated when available. By fusing these complementary sensors and feature types, the method leverages the advantages of each to achieve more accurate and robust tracking of the object and reference frames. The method is demonstrated on the DARPA ARM-S system, which includes a Barrett WAM manipulator.
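The core estimator described in the abstract is an unscented Kalman filter, which propagates deterministically chosen sigma points through nonlinear process and measurement models instead of linearizing them. The sketch below is a minimal, generic UKF step in numpy, not the paper's implementation: the state, the models `f` and `h`, and the noise covariances `Q` and `R` are placeholders that, in the paper's setting, would encode the object/palm frame poses and the range, appearance, silhouette, and tactile measurement models. For simplicity it reuses the propagated sigma points for the measurement update rather than redrawing them after the predict step.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Scaled sigma points and weights for the unscented transform."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)        # matrix square root of scaled covariance
    pts = np.vstack([x, x + S.T, x - S.T])       # 2n+1 points: center, then +/- each column
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, Wm, Wc

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF predict + update with process model f and measurement model h."""
    pts, Wm, Wc = sigma_points(x, P)
    # Predict: push sigma points through the process model.
    Xp = np.array([f(p) for p in pts])
    x_pred = Wm @ Xp
    P_pred = Q + sum(Wc[i] * np.outer(Xp[i] - x_pred, Xp[i] - x_pred)
                     for i in range(len(Wm)))
    # Update: push the (propagated) sigma points through the measurement model.
    Zp = np.array([h(p) for p in Xp])
    z_pred = Wm @ Zp
    Pzz = R + sum(Wc[i] * np.outer(Zp[i] - z_pred, Zp[i] - z_pred)
                  for i in range(len(Wm)))
    Pxz = sum(Wc[i] * np.outer(Xp[i] - x_pred, Zp[i] - z_pred)
              for i in range(len(Wm)))
    K = Pxz @ np.linalg.inv(Pzz)                 # Kalman gain
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new

# Illustrative use with identity models: the estimate moves toward the measurement.
x = np.zeros(2)
P = np.eye(2)
z = np.array([1.0, 0.5])
x_new, P_new = ukf_step(x, P, z, lambda s: s, lambda s: s,
                        Q=0.01 * np.eye(2), R=0.1 * np.eye(2))
```

In the paper's multi-sensor setting, each modality would contribute its own `h` and `R`, with updates applied sequentially or via a stacked measurement vector.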



Technical Content © IEEE Robotics & Automation Society
