ICRA 2011 Paper Abstract


Paper TuP1-InteracInterac.33

Mörwald, Thomas (Vienna University of Technology), Kopicki, Marek Sewer (University of Birmingham), Stolkin, Rustam (University of Birmingham), Wyatt, Jeremy (University of Birmingham), Zillich, Michael (Vienna University of Technology), Vincze, Markus (Vienna University of Technology), Zurek, Sebastian (University of Birmingham, UK)

Predicting the Unobservable - Visual 3D Tracking with a Probabilistic Motion Model

Scheduled for presentation during the Poster Sessions "Interactive Session II: Systems, Control and Automation" (TuP1-InteracInterac), Tuesday, May 10, 2011, 13:40−14:55, Hall

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

This information is tentative and subject to change. Compiled on April 2, 2020

Keywords: Visual Tracking, Computer Vision for Robotics and Automation


Visual tracking of an object can provide a powerful source of feedback during complex robotic manipulation, especially when there is uncertainty about which new object pose will result from a planned manipulative action. At the same time, robotic manipulation is a challenging environment for visual tracking, with occlusions of the object by other objects or by the robot itself, and sudden changes in object pose that may be accompanied by motion blur. Recursive filtering techniques use motion models for predictor-corrector tracking, but the simple models typically used often fail to adequately predict the complex motions of manipulated objects. We show how statistical machine learning techniques can be used to train sophisticated motion predictors, which incorporate additional information by being conditioned on the planned manipulative action being executed. We then show how these learned predictors can be used to propagate the particles of a particle filter from one predictor-corrector step to the next, enabling a visual tracking algorithm to maintain plausible hypotheses about the location of an object, even during severe occlusion and other difficult conditions. We demonstrate the approach in the context of robotic push manipulation, where a 5-axis robot arm equipped with a rigid finger applies a series of pushes to an object, while the object is tracked by a vision algorithm using a single camera.
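The predictor-corrector loop the abstract describes can be sketched as a particle filter whose prediction step is conditioned on the planned push action. This is a minimal illustration, not the paper's method: the learned motion model is replaced by a hypothetical stand-in (`learned_predictor`, a linear push displacement plus noise), and the pose is simplified to 2-D position; the paper learns a far richer action-conditioned predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

def learned_predictor(particles, action, noise=0.05):
    # Stand-in for the learned, action-conditioned motion model:
    # here simply the planned push displacement plus Gaussian noise.
    # (The paper trains this mapping with statistical ML; this
    # placeholder only illustrates the interface.)
    return particles + action + rng.normal(0.0, noise, particles.shape)

def likelihood(particles, observation, sigma=0.1):
    # Toy observation model: Gaussian weight by distance to the
    # visual measurement of the object position.
    d2 = np.sum((particles - observation) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) + 1e-12

def particle_filter_step(particles, action, observation):
    # Predict: propagate every particle through the motion model,
    # conditioned on the manipulative action being executed.
    predicted = learned_predictor(particles, action)
    # Correct: weight by the observation likelihood and resample.
    w = likelihood(predicted, observation)
    w /= w.sum()
    idx = rng.choice(len(predicted), size=len(predicted), p=w)
    return predicted[idx]

# Usage: track a 2-D object position over a sequence of pushes.
particles = rng.normal([0.0, 0.0], 0.2, (500, 2))
true_pose = np.array([0.0, 0.0])
for _ in range(10):
    action = np.array([0.1, 0.0])              # planned push displacement
    true_pose = true_pose + action
    observation = true_pose + rng.normal(0.0, 0.05, 2)
    particles = particle_filter_step(particles, action, observation)

estimate = particles.mean(axis=0)              # posterior mean estimate
```

Because the prediction step uses the planned action rather than a generic constant-velocity model, the particle cloud keeps moving plausibly even when corrections are unreliable, e.g. during occlusion by the robot finger.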



Technical Content © IEEE Robotics & Automation Society
