ICRA 2011 Paper Abstract


Paper WeP214.1

Choi, Changhyun (Georgia Institute of Technology), Christensen, Henrik Iskov (Georgia Institute of Technology)

Robust 3D Visual Tracking Using Particle Filtering on the SE(3) Group

Scheduled for presentation during the Regular Sessions "Visual Tracking" (WeP214), Wednesday, May 11, 2011, 15:25−15:40, Room 5J

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

This information is tentative and subject to change. Compiled on July 14, 2020

Keywords: Visual Tracking, Computer Vision for Robotics and Automation


In this paper, we present a 3D model-based object tracking approach that uses edge and keypoint features in a particle filtering framework. Edge points provide only 1D information for pose estimation, so it is natural to consider multiple hypotheses. Particle filtering based approaches have recently been proposed to integrate multiple hypotheses and have shown good performance, but most of this work assumes that an initial pose is given. To remove this assumption, we employ keypoint features to initialize the filter. Given 2D-3D keypoint correspondences, we select minimal sets of correspondences to compute a set of possible pose hypotheses. Based on the inlier ratio of the correspondences, poses are drawn from this set to initialize the particles. For better performance, we employ autoregressive state dynamics and apply them in a coordinate-invariant particle filter on the SE(3) group. Based on the number of effective particles computed during tracking, the proposed system re-initializes the particles when the tracked object goes out of sight or is occluded. The robustness and accuracy of our approach are demonstrated via comparative experiments.
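The re-initialization criterion mentioned in the abstract rests on the standard effective sample size of a particle filter. The following minimal sketch illustrates that idea only; the function names and the 0.5 threshold are illustrative assumptions, not details taken from the paper.

```python
# Effective sample size (ESS) check for particle-filter re-initialization.
# Minimal sketch; threshold and names are illustrative, not from the paper.

def effective_particles(weights):
    """N_eff = 1 / sum(w_i^2) for normalized particle weights w_i."""
    total = sum(weights)
    if total == 0:
        return 0.0
    norm = [w / total for w in weights]
    return 1.0 / sum(w * w for w in norm)

def needs_reinit(weights, num_particles, ratio=0.5):
    """Trigger re-initialization when N_eff falls below a fraction of the
    particle count, e.g. when the tracked object is lost or occluded and
    the weights degenerate onto a few particles."""
    return effective_particles(weights) < ratio * num_particles
```

With uniform weights N_eff equals the particle count (no degeneracy); when one particle carries all the weight, N_eff collapses to 1 and the check fires.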



Technical Content © IEEE Robotics & Automation Society
