ICRA 2011 Paper Abstract


Paper WeP214.2

Cho, Hyunggi (Carnegie Mellon University), Rybski, Paul E. (Carnegie Mellon University), Zhang, Wende (General Motors)

Vision-Based 3D Bicycle Tracking Using Deformable Part Model and Interacting Multiple Model Filter

Scheduled for presentation during the Regular Sessions "Visual Tracking" (WeP214), Wednesday, May 11, 2011, 15:40−15:55, Room 5J

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

This information is tentative and subject to change. Compiled on July 5, 2020

Keywords: Intelligent Transportation Systems, Human detection & tracking, Visual Tracking


This paper presents a monocular-vision-based 3D bicycle tracking framework for intelligent vehicles, built on a detection method exploiting a deformable part model and a tracking method using an Interacting Multiple Model (IMM) filter. From a driving-safety perspective, bicycle tracking is important because bicycles share the road with vehicles and can move at comparable speeds in urban environments. To this end, we present a tracking-by-detection method which involves the following three components. First, a mixture model of multiple viewpoints is defined and trained via a latent Support Vector Machine (LSVM) to detect bicycles. Second, two motion models based on a bicycle's kinematics are fused using an IMM filter. For each motion model, an extended Kalman filter (EKF) is used to estimate the pose of a bicycle in the world. Finally, multiple bicycles are tracked by incorporating a Rao-Blackwellized particle filter, which addresses the data association problem. We demonstrate the effectiveness of this approach through a series of experiments on a new bicycle dataset captured from a vehicle-mounted camera.
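The IMM step described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it uses two hypothetical linear 1D motion models (a low-noise and a high-noise constant-velocity model) with ordinary Kalman filters, whereas the paper fuses two bicycle kinematic models, each tracked with an EKF. The model set, noise values, and transition matrix below are all assumptions for illustration.

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    """One Kalman predict + update; returns state, covariance, and
    the Gaussian measurement likelihood used for IMM model weighting."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    lik = np.exp(-0.5 * y @ np.linalg.inv(S) @ y) \
          / np.sqrt(np.linalg.det(2 * np.pi * S))
    return x_new, P_new, float(lik)

def imm_step(xs, Ps, mu, z, models, Pi, H, R):
    """One IMM cycle: mix, filter per model, update model probabilities,
    and combine into a single output estimate."""
    n = len(models)
    # 1) Interaction/mixing: blend model-conditioned estimates.
    c = Pi.T @ mu                            # predicted model probabilities
    w = (Pi * mu[:, None]) / c[None, :]      # mixing weights w[i, j]
    xs_mix = [sum(w[i, j] * xs[i] for i in range(n)) for j in range(n)]
    Ps_mix = []
    for j in range(n):
        P = np.zeros_like(Ps[0])
        for i in range(n):
            d = xs[i] - xs_mix[j]
            P += w[i, j] * (Ps[i] + np.outer(d, d))
        Ps_mix.append(P)
    # 2) Model-conditioned filtering (the paper uses an EKF here).
    new_xs, new_Ps, liks = [], [], []
    for j, (F, Q) in enumerate(models):
        x_j, P_j, L_j = kf_step(xs_mix[j], Ps_mix[j], z, F, Q, H, R)
        new_xs.append(x_j); new_Ps.append(P_j); liks.append(L_j)
    # 3) Model probability update from measurement likelihoods.
    mu_new = np.array(liks) * c
    mu_new /= mu_new.sum()
    # 4) Combined (moment-matched) output estimate.
    x_out = sum(mu_new[j] * new_xs[j] for j in range(n))
    return new_xs, new_Ps, mu_new, x_out

# Illustrative setup: state [position, velocity], position measurement.
F = np.array([[1.0, 1.0], [0.0, 1.0]])
models = [(F, 0.01 * np.eye(2)),   # "steady" model: low process noise
          (F, 1.0 * np.eye(2))]    # "maneuver" model: high process noise
Pi = np.array([[0.95, 0.05], [0.05, 0.95]])  # model transition matrix
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
xs = [np.zeros(2), np.zeros(2)]
Ps = [np.eye(2), np.eye(2)]
mu = np.array([0.5, 0.5])

for z in [np.array([1.0]), np.array([2.1]), np.array([3.0])]:
    xs, Ps, mu, x_out = imm_step(xs, Ps, mu, z, models, Pi, H, R)
```

After each cycle, `mu` holds the posterior probability of each motion model, so the tracker smoothly shifts weight toward whichever kinematic hypothesis best explains the bicycle's recent motion.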



Technical Content © IEEE Robotics & Automation Society
