ICRA'09 Paper Abstract


Paper FrC4.2

Ngo, Thanh Trung (ISIR, Osaka University), Nagahara, Hajime (Graduate School of Engineering Science, Osaka University), Sagawa, Ryusuke (Osaka University), Mukaigawa, Yasuhiro (Osaka University), Yachida, Masahiko (Graduate School of Engineering Science, Osaka University), Yagi, Yasushi (Osaka University)

An Adaptive-Scale Robust Estimator for Motion Estimation

Scheduled for presentation during the Regular Sessions "Motion and Path Planning - III" (FrC4), Friday, May 15, 2009, 13:50−14:10, Room: 402

2009 IEEE International Conference on Robotics and Automation, May 12 - 17, 2009, Kobe, Japan

This information is tentative and subject to change. Compiled on January 24, 2022

Keywords: Motion and Path Planning, Omnidirectional Vision


Although RANSAC is the most widely used robust estimator in computer vision, it has certain limitations that make it ineffective in some situations, such as the motion estimation problem, in which the uncertainty in the image features changes with the capturing conditions. The greatest problem is that the threshold used by RANSAC to detect inliers cannot be changed adaptively; instead it is fixed by the user. An adaptive-scale algorithm must therefore be applied in such cases. In this paper, we propose a new adaptive-scale robust estimator that adaptively finds the best solution with the best scale to fit the inliers, without the need for predefined information. Our new adaptive-scale estimator matches the residual probability density of an estimate against the standard Gaussian probability density function to find the best inlier scale. Our algorithm is evaluated in several motion estimation experiments under varying conditions, and the results are compared with several of the latest adaptive-scale robust estimators.
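The general idea described above can be sketched in a few lines of code. The following is a minimal, illustrative implementation, not the paper's actual algorithm: it runs a RANSAC-style hypothesize-and-verify loop on a toy 2D line-fitting problem and, for each hypothesis, selects the inlier scale whose normalized residual density (on a coarse histogram) best matches a standard half-normal density. The candidate-scale grid, histogram binning, 2.5-sigma inlier band, and mismatch criterion are all assumptions chosen for clarity.

```python
import math
import random

def fit_line(p1, p2):
    """Slope/intercept of the line through two points (distinct x assumed)."""
    (x1, y1), (x2, y2) = p1, p2
    a = (y2 - y1) / (x2 - x1)
    return a, y1 - a * x1

def density_mismatch(abs_res, scale, bins=5, cutoff=2.5):
    """Squared error between the histogram density of |r|/scale inside the
    cutoff band and a standard half-normal density (illustrative criterion,
    not the paper's exact matching procedure)."""
    z = [r / scale for r in abs_res if r / scale <= cutoff]
    if len(z) < max(3, len(abs_res) // 5):  # require a minimal inlier support
        return float("inf")
    width = cutoff / bins
    counts = [0] * bins
    for v in z:
        counts[min(int(v / width), bins - 1)] += 1
    err = 0.0
    for i in range(bins):
        mid = (i + 0.5) * width
        emp = counts[i] / (len(z) * width)          # empirical density in bin
        ref = math.sqrt(2.0 / math.pi) * math.exp(-mid * mid / 2.0)  # half-normal
        err += (emp - ref) ** 2
    return err

def adaptive_scale_fit(points, n_iters=200, rng=None):
    """RANSAC-style loop: for each two-point hypothesis, pick the inlier scale
    whose residual density best matches a Gaussian; keep the best hypothesis.
    No user-supplied inlier threshold is needed."""
    rng = rng or random.Random(0)
    scales = [0.05 * k for k in range(1, 41)]  # candidate scales 0.05 .. 2.0
    best = None
    for _ in range(n_iters):
        p1, p2 = rng.sample(points, 2)
        if p1[0] == p2[0]:
            continue
        a, b = fit_line(p1, p2)
        abs_res = [abs(y - (a * x + b)) for x, y in points]
        for s in scales:
            err = density_mismatch(abs_res, s)
            if best is None or err < best[0]:
                best = (err, a, b, s)
    return best[1], best[2], best[3]  # slope, intercept, estimated inlier scale
```

Because the scale is chosen by density matching rather than fixed in advance, the same code tolerates datasets whose inlier noise level differs from run to run, which is the situation the abstract describes for motion estimation under changing capture conditions.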



Technical Content © IEEE Robotics & Automation Society
