ICRA 2011 Paper Abstract


Paper TuA103.1

Miksik, Ondrej (Brno University of Technology), Petyovsky, Petr (Brno University of Technology), Zalud, Ludek (Brno University of Technology), Jura, Pavel (Brno University of Technology)

Robust Detection of Shady and Highlighted Roads for Monocular Camera Based Navigation of UGV

Scheduled for presentation during the Regular Sessions "Autonomous Navigation I" (TuA103), Tuesday, May 10, 2011, 08:20−08:35, Room 3D

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

This information is tentative and subject to change. Compiled on April 2, 2020

Keywords: Autonomous Navigation, Visual Navigation, Computer Vision for Robotics and Automation


This paper addresses the problem of UGV navigation in various environments and lighting conditions. Previous approaches either combine several different sensors or work well only in scenarios with noticeable road markings or borders. Our robot is used for chemical, nuclear and biological contamination measurement; thus, to avoid complications with decontamination, only the monocular camera it already carries is used as a sensor. In this paper, we propose a novel approach: a fusion of frequency-based vanishing point estimation and probabilistic color segmentation. The vanishing point is detected from a texture flow, produced by a bank of Gabor wavelets, and a voting function. Next, the vanishing point defines a training area used for self-supervised learning of color models. Finally, road patches are selected by measuring a roadness score. A few rules deal with dark cast shadows, overexposed highlights and the speed of adaptation. In addition to being robust, our system is easy to use since no calibration is needed.
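The frequency-based vanishing point step described in the abstract can be illustrated with a minimal sketch: a bank of oriented Gabor filters estimates a per-pixel dominant orientation (texture flow), and each confident pixel then votes for candidate image points lying along its orientation ray. This is not the authors' implementation; the parameter choices (8 orientations, kernel size, confidence threshold) and the simple hard-threshold vote are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=15, sigma=3.0, lam=6.0):
    """Real part of a Gabor filter whose carrier wave points along theta."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)
    return g - g.mean()  # zero mean, so flat image regions respond weakly

def texture_flow(img, n_orient=8):
    """Per-pixel dominant orientation and confidence from a Gabor bank."""
    thetas = np.arange(n_orient) * np.pi / n_orient
    resp = np.stack([np.abs(convolve(img.astype(float), gabor_kernel(t)))
                     for t in thetas])
    best = np.argmax(resp, axis=0)
    conf = resp.max(axis=0)
    # a filter with carrier along theta responds to structures
    # perpendicular to theta, so the local texture direction is theta + 90 deg
    flow = (thetas[best] + np.pi / 2) % np.pi
    return flow, conf

def vanishing_point(img, n_orient=8, stride=2, conf_frac=0.5):
    """Vote for the image point best supported by the texture flow."""
    h, w = img.shape
    flow, conf = texture_flow(img, n_orient)
    votes = np.zeros((h, w))
    cy, cx = np.mgrid[0:h, 0:w]
    thresh = conf_frac * conf.max()
    tol = np.pi / (2 * n_orient)  # angular tolerance of one half orientation bin
    for y in range(0, h, stride):
        for x in range(0, w, stride):
            if conf[y, x] < thresh:
                continue  # flat region: orientation estimate is unreliable
            # angle from this voter to every candidate point, folded to [0, pi)
            ang = np.arctan2(cy - y, cx - x) % np.pi
            d = np.abs(ang - flow[y, x])
            d = np.minimum(d, np.pi - d)
            votes += d < tol  # hard vote for candidates along the flow ray
    iy, ix = np.unravel_index(np.argmax(votes), votes.shape)
    return ix, iy
```

On a synthetic image of two borders converging at a point, the vote map peaks near that intersection; in the paper's pipeline this estimate then seeds the training region for the color models.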



Technical Content © IEEE Robotics & Automation Society
