ICRA'09 Paper Abstract

Paper FrC2.5

Xu, Tingting (Technische Universität München), Wu, Hao (Technische Universität München), Zhang, Tianguang (Technische Universität München), Kühnlenz, Kolja (Technische Universität München), Buss, Martin (Technische Universität München)

Environment Adapted Active Multi-Focal Vision System for Object Detection

Scheduled for presentation during the Regular Sessions "Computer Vision for Robotics and Automation - IV" (FrC2), Friday, May 15, 2009, 14:50–15:10, Room: ICR

2009 IEEE International Conference on Robotics and Automation, May 12 - 17, 2009, Kobe, Japan

Keywords: Computer Vision for Robotics and Automation, Biologically-Inspired Robots, Learning and Adaptive Systems

Abstract

A biologically inspired foveated attention system for an object detection scenario is proposed. A high-performance active multi-focal camera system imitates visual behaviors such as scanning, saccades, and fixation. Bottom-up attention uses wide-angle stereo data to select a sequence of fixation points in the peripheral field of view. A subsequent saccade to, and fixation on, each point at high foveal resolution with a telephoto camera enables highly accurate object recognition. Once an object is recognized as a target object, the bottom-up attention model is adapted to the current environment using top-down information extracted from that object. Both the bottom-up attention model and the SIFT-based object recognition algorithm are implemented with CUDA on Graphics Processing Units (GPUs), which greatly accelerates image processing. In the experimental evaluation, all target objects were detected against different backgrounds, and evident improvements in accuracy, flexibility, and efficiency are achieved.
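Bottom-up attention models of this kind typically select fixation points by a winner-take-all rule over a saliency map, with inhibition of return so that successive fixations visit different regions. The following is a minimal sketch of that selection loop, not the paper's implementation; the function name, parameters, and the assumption that a 2-D saliency map is already available (the paper derives it from wide-angle stereo data) are all hypothetical.

```python
import numpy as np

def select_fixations(saliency, n_fixations=3, inhibit_radius=2):
    """Pick fixation points by winner-take-all with inhibition of return.

    saliency       -- 2-D array of bottom-up saliency values (assumed given)
    n_fixations    -- number of fixation points to return
    inhibit_radius -- half-width of the square region suppressed around
                      each winner (inhibition of return)
    """
    s = saliency.astype(float).copy()
    h, w = s.shape
    fixations = []
    for _ in range(n_fixations):
        # winner-take-all: the most salient remaining location
        y, x = np.unravel_index(np.argmax(s), s.shape)
        fixations.append((int(y), int(x)))
        # inhibition of return: suppress a neighborhood around the winner
        y0, y1 = max(0, y - inhibit_radius), min(h, y + inhibit_radius + 1)
        x0, x1 = max(0, x - inhibit_radius), min(w, x + inhibit_radius + 1)
        s[y0:y1, x0:x1] = -np.inf
    return fixations
```

In a foveated system each returned point would then drive a saccade of the telephoto camera, after which recognition runs on the high-resolution foveal image.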

Technical Content © IEEE Robotics & Automation Society
