ICRA 2012 Paper Abstract

Paper WeD310.2

Tanaka, Kazushi (Tohoku University), Takeuchi, Eijiro (Tohoku University), Ohno, Kazunori (Tohoku University), Tadokoro, Satoshi (Tohoku University), Yonezawa, Toru (GLORY LTD.)

Logical Winnowing Methods from Multiple Identification Candidates Using Corresponding Appearance Identification Results in Time-Series

Scheduled for presentation during the Interactive Session "Interactive Session WeD-3" (WeD310), Wednesday, May 16, 2012, 17:30−18:00, Ballroom D

2012 IEEE International Conference on Robotics and Automation, May 14-18, 2012, RiverCentre, Saint Paul, Minnesota, USA

This information is tentative and subject to change. Compiled on June 18, 2018

Keywords Intrusion Detection, Identification and Security, Sensor Fusion, Human Detection & Tracking

Abstract

This paper describes logical winnowing methods that narrow multiple identification candidates by combining appearance identification results with chronological pedestrian tracking results. It is difficult to identify an individual from a single appearance identification, because the results depend on viewing direction and may be ambiguous among several candidates. This research proposes two methods that logically winnow the identification candidates, effectively fusing results obtained from different directions without requiring the directional information itself. Experiments were conducted to verify the validity of the proposed methods, using a mobile robot equipped with a laser scanner and a camera: the pedestrian tracking method uses the laser scanner, and the appearance identification uses the camera. The experimental results verified the validity of the logical winnowing method that takes the logical product of the candidate sets determined by each round of identification. This paper describes the properties of appearance identification, the proposed methods, and the experiments.
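The abstract's "logical product of candidates" can be read as a set intersection over successive identification rounds for the same tracked pedestrian. The following is a minimal sketch of that idea only, not the paper's implementation; the function name, the use of string IDs, and the example candidate sets are all hypothetical.

```python
from functools import reduce

def winnow_by_logical_product(candidate_rounds):
    """Intersect candidate sets from successive identification rounds.

    candidate_rounds: a list of sets, one per appearance-identification
    round, each holding the candidate IDs for the same tracked pedestrian
    (association across rounds would come from the tracking result).
    Returns the IDs consistent with every round (the logical product).
    """
    if not candidate_rounds:
        return set()
    return reduce(set.intersection, candidate_rounds)

# Hypothetical example: three rounds, each seen from a different direction.
rounds = [{"A", "B", "C"}, {"B", "C"}, {"B", "D"}]
print(winnow_by_logical_product(rounds))  # {'B'}
```

Each round alone is ambiguous, but the intersection leaves a single candidate, which is the winnowing effect the abstract attributes to the logical-product method.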

Technical Content © IEEE Robotics & Automation Society
