ICRA 2011 Paper Abstract


Paper TuA106.5

Sinapov, Jivko (Iowa State University), Stoytchev, Alexander (Iowa State University)

Object Category Recognition by a Humanoid Robot Using Behavior-Grounded Relational Learning

Scheduled for presentation during the Regular Sessions "Behaviour-Based Systems" (TuA106), Tuesday, May 10, 2011, 09:20–09:35, Room 5A

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China


Keywords: Behaviour-Based Systems, Learning and Adaptive Systems, Recognition


The ability to form and recognize object categories is fundamental to human intelligence. This paper proposes a behavior-grounded relational classification model that allows a robot to recognize the categories of household objects. In the proposed approach, the robot initially explores the objects by applying five exploratory behaviors (lift, shake, drop, crush and push) on them while recording the proprioceptive and auditory sensory feedback produced by each interaction. The sensorimotor data is used to estimate multiple measures of similarity between the objects, each corresponding to a specific coupling between an exploratory behavior and a sensory modality. A graph-based recognition model is trained by extracting features from the estimated similarity relations, allowing the robot to recognize the category memberships of a novel object based on the object's similarity to the set of familiar objects. The framework was evaluated on an upper-torso humanoid robot with two large sets of household objects. The results show that the robot's model is able to recognize complex object categories (e.g., metal objects, empty bottles, etc.) significantly better than chance.
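The core idea in the abstract can be sketched in code: each behavior-modality pairing (e.g., shake + audio) yields its own similarity measure between objects, and a novel object's category is predicted from its combined similarity to familiar objects. The following is a minimal, illustrative sketch only; the context names, weights, and similarity-weighted voting scheme are assumptions for illustration, not the paper's actual graph-based model.

```python
import numpy as np

# Hypothetical behavior-modality contexts; real similarity values would be
# estimated from the robot's proprioceptive and auditory feedback.
CONTEXTS = ["lift-audio", "shake-audio", "drop-audio",
            "crush-proprio", "push-proprio"]

def combined_similarity(sims, weights):
    """Weighted fusion of per-context similarities (an assumed fusion scheme)."""
    return sum(w * sims[c] for c, w in zip(CONTEXTS, weights))

def recognize_category(novel_sims, familiar_labels, weights, k=3):
    """Predict a category for a novel object from its similarity to familiar ones.

    novel_sims: one dict per familiar object, mapping context -> similarity
                between the novel object and that familiar object.
    familiar_labels: category label of each familiar object.
    """
    scores = [combined_similarity(s, weights) for s in novel_sims]
    top = np.argsort(scores)[-k:]  # indices of the k most similar familiar objects
    votes = {}
    for i in top:
        votes[familiar_labels[i]] = votes.get(familiar_labels[i], 0.0) + scores[i]
    return max(votes, key=votes.get)  # similarity-weighted majority vote

# Toy example: three familiar objects in two categories.
familiar_labels = ["metal", "metal", "plastic"]
novel_sims = [
    {c: 0.9 for c in CONTEXTS},  # novel object closely resembles object 0
    {c: 0.8 for c in CONTEXTS},
    {c: 0.2 for c in CONTEXTS},
]
weights = [1.0] * len(CONTEXTS)
print(recognize_category(novel_sims, familiar_labels, weights, k=2))  # "metal"
```

In the paper the per-context similarity relations are instead turned into graph features for a trained recognition model; the sketch above only captures the simpler intuition that multiple behavior-grounded similarity measures jointly determine category membership.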



Technical Content © IEEE Robotics & Automation Society
