ICRA 2011 Paper Abstract

Paper TuP1-InteracInterac.35

Willimon, Bryan (Clemson University), Birchfield, Stan (Clemson University), Walker, Ian (Clemson University)

Classification of Clothing Using Interactive Perception

Scheduled for presentation during the Poster Sessions "Interactive Session II: Systems, Control and Automation" (TuP1-InteracInterac), Tuesday, May 10, 2011, 13:40−14:55, Hall

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

Keywords: Computer Vision for Robotics and Automation, Motion Control of Manipulators, Recognition

Abstract

We present a system for automatically extracting and classifying items in a pile of laundry. Using only visual sensors, the robot identifies and extracts items sequentially from the pile. Once an item has been removed and isolated, a model of its shape and appearance is captured and compared against a database of known items. The classification procedure relies upon silhouettes, edges, and other low-level image measurements of the articles of clothing. The contributions of this paper are a novel method for extracting articles of clothing from a pile of laundry and a novel method for classifying clothing using interactive perception. Experiments demonstrate the ability of the system to efficiently classify and label items into one of six categories (pants, shorts, short-sleeve shirts, long-sleeve shirts, socks, or underwear). These results show that, on average, classification rates using robot interaction are 59% higher than those achieved without interaction.
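The abstract leaves the matching step unspecified; as a rough illustration only (not the authors' implementation), the sketch below shows how an isolated item might be compared against a database of known items with a nearest-neighbor match over silhouette and edge features. The feature choices (Hu moments of an Otsu-thresholded silhouette plus Canny edge density) and the use of OpenCV are assumptions made for the example.

```python
# Illustrative sketch only -- the feature set and matching rule are assumptions,
# not the method described in the paper.
import cv2
import numpy as np

# The six categories used in the paper's experiments.
CATEGORIES = ["pants", "shorts", "short-sleeve shirt",
              "long-sleeve shirt", "socks", "underwear"]

def extract_features(image_bgr):
    """Low-level measurements of an isolated article: silhouette shape plus edge density."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Silhouette via Otsu threshold (assumes the item is isolated on a plain background).
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(mask)).flatten()      # 7 shape descriptors of the silhouette
    edges = cv2.Canny(gray, 50, 150)
    edge_density = np.count_nonzero(edges) / edges.size  # crude appearance/texture cue
    return np.concatenate([hu, [edge_density]])

def classify(image_bgr, database):
    """database: list of (feature_vector, category) pairs built from known items."""
    query = extract_features(image_bgr)
    distances = [np.linalg.norm(query - features) for features, _ in database]
    return database[int(np.argmin(distances))][1]
```

In such a scheme, the database would be populated offline by running extract_features on images of known items from each category; the interactive step described in the paper (isolating the item before imaging it) is what makes a clean per-item comparison of this kind possible.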

Technical Content © IEEE Robotics & Automation Society
