ICRA 2011 Paper Abstract

Paper ThA114.4

Miller, Stephen (University of California at Berkeley), Fritz, Mario (ICSI & UC Berkeley), Darrell, Trevor (UC Berkeley), Abbeel, Pieter (UC Berkeley)

Parametrized Shape Models for Clothing

Scheduled for presentation during the Regular Sessions "Computer Vision I: Model" (ThA114), Thursday, May 12, 2011, 09:05−09:20, Room 5J

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China

This information is tentative and subject to change. Compiled on August 19, 2019

Keywords: Computer Vision for Robotics and Automation, Personal Robots

Abstract

We consider the problem of recognizing the configuration of clothing articles when crudely spread out on a flat surface, prior to and during folding. At the core of our approach are parametrized shape models for clothing articles. Each clothing category has its own shape model, and the variety in shapes for a given category is achieved through variation of the parameters. We present an efficient algorithm to find the parameters that provide the best fit when given an image of a clothing article. The models are such that, once the parameters have been fit, they provide a basic parse of the clothing article, allowing it to be followed by autonomous folding from category level specifications of fold sequences. Our approach is also able to recover the configuration of a clothing article when folds are being introduced---an important feature towards closing the perception-action loop. Additionally, our approach provides a reliable method of shape-based classification, simply by examining which model yields the best fit. Our experiments illustrate the effectiveness of our approach on a large set of clothing articles. Furthermore, we present an end-to-end system, which starts from an unknown spread-out clothing article, performs a parametrized model fit, then follows a category-level (rather than article specific) set of folding instructions, closing the loop with perceptual feedback by re-fitting between folds.
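The abstract's shape-based classification idea, choosing the category whose parametrized model fits the observed contour best, can be illustrated with a toy sketch. This is not the paper's algorithm: the models here are plain parametrized rectangles, the fit is a brute-force grid search over a symmetric nearest-neighbour (Chamfer-style) contour error, and all names and parameter grids are illustrative assumptions.

```python
import numpy as np

def rectangle_contour(w, h, n=40):
    """Sample n points along the boundary of a w-by-h rectangle (toy shape model)."""
    t = np.linspace(0.0, 4.0, n, endpoint=False)
    pts = []
    for u in t:
        side, frac = int(u), u - int(u)
        if side == 0:
            pts.append((frac * w, 0.0))
        elif side == 1:
            pts.append((w, frac * h))
        elif side == 2:
            pts.append((w - frac * w, h))
        else:
            pts.append((0.0, h - frac * h))
    return np.array(pts)

def fit_error(model_fn, observed, param_grid):
    """Lowest symmetric nearest-neighbour error over a (toy) parameter grid."""
    best = np.inf
    for params in param_grid:
        model_pts = model_fn(*params)
        d = np.linalg.norm(observed[:, None, :] - model_pts[None, :, :], axis=2)
        err = d.min(axis=1).mean() + d.min(axis=0).mean()
        best = min(best, err)
    return best

# Two hypothetical categories: a square-ish towel vs a long, thin scarf.
towel_grid = [(w, h) for w in (8, 10, 12) for h in (8, 10, 12)]
scarf_grid = [(w, h) for w in (20, 25, 30) for h in (2, 3, 4)]

observed = rectangle_contour(10, 10)  # an unknown, roughly square article
scores = {
    "towel": fit_error(rectangle_contour, observed, towel_grid),
    "scarf": fit_error(rectangle_contour, observed, scarf_grid),
}
label = min(scores, key=scores.get)
print(label)  # the category whose best fit has the lowest residual
```

In the paper's setting the models are category-specific clothing shapes and the fitting is an efficient parameter search rather than a grid sweep, but the decision rule is the same: classify by whichever model yields the best fit.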

Technical Content © IEEE Robotics & Automation Society
