ICRA 2011 Paper Abstract


Paper TuA1-InteracInterac.14

Chuang, Yuelong (Zhejiang University), Chen, Ling (Zhejiang University), Zhao, Gangqiang (Nanyang Technological University), Chen, Gencai (Zhejiang University)

Hand Posture Recognition and Tracking Based on Bag-Of-Words for Human Robot Interaction

Scheduled for presentation during the Poster Sessions "Interactive Session I: Robotic Technology" (TuA1-InteracInterac), Tuesday, May 10, 2011, 08:20-09:35, Hall

2011 IEEE International Conference on Robotics and Automation, May 9-13, 2011, Shanghai International Conference Center, Shanghai, China


Keywords: Computer Vision for Robotics and Automation, Human Detection & Tracking

Abstract

Hand posture is a natural and effective means of interaction between humans and robots. In this paper, we use a monocular camera as the input device and propose an improved Bag-of-Words (BoW) method to detect and recognize hand postures, built on a new descriptor, ARPD (Appearance and Relative Position Descriptor), and a spectral embedding clustering algorithm. To track hand motion rapidly and accurately, we design a new framework that combines the improved BoW method with the CAMSHIFT algorithm. A thorough evaluation of the algorithm is presented to demonstrate its usefulness.
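The abstract does not detail the tracking stage, but the CAMSHIFT component is a standard OpenCV primitive. Below is a minimal sketch, in Python with OpenCV, of how a detected hand region could seed a CAMSHIFT tracking loop on a monocular camera stream. The initial hand window, the skin-tone HSV range, and the use of a plain hue histogram are assumptions for illustration; the paper's actual detection step (the improved BoW recognizer with the ARPD descriptor and spectral embedding clustering) is not reproduced here.

import cv2

# Hypothetical hand bounding box (x, y, w, h) supplied by the posture detector;
# in the paper this would come from the improved BoW recognizer.
track_window = (200, 150, 80, 80)

cap = cv2.VideoCapture(0)          # monocular camera input
ret, frame = cap.read()

# Build a hue histogram over the detected hand region to seed CAMSHIFT.
x, y, w, h = track_window
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
# Rough skin-tone mask in HSV; the thresholds are an assumption, not from the paper.
mask = cv2.inRange(hsv_roi, (0, 30, 60), (20, 150, 255))
roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

# Stop after 10 iterations or when the window moves by less than 1 pixel.
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)

    # CAMSHIFT adapts the window size and orientation to the back-projected blob.
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)

    pts = cv2.boxPoints(rot_rect).astype(int)
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow('hand tracking', frame)
    if cv2.waitKey(30) & 0xFF == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

In the paper's framework, the BoW-based recognizer would periodically re-detect the hand posture to reinitialize or correct the tracker; the sketch above shows only the tracking half of that loop.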

 

 
