ICRA 2012 Paper Abstract


Paper WeD210.2

Sivalingam, Ravishankar (University of Minnesota), Somasundaram, Guruprasad (University of Minnesota), Bhatawadekar, Vineet (University of Minnesota), Morellas, Vassilios (University of Minnesota), Papanikolopoulos, Nikos (University of Minnesota)

Sparse Representation of Point Trajectories for Action Classification

Scheduled for presentation during the Interactive Session "Interactive Session WeD-2" (WeD210), Wednesday, May 16, 2012, 17:00−17:30, Ballroom D

2012 IEEE International Conference on Robotics and Automation, May 14-18, 2012, RiverCentre, Saint Paul, Minnesota, USA


Keywords: Gesture, Posture and Facial Expressions; Computer Vision for Robotics and Automation; Surveillance Systems


Action classification is an important component of human-computer interaction. Trajectory classification is an effective approach to action recognition, with significant success reported in the literature. We compare two different representation schemes, raw multivariate time-series data and the covariance descriptors of the trajectories, and apply sparse representation techniques to classify the various actions. The features are sparse coded using the Orthogonal Matching Pursuit algorithm, and the gestures and actions are classified based on the reconstruction residuals. We demonstrate the performance of our approach on standard datasets such as the Australian Sign Language (AusLan) and UCF Motion Capture datasets, collected using high-quality motion capture systems, as well as motion capture data obtained from a Microsoft Kinect sensor.
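The sparse-coding-and-residual pipeline described in the abstract can be sketched in a few lines. This is a minimal illustrative implementation, not the authors' code: it assumes one dictionary of training feature vectors per action class, uses a plain greedy Orthogonal Matching Pursuit, and assigns a test feature to the class whose dictionary reconstructs it with the smallest residual. All function and variable names here are hypothetical.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily pick up to k atoms (columns)
    of dictionary D to approximate signal y, refitting by least squares."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # select the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares coefficients over the selected support
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - D @ x
    return x

def classify_by_residual(class_dicts, y, k=3):
    """Return the index of the class whose dictionary gives the
    smallest reconstruction residual for the test feature y."""
    residuals = [np.linalg.norm(y - D @ omp(D, y, k)) for D in class_dicts]
    return int(np.argmin(residuals))
```

In this sketch, `class_dicts` would hold one matrix per action class whose columns are (for example) vectorized covariance descriptors or time-series features from that class's training trajectories; a test sample is then labeled by the minimum-residual rule.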


