IROS 2015 Paper Abstract


Paper ThAT1.3

Xiong, Anbin (Shenyang Institute of Automation, Chinese Academy of Sciences), Zhao, Xingang (Shenyang Institute of Automation, Chinese Academy of Sciences), Han, Jianda (Shenyang Institute of Automation, Chinese Academy of Sciences), Liu, Guangjun (Ryerson University), Ding, Qichuan (Shenyang Institute of Automation, Chinese Academy of Sciences)

A User-Independent Gesture Recognition Method Based on sEMG Decomposition

Scheduled for presentation during the Regular session "Cognitive Human-Robot Interaction" (ThAT1), Thursday, October 1, 2015, 09:00−09:15, Saal A1

2015 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sept 28 - Oct 03, 2015, Congress Center Hamburg, Hamburg, Germany

This information is tentative and subject to change. Compiled on July 19, 2019

Keywords Cognitive Human-Robot Interaction, Human-Robot Interaction, Rehabilitation Robotics


sEMG recognition has been used extensively in prosthetic device control, human-assisting manipulators, sign language recognition, and related applications. However, an sEMG recognition model trained on one subject's sEMG data is generally not applicable to other subjects, which severely hinders the practical application of myoelectric interfaces. In this paper, an sEMG recognition method applicable to multiple users is proposed. First, single-channel sEMG is decomposed into 30 motor unit action potential trains (MUAPTs) in four steps: second-order differential filtering, threshold calculation, spike detection, and hierarchical clustering. Second, the MUAPTs are updated via template orthogonalization, and a Deep Boltzmann Machine is employed to classify the MUAPTs into five classes corresponding to five predefined gestures. Six participants took part in experiments to validate the effectiveness of the proposed method. Results indicate that the method achieves a mean accuracy of 81.5%.
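The four decomposition steps named in the abstract can be sketched as a simple spike-sorting pipeline. The sketch below is only an illustration of that sequence, not the authors' implementation: all parameter values (sampling assumptions, window length, threshold factor, linkage method) are assumptions introduced here, and the function name `decompose_semg` is hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def decompose_semg(signal, n_clusters=30, window=40):
    """Sketch of the four-step MUAPT decomposition described in the abstract.
    Parameter choices are illustrative, not the paper's settings."""
    # Step 1: second-order differential filter to sharpen spike shapes.
    filtered = np.diff(signal, n=2)

    # Step 2: threshold from a robust noise estimate (assumed: 4x the
    # median-based standard deviation, a common spike-sorting heuristic).
    threshold = 4.0 * np.median(np.abs(filtered)) / 0.6745

    # Step 3: spike detection -- samples exceeding the threshold, with a
    # minimum separation so each discharge is counted once.
    idx = np.flatnonzero(np.abs(filtered) > threshold)
    spikes, last = [], -window
    for i in idx:
        if i - last >= window // 2:
            spikes.append(i)
            last = i
    # Extract aligned waveforms around each detected spike.
    half = window // 2
    waves = np.array([filtered[i - half:i + half] for i in spikes
                      if i >= half and i + half <= len(filtered)])
    if len(waves) < n_clusters:
        return waves, np.arange(1, len(waves) + 1)

    # Step 4: hierarchical (Ward) clustering of spike waveforms into
    # n_clusters MUAPT classes.
    labels = fcluster(linkage(waves, method='ward'),
                      t=n_clusters, criterion='maxclust')
    return waves, labels
```

Each returned label then indexes one candidate MUAPT; the abstract's subsequent steps (template orthogonalization and Deep Boltzmann Machine classification) would operate on these clustered trains.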



Technical Content © IEEE Robotics & Automation Society
