ICRA 2012 Paper Abstract


Paper WeD110.1

Libby, Jacqueline Kemeny (Carnegie Mellon University), Stentz, Anthony (Carnegie Mellon University)

Using Sound to Classify Vehicle-Terrain Interactions in Outdoor Environments

Scheduled for presentation during the Interactive Session "Interactive Session WeD-1" (WeD110), Wednesday, May 16, 2012, 16:30−17:00, Ballroom D

2012 IEEE International Conference on Robotics and Automation, May 14-18, 2012, RiverCentre, Saint Paul, Minnesota, USA

This information is tentative and subject to change. Compiled on June 18, 2018

Keywords: Field Robots, Energy and Environment-aware Automation, Robot Safety


Robots that operate in complex physical environments can improve the accuracy of their perception systems by fusing data from complementary sensing modalities. Furthermore, robots capable of motion can physically interact with these environments and then leverage the sensory information they receive from these interactions. This paper explores the use of sound data as a new sensing modality for classifying vehicle-terrain interactions from mobile robots operating outdoors, complementing the non-contact sensors more typically used for terrain classification. Acoustic data from microphones was recorded on a mobile robot interacting with different types of terrains and objects in outdoor environments. This data was then labeled and used offline to train a supervised multiclass classifier that can distinguish between these interactions based on acoustic data alone. To the best of the authors' knowledge, this is the first time that acoustics has been used to classify a variety of interactions that a vehicle can have with its environment, so part of our contribution is to survey acoustic techniques from other domains and explore their efficacy for this application. The feature extraction methods we implement are derived from this survey; the extracted features then serve as inputs to our classifier. The multiclass classifier is built from Support Vector Machines (SVMs). The results presented show an average of 92% accuracy across all classes, which suggests strong potential for a
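The pipeline the abstract describes (extract acoustic features from labeled clips, then train a multiclass SVM) can be sketched as follows. This is a hedged illustration, not the paper's implementation: the log band-energy features, the synthetic test clips, the sample rate, and the three example classes are all assumptions standing in for the survey-derived features and real recordings used in the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz (hypothetical)

def band_energies(signal, n_bands=16):
    # Simple stand-in feature: log energy in equal-width spectral bands.
    spec = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spec, n_bands)
    return np.log1p([np.sum(b ** 2) for b in bands])

def synth_clip(center_hz, dur=0.25):
    # Stand-in for a recorded vehicle-terrain interaction:
    # a noisy tone whose dominant frequency depends on the class.
    t = np.arange(int(SR * dur)) / SR
    return np.sin(2 * np.pi * center_hz * t) + 0.5 * rng.standard_normal(t.size)

# Three hypothetical interaction classes with distinct spectral signatures.
X, y = [], []
for label, f in enumerate([300.0, 900.0, 2200.0]):
    for _ in range(40):
        X.append(band_energies(synth_clip(f)))
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Multiclass classifier built from pairwise (one-vs-one) SVMs.
clf = SVC(kernel="rbf", decision_function_shape="ovo")
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

On this easy synthetic data the classifier separates the classes almost perfectly; real vehicle-terrain audio would of course be far noisier and would motivate the richer features the paper surveys.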



Technical Content © IEEE Robotics & Automation Society
