ICRA'09 Paper Abstract


Paper FrB12.5

Kim, Dae-Jin (University of Central Florida), Behal, Aman (University of Central Florida), Lovelett, Ryan (UCF)

Eye-In-Hand Stereo Visual Servoing of an Assistive Robot Arm in Unstructured Environments

Scheduled for presentation during the Regular Sessions "Rehabilitation Robotics - II" (FrB12), Friday, May 15, 2009, 11:50−12:10, Room: 504

2009 IEEE International Conference on Robotics and Automation, May 12 - 17, 2009, Kobe, Japan


Keywords Rehabilitation Robotics, Visual Servoing, Computer Vision for Robotics and Automation


We document progress in the design and implementation of a motion control strategy that exploits visual feedback from a narrow-baseline stereo head mounted in the hand of a wheelchair-mounted robot arm (WMRA) to recognize and grasp textured activities-of-daily-living (ADL) objects for which one or more templates exist in a large image database. The problem is made challenging by kinematic uncertainty in the robot, imperfect camera and stereo calibration, and the unstructured environments in which we work. The approach separates the overall motion into gross and fine motion components. During the gross motion phase, local structure around a user-selected point of interest (POI) on the object is extracted using sparse stereo information, which is then used to converge on and roughly align the object with the image plane so that object recognition and fine motion can be pursued with a strong likelihood of success. Fine motion is used to grasp the target object by relying on feature correspondences between the live object view and its template image. While features are detected using a robust real-time keypoint tracker, a hybrid visual servoing technique is exploited in which tracked pixel-space features are used to generate translational motion commands, while a Euclidean homography decomposition scheme generates orientation setpoints for the robot gripper. Experimental results are presented to demonstrate the efficacy of the proposed algorithm.



Technical Content © IEEE Robotics & Automation Society
