Automatic grasp selection using a camera in a hand prosthesis

Joseph Degol, Aadeel Akhtar, Bhargava Manja, Timothy Bretl

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we demonstrate how automatic grasp selection can be achieved by placing a camera in the palm of a prosthetic hand and training a convolutional neural network on images of objects with corresponding grasp labels. Our labeled dataset is built from common graspable objects curated from the ImageNet dataset and from images captured by our own camera placed in the hand. We achieve a grasp classification accuracy of 93.2% and show through real-time grasp selection that using a camera to augment current electromyography-controlled prosthetic hands may be useful.
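The pipeline the abstract describes — an in-palm camera image fed to a convolutional network that outputs a grasp label — can be sketched minimally. The sketch below is illustrative only: the grasp class names, network size, and weights are assumptions, not the paper's actual architecture, and a toy numpy convolution stands in for a trained CNN.

```python
import numpy as np

# Illustrative grasp classes; the paper's actual label set may differ.
GRASPS = ["power", "pinch", "tripod", "lateral"]

def conv2d_valid(img, kernel):
    """Naive single-channel 2-D 'valid' convolution."""
    H, W = img.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def classify_grasp(img, kernels, weights, bias):
    """One conv layer -> ReLU -> global average pool -> linear softmax.

    Returns the predicted grasp label and the class probabilities.
    """
    # One scalar feature per learned filter (global average pooling).
    feats = np.array([conv2d_valid(img, k).clip(min=0).mean() for k in kernels])
    logits = weights @ feats + bias
    p = np.exp(logits - logits.max())   # numerically stable softmax
    p /= p.sum()
    return GRASPS[int(np.argmax(p))], p
```

In use, a real system would load trained weights and stream palm-camera frames; here random weights on a random image simply exercise the shape of the computation.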

Original language: English (US)
Title of host publication: 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 431-434
Number of pages: 4
ISBN (Electronic): 9781457702204
State: Published - Oct 13 2016
Event: 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2016 - Orlando, United States
Duration: Aug 16 2016 - Aug 20 2016

Publication series

Name: Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS
Volume: 2016-October
ISSN (Print): 1557-170X

ASJC Scopus subject areas

  • Signal Processing
  • Biomedical Engineering
  • Computer Vision and Pattern Recognition
  • Health Informatics
