An adaptive mixed reality training system for stroke rehabilitation

Margaret Duff, Yinpeng Chen, Suneth Attygalle, Janice Herman, Hari Sundaram, Gang Qian, Jiping He, Thanassis Rikakis

Research output: Contribution to journal › Article › peer-review


This paper presents a novel mixed reality rehabilitation system used to help improve the reaching movements of people who have hemiparesis from stroke. The system provides real-time, multimodal, customizable, and adaptive feedback generated from the movement patterns of the subject's affected arm and torso during reaching to grasp. The feedback is provided via innovative visual and musical forms that present a stimulating, enriched environment in which to train the subjects and promote multimodal sensory-motor integration. A pilot study was conducted to test the system function, adaptation protocol and its feasibility for stroke rehabilitation. Three chronic stroke survivors underwent training using our system for six 75-min sessions over two weeks. After this relatively short time, all three subjects showed significant improvements in the movement parameters that were targeted during training. Improvements included faster and smoother reaches, increased joint coordination and reduced compensatory use of the torso and shoulder. The system was accepted by the subjects and shows promise as a useful tool for physical and occupational therapists to enhance stroke rehabilitation.

Original language: English (US)
Article number: 5497185
Pages (from-to): 531-541
Number of pages: 11
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering
Issue number: 5
State: Published - Oct 2010
Externally published: Yes


Keywords

  • Mixed reality
  • motion analysis
  • reach and grasp
  • stroke rehabilitation
  • upper extremity

ASJC Scopus subject areas

  • Rehabilitation
  • General Neuroscience
  • Internal Medicine
  • Biomedical Engineering
