AUTHOR=Perry Briana N., Armiger Robert S., Yu Kristin E., Alattar Ali A., Moran Courtney W., Wolde Mikias, McFarland Kayla, Pasquina Paul F., Tsao Jack W. TITLE=Virtual Integration Environment as an Advanced Prosthetic Limb Training Platform JOURNAL=Frontiers in Neurology VOLUME=9 YEAR=2018 URL=https://www.frontiersin.org/journals/neurology/articles/10.3389/fneur.2018.00785 DOI=10.3389/fneur.2018.00785 ISSN=1664-2295 ABSTRACT=

Background: Despite advances in prosthetic development and neurorehabilitation, individuals with upper extremity (UE) loss continue to face functional and psychosocial challenges following amputation. Recent advanced myoelectric prostheses offer intuitive control over multiple, simultaneous degrees of motion and promise sensory feedback integration, but they require complex training to operate effectively. We explored whether a virtual reality simulator could be used to teach dexterous prosthetic control paradigms to individuals with UE loss.

Methods: Thirteen active-duty military personnel with UE loss (14 limbs) completed twenty 30-min passive motor training sessions over 1–2 months. Participants were asked to follow the motions of a virtual avatar using residual and phantom limbs, and electrical activity from the residual limb was recorded using surface electromyography. Eight participants (nine limbs) also completed twenty 30-min active motor training sessions. Participants controlled a virtual avatar through three motion sets of increasing complexity (Basic, Advanced, and Digit) and were scored on how accurately they performed requested motions. Score trajectory was assessed as a function of time using longitudinal mixed-effects linear regression.
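To illustrate the trajectory analysis described above, the sketch below simulates per-session scores for a small cohort and estimates the average per-session improvement. This is a deliberately simplified stand-in, not the authors' actual mixed-effects model: it fits an ordinary least-squares line per participant and averages the slopes (a crude approximation of a random-slopes model). All numbers here (baseline score, noise level, the 0.45 points/session trend) are hypothetical values chosen only to mirror the Advanced-set effect size reported in the Results.

```python
# Simplified, hypothetical sketch of score-trajectory estimation.
# The study used longitudinal mixed-effects linear regression; here we
# approximate the fixed-effect slope by fitting an OLS line per
# participant and averaging the slopes. Synthetic data only.
import random

random.seed(0)

def ols_slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic cohort: 8 participants, 20 sessions each, scores improving
# by ~0.45 points/session (hypothetical) plus Gaussian noise.
slopes = []
for _ in range(8):
    sessions = list(range(1, 21))
    scores = [70 + 0.45 * s + random.gauss(0, 2) for s in sessions]
    slopes.append(ols_slope(sessions, scores))

mean_slope = sum(slopes) / len(slopes)
print(f"Estimated per-session improvement: {mean_slope:.2f} points")
```

In practice a mixed-effects model (e.g. random intercepts and slopes per participant) is preferred because it pools information across participants and handles unbalanced session counts, which per-participant OLS does not.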

Results: Mean classification accuracy for passive motor training was 43.8 ± 10.7% (14 limbs, 277 passive sessions). In active motor sessions, >95% classification accuracy (which we used as the threshold for prosthetic acceptance) was achieved by all participants for Basic sets and by 50% of participants in Advanced and Digit sets. Significant improvement in active motor scores over time was observed in Basic and Advanced sets (per additional session: β-coefficient 0.125, p = 0.022; β-coefficient 0.45, p = 0.001, respectively), and trended toward significance for Digit sets (β-coefficient 0.594, p = 0.077).

Conclusions: These results offer robust evidence that a virtual reality training platform can be used to quickly and efficiently train individuals with UE loss to operate advanced prosthetic control paradigms. Participants can be trained to generate muscle contraction patterns in residual limbs that are interpreted with high accuracy by computer software as distinct active motion commands. These results support the potential viability of advanced myoelectric prostheses relying on pattern recognition feedback or similar control systems.