AUTHOR=Li Ke, Li Zhengzhen, Zeng Haibin, Wei Na TITLE=Control of Newly-Designed Wearable Robotic Hand Exoskeleton Based on Surface Electromyographic Signals JOURNAL=Frontiers in Neurorobotics VOLUME=Volume 15 - 2021 YEAR=2021 URL=https://www.frontiersin.org/journals/neurorobotics/articles/10.3389/fnbot.2021.711047 DOI=10.3389/fnbot.2021.711047 ISSN=1662-5218 ABSTRACT=The human hand plays a central role in a wide variety of daily activities. This intricate instrument is vulnerable to trauma and neuromuscular disorders. The wearable robotic exoskeleton is an advanced technology that may remarkably promote the recovery of hand function, but it still faces persistent challenges in mechanical and functional integration and in the real-time control of multiple actuators in accordance with changing motion intention. In this study, we demonstrated a newly designed wearable robotic hand exoskeleton with multiple joints, more degrees of freedom, and a larger range of motion. The exoskeleton hand comprises six linear actuators (two for the thumb and four for the fingers) and can realize both independent movement of each digit and coordinated movement involving multiple fingers for grasp and pinch. The kinematic parameters of the hand exoskeleton were analyzed by a motion capture system. The exoskeleton showed a higher range of motion of the proximal interphalangeal and distal interphalangeal joints compared with other exoskeletons. Five classifiers, including support vector machine (SVM), K-nearest neighbor (KNN), decision tree (DT), multi-layer perceptron (MLP), and multi-channel convolutional neural network (multi-channel CNN), were compared for offline classification. The SVM and KNN had higher accuracy than the others, reaching up to 99%. For online classification, 3 out of 5 subjects showed an accuracy of about 80%, and one subject showed an accuracy over 90%.
These results suggest that the new wearable exoskeleton could facilitate hand rehabilitation with a larger range of motion and higher dexterity, and could be controlled according to the subjects' motion intention.
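The offline comparison of classifiers on sEMG-derived features can be sketched as below. This is an illustrative example only, not the paper's pipeline: the actual sEMG features, channel count, gesture classes, and hyperparameters are not specified in the abstract, so synthetic feature vectors stand in for real data, and only two of the five classifiers (SVM and KNN, the best performers reported) are shown, using scikit-learn.

```python
# Hedged sketch: compares SVM and KNN on synthetic "sEMG-like" feature
# vectors. Class counts, feature dimensions, and hyperparameters are
# assumptions for illustration, not taken from the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for sEMG features: 6 hypothetical gesture classes,
# 40 samples each, 32-dimensional feature vectors (e.g., time-domain
# features pooled over channels), with class-dependent means.
n_classes, n_per_class, n_features = 6, 40, 32
X = np.vstack([
    rng.normal(loc=c, scale=0.5, size=(n_per_class, n_features))
    for c in range(n_classes)
])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# Fit each classifier and record held-out accuracy.
accs = {}
for name, clf in [
    ("SVM", SVC(kernel="rbf", C=1.0)),
    ("KNN", KNeighborsClassifier(n_neighbors=5)),
]:
    clf.fit(X_tr, y_tr)
    accs[name] = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name} accuracy: {accs[name]:.2f}")
```

On real sEMG data the comparison would follow the same fit/predict/score pattern, typically with subject-wise or session-wise splits rather than a random split.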