Surface electromyography (sEMG) signals represent a promising approach for decoding the motor intention of amputees to control a multifunctional prosthetic hand in a non-invasive way. Several approaches based on proportional amplitude methods or on simple thresholds applied to sEMG signals have been proposed to control a single degree of freedom (DoF) at a time, without the possibility of naturally increasing the number of controllable DoFs. Myoelectric control based on pattern recognition (PR) techniques has been introduced to handle multiple DoFs while keeping the number of electrodes low, allowing the discrimination of a distinct muscular pattern for each class of motion. However, the use of PR algorithms to simultaneously decode both gestures and forces has not been studied in depth. This paper introduces a hierarchical classification approach that decodes the desired hand/wrist gestures as well as the force levels to exert during grasping tasks. A Finite State Machine was introduced to manage and coordinate three classifiers based on the Non-Linear Logistic Regression (NLR) algorithm. The classification architecture was evaluated across 31 healthy subjects. The “hand/wrist gestures classifier,” which discriminates seven hand/wrist gestures, achieved a mean classification accuracy of 98.78%, while the “Spherical” and “Tip” force classifiers, which identify three force levels each, reached average accuracies of 98.80% and 96.09%, respectively. These results were confirmed by Linear Discriminant Analysis (LDA) with time-domain feature extraction, taken as the ground truth for the final validation of the analysis. A Wilcoxon Signed-Rank test was carried out for the statistical comparison between NLR and LDA, with significance set at p < 0.05. The comparison revealed no statistically significant difference in F1-score between NLR and LDA.
Thus, this study shows that a non-linear classification algorithm such as NLR is as suitable as the benchmark LDA classifier for implementing an sEMG pattern recognition system able both to decode hand/wrist gestures and to associate different force levels with the performed grasping actions.
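A minimal sketch of how such a hierarchical, FSM-coordinated scheme could be organized. The state names, gesture labels, and classifier stubs below are hypothetical illustrations, not the authors' implementation; the NLR training and feature extraction are deliberately omitted and stand behind plain callables:

```python
from enum import Enum, auto

class State(Enum):
    """Hypothetical FSM states: rest, gesture decoding, and the two
    grasp-specific force-level classifiers described in the abstract."""
    REST = auto()
    GESTURE = auto()
    SPHERICAL_FORCE = auto()
    TIP_FORCE = auto()

class HierarchicalDecoder:
    """Routes each sEMG window first to the gesture classifier; if the
    decoded gesture is a grasp, the same window is handed to the matching
    force-level classifier (three levels each, per the abstract)."""

    def __init__(self, gesture_clf, spherical_clf, tip_clf):
        self.state = State.REST
        self.gesture_clf = gesture_clf      # 7 hand/wrist gestures
        self.spherical_clf = spherical_clf  # 3 force levels
        self.tip_clf = tip_clf              # 3 force levels

    def step(self, emg_window):
        gesture = self.gesture_clf(emg_window)
        if gesture == "spherical_grasp":
            self.state = State.SPHERICAL_FORCE
            return gesture, self.spherical_clf(emg_window)
        if gesture == "tip_grasp":
            self.state = State.TIP_FORCE
            return gesture, self.tip_clf(emg_window)
        self.state = State.REST if gesture == "rest" else State.GESTURE
        return gesture, None
```

In this sketch the FSM only gates which force classifier runs; any classifier (NLR, LDA, or otherwise) can be plugged in behind the callables.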
This paper introduces a novel control framework for an arm exoskeleton that takes the force capabilities of the human arm into account. In contrast to conventional exoskeleton controllers, where assistance is provided without considering the biomechanical force manipulability properties of the human arm, we propose a control approach based on the muscular manipulability of the arm. The proposed control framework essentially reshapes the anisotropic force manipulability so that the endpoint force manipulability becomes invariant with respect to direction over the entire workspace of the arm. This allows users of the exoskeleton to perform tasks effectively throughout the workspace, even in areas that are normally unsuitable due to the low force manipulability of the human arm. We evaluated the proposed control framework in real robot experiments in which subjects wearing an arm exoskeleton were asked to move a weight between several locations. The results show that the proposed control framework does not affect the normal movement behavior of the users while effectively reducing user effort in areas of low manipulability. In particular, the proposed approach augments the force manipulability of the human arm so that tasks can be executed equally well in the entire workspace.
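To illustrate the quantity being reshaped, here is a standard planar two-link sketch (not the authors' implementation, and the link lengths, shaping rule, and function names are assumptions). With Jacobian J, the force manipulability along a unit direction u is m_f = (uᵀ J Jᵀ u)^(-1/2): the endpoint force achievable for unit joint torque. A direction-invariant endpoint behavior can then be approximated by scaling assistance inversely with m_f, so low-manipulability directions receive more support:

```python
import numpy as np

def jacobian_2link(q, l1=0.3, l2=0.3):
    """Endpoint Jacobian of a planar 2-link arm (link lengths assumed)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def force_manipulability(q, u):
    """Radius of the force ellipsoid along direction u:
    max alpha s.t. tau = J^T (alpha*u) has unit norm, i.e.
    alpha = (u^T J J^T u)^(-1/2)."""
    J = jacobian_2link(q)
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)
    return 1.0 / np.sqrt(u @ (J @ J.T) @ u)

def assistance_gain(q, u, target=5.0):
    """Hypothetical shaping rule: assist more where m_f falls short of a
    target level, with the gain clipped to [0, 1]."""
    return float(np.clip(1.0 - force_manipulability(q, u) / target, 0.0, 1.0))
```

Because m_f varies strongly with both configuration and direction (the ellipsoid is anisotropic), a gain of this form would add support precisely in the weak directions, which is the intuition behind making the endpoint force capacity direction-invariant.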