Real-Time and Offline Evaluation of Myoelectric Pattern Recognition for the Decoding of Hand Movements
Pattern recognition algorithms have been widely used to map surface electromyographic signals to target movements as a source of prosthetic control. However, most investigations have been conducted offline, performing the analysis on pre-recorded datasets. Although real-time data analysis (i.e., classification as new data become available, within a latency constraint of 200–300 milliseconds) plays an essential role in prosthetic control, comparatively little is known about real-time performance. Recent literature has underscored the discrepancy between offline classification accuracy, the most common performance metric, and the usability of upper limb prostheses. Therefore, a comparative offline and real-time performance analysis of common algorithms had yet to be performed. In this study, we investigated the offline and real-time performance of nine classification algorithms in decoding ten individual hand and wrist movements. Surface myoelectric signals were recorded from fifteen able-bodied subjects while they performed the ten movements. Offline decoding showed that linear discriminant analysis (LDA) and maximum likelihood estimation (MLE) significantly (p < 0.05) outperformed the other classifiers, with an average classification accuracy above 97%. In contrast, the real-time investigation revealed that, in addition to LDA and MLE, the multilayer perceptron also outperformed the other algorithms, achieving a classification accuracy and completion rate above 68% and 69%, respectively.
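To make the described pipeline concrete, the sketch below illustrates windowed classification of myoelectric data with LDA, one of the classifiers compared in this study. It is a minimal, illustrative example only: the synthetic signals, the 8-channel/200 ms window configuration, and the simple time-domain features (mean absolute value, waveform length) are assumptions for demonstration and are not the paper's recorded data or feature set.

```python
# Illustrative sketch: windowed EMG feature extraction and LDA classification.
# Synthetic data stands in for recorded sEMG; window length fits a 200-300 ms latency budget.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
win = 200                                  # samples per analysis window (assumed 1 kHz, i.e., 200 ms)
n_channels, n_classes, n_windows = 8, 10, 60

def features(window):
    """Per-channel time-domain features: mean absolute value and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

# Build a labelled feature set from synthetic windows, one label per movement class.
X, y = [], []
for c in range(n_classes):
    for _ in range(n_windows):
        w = rng.normal(scale=1.0 + 0.1 * c, size=(win, n_channels))
        X.append(features(w))
        y.append(c)
X, y = np.array(X), np.array(y)

# Offline evaluation: train on alternating windows, score on the held-out half.
clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])
print("offline accuracy:", clf.score(X[1::2], y[1::2]))
```

In a real-time setting, the same trained classifier would instead be applied to each incoming window as it is acquired, so that the decision for a window is produced within the latency constraint rather than computed over a pre-recorded dataset.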