Low-Cost Prosthetic Control via Sparse EMG Gesture Classification
The following is the abstract of our paper, to be published in Frontiers in Artificial Intelligence and Applications (IOS Press) in November 2025:
"An estimated 22.3 million people worldwide had upper-limb amputations due to traumatic causes in 2017, representing nearly half of all such amputees. The need for advanced EMG-based prosthetic arms is growing; however, these devices typically rely on high-density electrode arrays and commercial-grade equipment, significantly limiting accessibility due to their high cost and maintenance requirements.
Methods: Here, we show that affordable prosthetic control can be achieved using a limited number of electrodes from low-cost, commercial EMG hardware. Specifically, we used the BITalino MuscleBIT™ EMG acquisition system to control a 3D-printed prosthetic hand adapted from the open-source e-NABLE Phoenix v3 design. We trained a lightweight multilayer perceptron to classify five distinct hand gestures from EMG signals recorded from three participants across three sessions each.
Results: The system classified the five gestures with an average within-subject, cross-session accuracy of 94.0% and a cross-subject accuracy of 83.3%. We demonstrate that a sparse electrode configuration (just four electrodes instead of a high-density array), combined with open-source machine learning algorithms, can achieve practical gesture-recognition performance.
Conclusions: This approach substantially reduces system cost and complexity while maintaining functional accuracy, enabling the development of low-cost EMG-controlled prosthetics for users unable to afford high-end commercial solutions. With a total system cost of under $300, this approach represents up to a 200x cost reduction compared to commercial myoelectric prostheses while maintaining comparable classification performance."
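For illustration, the sketch below shows one way the pipeline described in the abstract could look: root-mean-square features computed over sliding windows of the four EMG channels, fed to a small multilayer perceptron with five output classes. The window length, overlap, hidden-layer size, and the use of scikit-learn are assumptions for illustration only; the actual implementation is in the repo (risray700/emg2gesture).

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Illustrative parameters (not from the paper): 200-sample windows
# with 50% overlap, 4 EMG channels.
WIN, STEP = 200, 100

def rms_features(emg: np.ndarray) -> np.ndarray:
    """emg: (n_samples, 4) raw sEMG -> (n_windows, 4) RMS features."""
    windows = [emg[s:s + WIN] for s in range(0, len(emg) - WIN + 1, STEP)]
    return np.sqrt(np.mean(np.square(windows), axis=1))

# Lightweight MLP: a single small hidden layer, five gesture classes.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)

# X: stacked RMS feature windows, y: gesture labels in {0, ..., 4}
# clf.fit(X, y)
# predictions = clf.predict(rms_features(new_emg))
```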
These are the slides I am presenting at the International Conference on Biomedical Engineering and Biotechnology (ICBEB) in Matsue, Japan, in November 2025.
Here is a high-level overview of our project and a link to our GitHub repository, which contains sample patient data, the code for signal analysis, protocol alignment, and the decoding algorithm, and the 3D-printed arm files: risray700/emg2gesture.
I designed a prosthetic arm by extending the e-NABLE Phoenix v3 design. The following flowchart shows the overall design of the system. This work was first presented at the UC Davis Biotech Challenge in April 2025 (Biotech Poster).
The printed arm houses five servo motors as well as a microcontroller circuit that receives gesture-control commands from a machine-learning model previously trained on sEMG data.
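As a sketch of that command path, the host could forward each predicted gesture ID to the microcontroller over a serial link, with the firmware mapping each ID to preset positions for the five servos. The single-byte protocol, port name, and baud rate below are placeholder assumptions, not the repo's actual interface.

```python
import serial  # pyserial

# Hypothetical one-byte protocol: each message is a gesture ID (0-4).
# The firmware is assumed to map each ID to preset servo positions.
PORT, BAUD = "/dev/ttyUSB0", 115200  # placeholders

def send_gesture(ser: serial.Serial, gesture_id: int) -> None:
    """Transmit one classified gesture ID to the servo controller."""
    if not 0 <= gesture_id <= 4:
        raise ValueError("five gestures: IDs 0-4")
    ser.write(bytes([gesture_id]))

# Usage:
# with serial.Serial(PORT, BAUD, timeout=1) as ser:
#     send_gesture(ser, predicted_gesture)
```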
We collected sEMG signals using the BITalino MuscleBIT sensors. We began by recording sEMG and hand position simultaneously; the joint angles for individual digits were extracted and tracked in real time with MediaPipe: drive.google.link.
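Below is a minimal sketch of that joint-angle step using MediaPipe Hands. The choice of the index-finger PIP joint and the vector-angle computation are illustrative assumptions; the actual tracking code is in the repo.

```python
import cv2
import mediapipe as mp
import numpy as np

def joint_angle(a, b, c) -> float:
    """Angle (degrees) at landmark b formed by landmarks a-b-c."""
    v1 = np.array([a.x - b.x, a.y - b.y, a.z - b.z])
    v2 = np.array([c.x - b.x, c.y - b.y, c.z - b.z])
    cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

cap = cv2.VideoCapture(0)  # webcam recorded alongside the sEMG stream
with mp.solutions.hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            # Index-finger PIP flexion: landmarks 5 (MCP), 6 (PIP), 7 (DIP).
            print(joint_angle(lm[5], lm[6], lm[7]))
cap.release()
```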