Researchers develop tech for improved prosthetic movement

Researchers at North Carolina State University and the University of North Carolina at Chapel Hill have developed a technology capable of decoding neuromuscular signals to improve control of powered prosthetic wrists and hands.

Currently, prosthetics rely on machine learning for "pattern recognition" control. This allows users to "teach" the device how to recognize patterns in muscle movement and translate them into commands. But this process can be tedious and time-consuming for users.

"We wanted to focus on what we already know about the human body," said Helen Huang, a professor in the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill. "That's because every time you change your posture, your neuromuscular signals for generating the same hand/wrist motion change. So relying solely on machine learning means teaching the device to do the same thing multiple times: once for each different posture, once for when you are sweaty versus when you are not, and so on. Our approach bypasses most of that."

This new technology decodes the neuromuscular signals with computer models that mimic the natural structures of the forearm, wrist and hand. To develop the technology, researchers placed electromyography sensors on the forearms of six individuals to track the neuromuscular signals being sent through the body. These data were then used to create a generic model that controls the translation of signals into commands for the prosthetic.
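To make the idea concrete, here is a minimal sketch of fitting a generic mapping from EMG features to joint commands. This is an illustration only, not the researchers' actual model: the variable names, data shapes, and the choice of a simple linear least-squares fit are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 8 EMG channels, 2 output joints
# (e.g., wrist flexion/extension and hand open/close).
n_samples, n_channels, n_joints = 200, 8, 2

# Stand-in for rectified, smoothed EMG features recorded from the forearm.
emg_features = rng.random((n_samples, n_channels))

# Synthetic "ground truth" joint angles, generated from a hidden linear map
# so the example is self-contained and checkable.
true_weights = rng.random((n_channels, n_joints))
joint_angles = emg_features @ true_weights

# Fit ONE generic model across all samples, rather than retraining
# per posture as pattern-recognition approaches require.
weights, *_ = np.linalg.lstsq(emg_features, joint_angles, rcond=None)

predicted = emg_features @ weights
print(np.allclose(predicted, joint_angles))  # recovers the synthetic mapping
```

The real system replaces the linear map with a musculoskeletal model, but the flow is the same: recorded EMG in, joint commands out, with a single generic model shared across conditions.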

"When someone loses a hand, their brain is networked as if the hand is still there," Huang said. "So, if someone wants to pick up a glass of water, the brain still sends those signals to the forearm. We use sensors to pick up those signals and then convey that data to a computer, where it is fed into a virtual musculoskeletal model. The model takes the place of the muscles, joints and bones, calculating the movements that would take place if the hand and wrist were still whole. It then conveys that data to the prosthetic wrist and hand, which perform the relevant movements in a coordinated way and in real time, more closely resembling fluid, natural motion."
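The real-time loop Huang describes, EMG samples streaming into a virtual model that continuously outputs joint commands, can be sketched as follows. The `MuscleModel` class here is a hypothetical toy stand-in: the actual system simulates muscle, joint and bone dynamics, which this simple smoothing filter only gestures at.

```python
class MuscleModel:
    """Toy stand-in for a virtual musculoskeletal model: converts an EMG
    activation level (0..1) into a smoothed joint angle in degrees."""

    def __init__(self, gain: float = 90.0, smoothing: float = 0.2):
        self.gain = gain            # max joint angle at full activation
        self.smoothing = smoothing  # crude proxy for muscle-tendon dynamics
        self.angle = 0.0            # current joint angle, degrees

    def step(self, activation: float) -> float:
        # Move the joint a fraction of the way toward the target angle
        # each timestep, producing fluid rather than jumpy motion.
        target = self.gain * activation
        self.angle += self.smoothing * (target - self.angle)
        return self.angle


model = MuscleModel()
emg_stream = [0.0, 0.2, 0.5, 0.9, 0.9, 0.4]  # simulated activation samples

# Each incoming sample yields a command sent to the prosthetic in real time.
commands = [model.step(a) for a in emg_stream]
print(all(0.0 <= c <= 90.0 for c in commands))
```

The smoothing step is what makes the output resemble coordinated, continuous motion instead of discrete jumps between recognized patterns.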