Researchers at the Georgia Institute of Technology developed an ultrasonic sensor using GPUs and deep learning that allows amputees to control individual fingers on a prosthetic hand.
“Our prosthetic arm is powered by ultrasound signals,” said Gil Weinberg, the Georgia Tech College of Design professor who leads the project. “By using this new technology, the arm can detect which fingers an amputee wants to move, even if they don’t have fingers.”
Five years ago, aspiring musician Jason Barnes suffered a severe electrical shock in a work accident, forcing doctors to amputate his right arm just below the elbow. Barnes was the first amputee to test the prosthetic arm.
“It’s completely mind-blowing,” said Barnes. “This new arm allows me to do whatever grip I want, on the fly, without changing modes or pressing a button. I never thought we’d be able to do this.”
Similar devices available on the market rely on electromyogram (EMG) sensors attached to muscles.
“EMG sensors aren’t very accurate,” said Weinberg, director of Georgia Tech’s Center for Music Technology. “They can detect a muscle movement, but the signal is too noisy to infer which finger the person wants to move. We tried to improve the pattern detection from EMG for Jason but couldn’t get finger-by-finger control.”
This is when the team decided to experiment with an ultrasound machine and deep learning.
The researchers attached an ultrasound probe to the arm and, using a TITAN X GPU with the cuDNN-accelerated TensorFlow deep learning framework, trained an algorithm that analyzes the resulting muscle-movement signals to predict which finger the patient is trying to use.
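The article does not publish the team's actual network, which is a deep model trained with TensorFlow on ultrasound data. As a self-contained illustration of the core idea, the toy sketch below swaps in a simple nearest-centroid classifier on synthetic feature vectors: each intended finger is assumed to produce a distinct muscle-activity pattern, and a new observation is labeled with the closest learned pattern. All names, dimensions, and noise levels here are hypothetical.

```python
# Toy stand-in for the classification step. The real system uses a deep
# network (TensorFlow + cuDNN on a TITAN X); this nearest-centroid
# classifier on synthetic vectors only illustrates mapping muscle-activity
# patterns to intended fingers.
import random

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
DIM = 16  # hypothetical length of a per-frame ultrasound feature vector

random.seed(0)

# Hypothetical "muscle activity" prototype per finger (assumption: each
# intended finger produces a distinct activation pattern).
prototypes = {f: [random.gauss(0, 1) for _ in range(DIM)] for f in FINGERS}

def sample(finger, noise=0.2):
    """A noisy observation of the pattern for one intended finger."""
    return [x + random.gauss(0, noise) for x in prototypes[finger]]

def train(samples):
    """Average the training samples for each finger into a centroid."""
    return {
        finger: [sum(col) / len(vecs) for col in zip(*vecs)]
        for finger, vecs in samples.items()
    }

def predict(centroids, vec):
    """Label an observation with the finger whose centroid is closest."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda f: dist2(centroids[f], vec))

training = {f: [sample(f) for _ in range(20)] for f in FINGERS}
centroids = train(training)

correct = sum(predict(centroids, sample(f)) == f
              for f in FINGERS for _ in range(20))
print(f"accuracy: {correct}/100")
```

Because the synthetic clusters are well separated, this toy classifier is near-perfect on its own data; the hard part in the real system is that live ultrasound images are far noisier and higher-dimensional, which is why a GPU-trained deep network is used instead.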
“If this type of arm can work on music, something as subtle and expressive as playing the piano, this technology can also be used for many other types of fine motor activities such as bathing, grooming and feeding,” said Weinberg. “I also envision able-bodied persons being able to remotely control robotic arms and hands by simply moving their fingers.”