Researchers use AI to improve “feel” for prosthetic hands

According to the researchers, the new technology offers key advantages over traditional sensors, including high conductivity, flexibility and stretchability, and it could give artificial hands a higher level of intelligence.
Jeff Rowe

With more than 3,000 touch receptors in each fingertip, humans rely heavily on “touch and feel,” to put it mildly. But the lack of that tactile sensitivity poses a real challenge for individuals with upper limb amputations, who have been deprived of that sense of touch.

Now, researchers from Florida Atlantic University's College of Engineering and Computer Science are using machine learning to create a more natural-feeling prosthetic hand interface by incorporating stretchable tactile sensors that use liquid metal on the fingertips of a prosthetic hand.

In a recent study, the researchers used individual fingertips on the prosthesis to distinguish between different speeds of a sliding motion along four textured surfaces that varied in one parameter: the distance between the ridges. To detect the textures and speeds, they trained four machine learning algorithms. They then generated ten complex surfaces from randomly generated permutations of the four textures and collected 20 trials per surface to test whether the ML algorithms could tell the ten surfaces apart.
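
To make that setup concrete, here is a minimal, hypothetical sketch of this kind of texture classification in Python with scikit-learn. The feature extraction, the choice of classifier (a support vector machine) and the synthetic data are all illustrative assumptions; the study's four algorithms and its actual sensor recordings are not reproduced here.

```python
# Minimal sketch (not the study's actual pipeline): classifying texture from
# fingertip pressure traces recorded while sliding across a ridged surface.
# Classifier choice, features and data shapes are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical data: 200 sliding trials, each a 500-sample pressure trace
# from one liquid metal fingertip sensor, labeled with one of 4 textures.
n_trials, n_samples = 200, 500
signals = rng.normal(size=(n_trials, n_samples))
texture_labels = rng.integers(0, 4, size=n_trials)

def extract_features(trace):
    """Simple time- and frequency-domain features from one pressure trace."""
    spectrum = np.abs(np.fft.rfft(trace))
    return np.array([
        trace.mean(), trace.std(),       # overall pressure level and variation
        np.abs(np.diff(trace)).mean(),   # sample-to-sample roughness
        spectrum[1:20].argmax(),         # dominant low-frequency component
    ])

X = np.array([extract_features(s) for s in signals])
X_train, X_test, y_train, y_test = train_test_split(
    X, texture_labels, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

# With random placeholder data the score is near chance; real sensor traces
# would be needed to reproduce the accuracies reported in the study.
print("texture accuracy:", clf.score(X_test, y_test))
```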

“Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors,” Erik Engeberg, PhD, senior author and an associate professor in the Department of Ocean and Mechanical Engineering, said in a press release. “The tactile information from all the individual fingertips in our study provided the foundation for a higher hand-level of perception enabling the distinction between ten complex, multi-textured surfaces that would not have been possible using purely local information from an individual fingertip. We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand.”

The results demonstrated that the tactile information from the liquid metal sensors was able to differentiate between the multi-textured surfaces, revealing a new form of hierarchical intelligence. The ML algorithms were also able to distinguish between all of the speeds with high accuracy.
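
The hand-level fusion the researchers describe can be illustrated with another short, hypothetical sketch: per-fingertip feature vectors are concatenated into a single hand-level vector before classification, so the classifier sees the whole hand rather than one fingertip at a time. The fusion scheme, the random forest classifier and the placeholder data below are assumptions for illustration only.

```python
# Minimal sketch of hand-level fusion: combine features from all fingertips
# into one vector per trial before classifying the 10 complex surfaces.
# Fusion scheme, classifier and data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_fingertips, n_trials, n_features = 5, 200, 4
surface_labels = rng.integers(0, 10, size=n_trials)  # 10 complex surfaces

# Hypothetical per-fingertip feature matrices, shape (n_trials, n_features).
fingertip_features = [rng.normal(size=(n_trials, n_features))
                      for _ in range(n_fingertips)]

# Hand-level representation: concatenate fingertip features for each trial.
hand_features = np.hstack(fingertip_features)  # shape (n_trials, 5 * 4)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, hand_features, surface_labels, cv=5)

# Random placeholder data yields chance-level accuracy; the point is only to
# show the structure of pooling all fingertips into one classification.
print("hand-level accuracy:", scores.mean())
```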

"The loss of an upper limb can be a daunting challenge for an individual who is trying to seamlessly engage in regular activities," said Stella Batalama, PhD, dean, College of Engineering and Computer Science. "Although advances in prosthetic limbs have been beneficial and allow amputees to better perform their daily duties, they do not provide them with sensory information such as touch. They also don't enable them to control the prosthetic limb naturally with their minds. With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can 'feel' and respond to its environment.”

Photo by posteriori/Getty Images