ASU researchers help refine next-generation prosthetic hands
Inside a lab complete with motion-capture cameras, pincer-looking robotic arms, and a motorized hand, Arizona State University researcher Marco Santello explains that the goal of next-generation prosthetic limbs isn’t just finer movement. It’s also about creating artificial sensation.
“One of the challenges is to try to design or combine artificial feedback that you can connect to this prosthetic hand and then deliver this information in a meaningful way to the user,” Santello says.
Behind him, Assistant Research Professor Qiushi Fu is picking up a water bottle with a robotic prosthesis called SoftHand Pro that can read the electrical signals in his forearm. When Fu opens the fingers of his real hand, the rubber-coated prosthetic fingers open a second later with the whir of a motor. When his hand closes, the prosthetic closes.
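In broad strokes, devices like this rectify and smooth the raw muscle signal, then map its strength to a motor command. The sketch below is a hypothetical illustration of that idea, not the SoftHand Pro's actual control software; the thresholds and function names are assumptions.

```python
def emg_envelope(samples, window=5):
    """Rectify raw EMG samples, then smooth them with a moving average."""
    rectified = [abs(s) for s in samples]
    smoothed = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def hand_command(envelope_value, open_threshold=0.2, close_threshold=0.6):
    """Map smoothed muscle activity to a simple open/close/hold command."""
    if envelope_value > close_threshold:
        return "close"
    if envelope_value < open_threshold:
        return "open"
    return "hold"
```

A strong muscle contraction (a high envelope value) drives the fingers closed; relaxing below the lower threshold lets them open again, which matches the one-second lag between hand and prosthesis described above.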
Amputees with part of their forearm intact can also use the device.
“One of the complaints of prosthetic users is that they need to rely on visual feedback because there’s no sensation,” Santello explains.
Santello’s lab, the Neural Control of Movement Laboratory in the School of Biological and Health Systems Engineering, is experimenting with several types of artificial sensation that don’t require surgery. One concept is an arm band that can squeeze a patient – in this case Fu – as he picks up objects. Researchers can program the amount of squeeze to signal how stiff an object is.
“This feedback gives the individual a sensation of force that they apply on the object. We're now testing whether this feedback allows the individual to have more dexterous control and manipulation,” Santello says.
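The mapping Santello describes can be pictured as a simple proportional relationship: more grip force at the prosthesis means a harder squeeze on the arm. The following is a minimal sketch under that assumption; the force range and names are illustrative, not the lab's actual calibration.

```python
def squeeze_level(grip_force_n, max_force_n=20.0):
    """Map measured grip force (newtons) linearly onto a 0..1 squeeze command
    for the feedback armband, clamping out-of-range readings."""
    clamped = max(0.0, min(grip_force_n, max_force_n))
    return clamped / max_force_n
```

Stiffer objects resist the fingers with more force, so under this scheme the user feels a proportionally stronger squeeze, which is how the armband can signal stiffness without any surgery.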
As Santello and his team demonstrate another form of artificial feedback in the lab, neuroscientists, engineers and clinicians from 10 universities are swapping feedback of their own at the 5th annual Rehabilitation Robotics Workshop a short walk away.
This year’s two-day event drew a record 347 attendees and speakers from Imperial College London, the Massachusetts Institute of Technology, the University of Kansas and others.
Next to a sea of research posters, I met another ASU neural engineer, Bradley Greger. Unlike Santello’s non-invasive approach, Greger is working on prosthetics operated by micro-electrodes surgically implanted in an amputee’s forearm.
“They literally just have to think, ‘move my finger.’ Just like you would. And we'll translate that into control of a prosthetic limb,” Greger tells me.
The same micro-electrodes can stimulate the patient’s nerves when he touches an object “to have some sort of restored sense of touch.”
“The idea will be to have sensors in the tips of the [prosthetic] hand that when they touch something, it will give a little stimulation – a little electrical impulse to the nerve,” Greger says. “Then they will just feel it as a little touch or sensation on the tip of the finger.”
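The loop Greger describes can be sketched as: read a fingertip pressure sensor, and if it exceeds a contact threshold, deliver a small stimulation pulse scaled to the pressure. This is a hypothetical illustration only; the thresholds, amplitudes, and names are assumptions, not the team's actual parameters.

```python
def stimulation_amplitude(fingertip_pressure, contact_threshold=0.05,
                          max_pressure=1.0, max_amplitude_ma=0.5):
    """Return a nerve-stimulation amplitude (milliamps) proportional to
    fingertip contact pressure, or 0.0 when nothing is being touched."""
    if fingertip_pressure < contact_threshold:
        return 0.0
    scaled = min(fingertip_pressure, max_pressure) / max_pressure
    return round(max_amplitude_ma * scaled, 3)
```

Light contact produces a faint impulse and firm contact a stronger one, so the wearer would feel graded touch at the fingertip rather than a simple on/off buzz.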
However, at the moment, Greger’s patients don’t actually move prosthetic limbs with their thoughts, or simulate feelings of touch by picking up real objects. They move virtual prosthetics, projected on a screen.
By studying the neural data patients generate through these implanted electrodes as they control the virtual limbs, Greger says, programming a physical device will be straightforward.
“Once we kind of understand it, it will be very easy to go into a real prosthetic limb,” he says.
His team let me try on virtual reality goggles that are designed to replace an amputee’s missing limb with a virtual one, and to help patients reactivate neural activity before the electrodes are implanted.
Greger expects the technology will go from virtual to reality “in the next several years.”
“We know it can be done. We just need to work out the engineering details,” he says.
Copyright 2017 KPHO/KTVK (KPHO Broadcasting Corporation). All rights reserved.