Researchers have created a prosthetic hand that offers its users the ability to feel where it is and how the fingers are positioned, a sense known as proprioception. The headline may be in jest, but the advance is real and may help amputees more effectively and naturally use their prostheses.
Prosthesis rejection is a real problem for amputees, and many choose to simply live without these devices, electronic or mechanical, as they can complicate as much as they simplify. Part of that is the simple fact that, unlike their natural limbs, artificial ones have no real sensation, or if there is any, it's nowhere near the level someone had before.
Touch and temperature detection are important, of course, but what's even more critical to ordinary use is simply knowing where your limb is and what it's doing. If you close your eyes, you can tell where each digit is, how many you're holding up, whether they're gripping a small or large object and so on. That's currently impossible with a prosthesis, even one that's been integrated with the nervous system to provide feedback, meaning users have to watch what they're doing at all times. (That is, if the arm isn't watching for you.)
This prosthesis, built by Swiss, Italian and German neurologists and engineers, is described in a recent issue of Science Robotics. It takes the existing concept of sending touch information to the brain through electrodes patched into the nerves of the arm, and adapts it to provide real-time proprioceptive feedback.
"Our study shows that sensory substitution based on intraneural stimulation can deliver both position feedback and tactile feedback simultaneously and in real time. The brain has no problem combining this information, and patients can process both types in real time with excellent results," explained Silvestro Micera, of the École Polytechnique Fédérale de Lausanne, in a news release.
It’s been the work of a decade to engineer and demonstrate this possibility, which could be of enormous benefit. Having a natural, intuitive understanding of the position of your hand, arm or leg would likely make prostheses much more useful and comfortable for their users.
Essentially, the robotic hand relays its telemetry to the brain through the nerve pathways that would normally carry touch sensations from that area. Unfortunately, it's rather difficult to actually recreate the proprioceptive pathways, so the team used what's called sensory substitution instead. This technique uses other pathways, like ordinary touch, to present different sense modalities.
A simple example would be a machine that touched your arm in a different location depending on where your hand is. In the case of this research it’s much finer, but still essentially presenting position data as touch data. It sounds weird, but our brains are actually really good at adapting to this kind of thing.
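To make the idea concrete, here is a minimal sketch of that kind of substitution: a finger's joint angle (position data) is encoded as the intensity of a touch stimulus. The function name, the linear mapping and the ranges are illustrative assumptions, not the encoding the research team actually used.

```python
# Hypothetical sketch of sensory substitution: a finger's joint angle
# (position data) is re-expressed as the strength of a touch stimulus.
# The linear mapping and ranges are illustrative assumptions only.

def encode_position_as_touch(joint_angle_deg, min_angle=0.0, max_angle=90.0,
                             min_amp=0.1, max_amp=1.0):
    """Map a joint angle to a normalized stimulation amplitude."""
    # Clamp the angle to the sensor's assumed range.
    angle = max(min_angle, min(max_angle, joint_angle_deg))
    # Linearly rescale the angle into the stimulation range.
    fraction = (angle - min_angle) / (max_angle - min_angle)
    return min_amp + fraction * (max_amp - min_amp)

# A fully open finger (0 deg) maps to a faint stimulus,
# a fully flexed finger (90 deg) to the strongest one.
print(encode_position_as_touch(0))    # 0.1
print(encode_position_as_touch(90))   # 1.0
print(encode_position_as_touch(45))   # 0.55
```

The brain's job, after training, is to learn that a stronger stimulus at a given site means "that finger is more flexed," much as it learns any new sensory mapping.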
As evidence: after some training, two amputees using the system were able to tell the difference between four differently shaped objects being grasped, with their eyes closed, with 75 percent accuracy. Chance would be 25 percent, of course, meaning the sensation of holding objects of different sizes came through loud and clear; clear enough for a prototype, anyway. Amazingly, the team was able to add actual touch feedback to the existing pathways, and the users were not overly confused by it. So there's precedent now for multi-modal sensory feedback from an artificial limb.
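A quick back-of-the-envelope check shows why 75 percent on a four-way task is convincing rather than lucky. The trial count below (20) is an assumption for illustration; the study's actual number of trials may differ.

```python
# How likely is it to score 75% (15 of 20) on a 4-choice task by
# pure guessing? The 20-trial count is an illustrative assumption.
from math import comb

def binomial_tail(successes, trials, p):
    """P(X >= successes) for X ~ Binomial(trials, p)."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

trials = 20
successes = 15          # 75% of 20 trials
p_chance = 0.25         # four equally likely object shapes
p_value = binomial_tail(successes, trials, p_chance)
print(f"P(>= {successes}/{trials} correct by chance) = {p_value:.2e}")
```

The probability of hitting that score by guessing is on the order of one in a hundred thousand, so the position signal is clearly getting through.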
The study has well-defined limitations, such as the number and type of fingers it was able to relay information from, and the granularity and type of that data. And the “installation” process is still very invasive. But it’s pioneering work nevertheless: this type of research is very iterative and global, progressing by small steps until, all of a sudden, prosthetics as a science has made huge strides. And the people who use prosthetic limbs will be making strides, as well.
Source: TechCrunch Disrupt