A new generation of prosthetic devices allows patients to control them with their thoughts.
Dennis Aabo Sørensen tests a prosthetic arm with sensory feedback in a laboratory in Rome in March 2013. PHOTOGRAPH BY PATRIZIA TOCCI, LIFEHAND 2
Something is missing. Every amputee knows it, and it is more than the arm or leg they have lost. They can get replacements for those limbs: substitutes made from metal and plastic, controlled by advanced computer chips, with the ability to grip, to turn, to step. On the outside the limbs can appear lifelike, and on the inside they are amazing machines.
But they are tools, not part of the patients themselves. They have no sensitivity, and no instant response to a patient’s intentions.
Because of that lack of feeling and control, says Dennis Aabo Sørensen, a 36-year-old from Denmark who lost his left hand in a fireworks explosion nearly a decade ago, he could tell what he was touching with his prosthetic hand only by looking at it.
Now, for Sørensen and other amputees, all that is changing. Earlier this month, scientists announced they had wired pressure sensors in the fingers of an artificial hand to sensory nerves in Sørensen’s upper arm. He grabbed a block, and his nerves tingled. “I could feel round things and soft things and hard things,” he says. “It’s so amazing to feel something that you haven’t been able to feel for so many years.” (See “Boston Bombing Amputees Face Lengthy Recovery.”)
This is more than a psychological boost; experiments show that sensory feedback vastly improves a patient’s ability to control a prosthetic, even to the point of picking stems off of fruit.
Zac Vawter, shown here at the Willis Tower in Chicago in October 2012, was the first person to climb 103 flights of stairs wearing a prosthesis controlled with his mind. PHOTOGRAPH BY BRIAN KERSEY, AP
Sørensen’s case is just one of several efforts under way to endow artificial limbs with real feeling. Researchers at Case Western Reserve University in Cleveland, Ohio, have also restored sensation through an artificial hand, transmitting differences not just of pressure but of texture, too.
At the Rehabilitation Institute of Chicago, scientists last fall breached one of the biggest barriers in prosthetics, creating a thought-controlled leg that can climb stairs and go from sitting to standing in response to signals from nerves in a patient’s stump. Researchers have also developed something called pattern recognition, in which the prosthetics can “learn” to interpret nerve signals from patients in real time, responding directly to intentions.
“We can get people to view the limb as if it actually belongs to them,” says Paul Marasco, a neuroscientist at the Cleveland Clinic’s Lerner Research Institute, who works on sensation and prosthetics. “We’re getting a critical mass of research, not just results here and there, so I think this is really going to happen in five to ten years.”
The work is still experimental, the scientists hasten to emphasize, and other advances need to occur before these prosthetics are ready for daily use. For one thing, the connections need to become wireless instead of wired, because nobody is going home with wires sticking through their skin. “But when this becomes perfected, it will be huge,” says Robert Lipschutz, a prosthetist at the Rehabilitation Institute of Chicago.
Zac Vawter practices walking with his experimental “bionic” leg at the Rehabilitation Institute of Chicago in October 2012. PHOTOGRAPH BY BRIAN KERSEY, AP
Sørensen already had an artificial hand that opened and closed in response to muscle contractions in his stump. To add sensory feedback, Silvestro Micera and colleagues at the Scuola Superiore Sant’Anna in Italy and the École Polytechnique Fédérale de Lausanne in Switzerland added sensors to mechanical tendons in the prosthetic’s fingers. The sensors generated electrical signals as the tendons pushed on an object. Those signals were fed to a computer that relayed them through wires that went into Sørensen’s skin. The wires led to electrodes on the sensory nerves in his stump that formerly ended in his hand.
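The chain from fingertip to nerve can be pictured as a simple mapping: the harder the tendon sensor is pressed, the stronger the stimulation delivered to the electrode. A back-of-the-envelope sketch of that idea in Python, with entirely made-up sensor and stimulation ranges (not the values used in the actual experiment):

```python
def pressure_to_stimulation(sensor_reading, max_reading=1023,
                            min_hz=10.0, max_hz=100.0):
    """Map a raw tendon-sensor reading to a nerve-stimulation
    pulse frequency. All numeric ranges here are illustrative
    placeholders, not the team's calibrated values."""
    # Normalize the raw reading to 0..1, clamping out-of-range values.
    level = min(max(sensor_reading / max_reading, 0.0), 1.0)
    # A firmer grip maps to a higher pulse frequency, within safe bounds.
    return min_hz + level * (max_hz - min_hz)
```

In the real system a computer performed this translation continuously, relaying the resulting signals through wires to electrodes on the sensory nerves in Sørensen’s stump.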
For a month, Sørensen went through a gauntlet of tasks designed to simulate the challenges of daily living—reaching, turning, squeezing, pinching—things that two-handed people do without much thought, but Sørensen hadn’t done in nine years. In response, he felt different tingling sensations, depending on the amount of pressure he needed to apply to hold the object. In this way, he gradually came to associate the tingles with different qualities, such as hardness, softness, and roundness.
At the same time that Micera and his colleagues were working with Sørensen, Dustin Tyler, of Case Western Reserve and the Cleveland VA Medical Center, and his team were developing a similar system. At a scientific meeting in November 2013, Tyler reported success in two patients. And not for a month, but for a year.
“The longevity proves this system is really stable,” he says. “And we’ve placed electrodes at eight different places on the patients’ nerves. One patient can feel sensation from eight distinct places on the hand: the thumb, some fingers, the back of the hand, and the palm. We can adjust the size of these spots by adjusting the signal. So we pretty much have restoration over the entire hand.”
Receiving sensory feedback from an object is a game-changer in the patient’s relationship to the world. Without such touch, for example, amputees have to watch their prosthetic hands to see if they are gripping a paper cup too hard—often too late to prevent a spill. Some can gain modest control by listening to the sound of the motors in the prosthetic, which changes as they encounter resistance.
But with direct sensory feedback, Tyler found, “our patients can twist stems off of cherries. That was really something to see. If we turned off the touch, they would grasp too hard and crush the fruit. Or they would grip too softly and the stems would slip through their fingers.”
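The cherry-stem test is, at bottom, a closed feedback loop: the hand squeezes, the fingertip sensor reports back, and the grip is corrected until it matches the pressure the task needs. A toy proportional-control sketch of that loop (the sensor model and gain are invented for illustration):

```python
def regulate_grip(target_pressure, read_pressure, steps=50, gain=0.5):
    """Nudge grip force until the fingertip sensor reads the
    target pressure. Without the read_pressure feedback, the
    hand would have to apply a blind, preset force."""
    force = 0.0
    for _ in range(steps):
        # Feedback: compare the sensed pressure with the pressure
        # we want, then correct the motor force proportionally.
        error = target_pressure - read_pressure(force)
        force += gain * error
    return force

# Toy sensor model: reported pressure equals the applied force.
grip = regulate_grip(0.3, read_pressure=lambda f: f)
```

Turn off the feedback (as Tyler’s group did) and there is no error signal to correct against: the grip either crushes the fruit or lets the stem slip.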
Tyler’s group is currently working on altering the electrical signals generated by the finger sensors to allow patients to experience different textures, like rough and smooth, in addition to pressure. “We think we can get a whole suite of sensation,” Tyler says. “The brain wants that. It’s looking for it.”
Eventually, he thinks the system will be fully implanted under the skin and will communicate with the artificial limb wirelessly, like a Bluetooth headset for your cell phone.
To further enhance the sense of a prosthetic as an organic part of its wearer, doctors and engineers at the Rehabilitation Institute of Chicago have developed artificial limbs that respond seamlessly to the patient’s own thoughts.
Computer chips inside the prosthetic are connected to sensors that pick up motor signals from nerves in a patient’s stump that formerly commanded a hand to open, for instance, or a wrist to twist. No wires penetrate the skin. Using a method called targeted muscle reinnervation, the nerves have first been rerouted by a surgeon into large muscles at the end of the stump. Muscles are electrically active: They act as amplifiers for the nerve signals, strengthening them enough for the prosthetic sensors to pick them up and send them on to motors that move the hand.
Until recently, these signals have been interpreted one at a time—turn wrist, then open hand—resulting in motions lacking the fluid, connected motion of a real arm. Now, more refined computer algorithms in the chips recognize patterns, linking one movement to the next. The results are smoother and require less conscious planning; a patient merely has to think, and the limb moves.
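Pattern recognition of this kind is, in essence, a classifier: record the muscle-signal patterns while the patient imagines each movement, then match live signals to the closest learned pattern. A minimal nearest-centroid sketch in Python (the feature vectors and movement labels are hypothetical, and real systems use far richer signal features):

```python
import math

def train_centroids(examples):
    """examples: {movement_label: [feature_vectors]} recorded while
    the patient attempts each movement. Returns one mean vector
    ("centroid") per movement."""
    centroids = {}
    for label, vectors in examples.items():
        n = len(vectors)
        centroids[label] = [sum(col) / n for col in zip(*vectors)]
    return centroids

def classify(centroids, features):
    """Pick the movement whose learned pattern is closest to the
    live muscle-signal features."""
    return min(centroids,
               key=lambda label: math.dist(centroids[label], features))
```

Running such a decoder continuously, instead of waiting for one discrete command at a time, is what lets one movement flow into the next.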
Kitts’ arm is inked with target points where wires will be attached; the wires will detect her nerve signals and map them to a computer so she can control the prosthetic with her brain.
Amanda Kitts, an arm amputee who lives in Florida (and who was featured in a January 2010 National Geographic cover story on bionics), simply slips on her arm in the morning. “I do a wrist flexion, a rotation, a few other things, and I’m good to go,” she says. “I used to have to think much more about what I am doing.”
This technology has now been extended to one of the thorniest problems in prosthetics: getting a leg to stand up. The position of the knee and ankle when a person is sitting, and the loads the joints bear, are radically different from the position and load when walking. Artificial leg makers have resorted to putting a manual switch onto their prosthetics that users have to hit to shift between sitting and upright positions—a long way from motion controlled by thought.
Last fall, however, Levi Hargrove, director of neural engineering at the Rehabilitation Institute of Chicago, gave amputees a thinking leg to stand on—and even climb stairs. Robotic sensors detect speed changes, orientation, and weight and feed the information to onboard computer chips, helping the leg respond to differences in terrain. To avoid the awkward mechanical switch and allow the limb to respond instead to the user’s intent, motor nerves formerly controlling ankle movements have been rerouted into thigh muscles, taking advantage of the fact that thigh muscles fire during normal walking when ankle muscles move.
“We’re able to make use of those patterns, so the thigh muscle signals predict what the ankle muscle will do,” Hargrove says. Tension at the thigh, for instance, sends a signal for the knee to straighten and the ankle to move the foot perpendicular to the leg: in other words, to stand up. Pattern recognition software smooths the motions so the patient doesn’t fall down. A different degree of tension in the thigh flexes the leg to step up a stair.
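Stripped to its skeleton, the decoding Hargrove describes is a mapping from thigh-muscle tension to a leg command. A deliberately simplified sketch (the thresholds and command names are invented placeholders; the real system classifies full signal patterns, not a single tension value):

```python
def leg_command(thigh_tension, stand_threshold=0.7, step_threshold=0.4):
    """Decode intent from thigh-muscle tension. Thresholds are
    illustrative, not the clinic's calibrated values."""
    if thigh_tension >= stand_threshold:
        # Strong tension: straighten the knee, set the foot
        # perpendicular to the leg -> stand up.
        return "stand"
    if thigh_tension >= step_threshold:
        # Moderate tension: flex the leg to climb a stair.
        return "step_up"
    return "hold"
```

Pattern-recognition software then smooths the transitions between such commands so the patient doesn’t fall.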
Prosthetics are taking a huge step forward. This time, with feeling.
Josh Fischman for National Geographic
February 22, 2014