Being able to manipulate virtual fingers, or even fingers attached to a functioning prosthetic device, is not the same as feeling like the device is part of your own body. Researchers at Arizona State University (ASU) have set out to change that. Bradley Greger, associate professor of biomedical engineering at ASU’s Ira A. Fulton Schools of Engineering, explains that having “super amazing” robotic limbs isn’t enough. “The hard part is the interface, getting the prosthetics to talk to the nerves,” he says. “It’s not just telling the fingers to move; the brain has to know the fingers have moved as directed.”

Fig. 1 – Markers are applied to the patient’s functioning hand in order to measure hand posture. These measurements are used to position the virtual hand. (Credit: Kevin O’Neill)

Greger’s team, which included researchers from ASU and the University of Utah, worked to establish bidirectional communication between a user and a new prosthetic limb capable of more than 20 different movements. In the nervous system, sensation, decision, and action form a “closed loop.” This loop is carried out by a variety of sensory and motor neurons, along with interneurons that enable communication with the central nervous system.
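
The article does not include the team’s software, but a minimal Python sketch of such a closed loop might look like the following. Every function here is an illustrative stand-in, not the group’s actual interface code:

    import random

    # Stand-in for sampling motor-nerve activity from a 96-channel array.
    def read_nerve_activity():
        return [random.gauss(0.0, 1.0) for _ in range(96)]

    # Stand-in for a decoder mapping neural activity to a movement command.
    def decode_intent(activity):
        return "flex_index" if sum(activity) > 0 else "rest"

    # Stand-in for driving the virtual or robotic hand; returns True if
    # the movement produces contact with an object.
    def actuate(command):
        print("hand command:", command)
        return command != "rest"

    # Stand-in for the sensory half of the loop: stimulate afferent
    # fibers whenever the hand makes contact.
    def stimulate_sensory_fibers(contact):
        if contact:
            print("stimulating sensory fibers: touch percept")

    for _ in range(3):  # a few passes around the sense-decide-act loop
        contact = actuate(decode_intent(read_nerve_activity()))
        stimulate_sensory_fibers(contact)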

“Imagine the kind of neural computation it takes to perform what most would consider the simple act of typing on your computer,” says Greger. “We’re moving the dial toward that level of control.”

The team’s published study involved implanting an array of 96 electrodes for 30 days into the median and ulnar nerves in the arms of two amputees. The electrodes were stimulated both individually and in groups, at varying amplitudes and frequencies, to determine how the participants perceived the stimulation.
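
A parameter sweep of that kind could be prototyped roughly as below. The amplitude and frequency values and the perceived() stand-in are assumptions for illustration, not the study’s actual protocol:

    # Hypothetical settings; the published study's actual ranges differ.
    amplitudes_uA = [10, 20, 40, 80]     # pulse amplitude, microamps
    frequencies_hz = [25, 50, 100, 200]  # pulse-train frequency, Hz

    # Placeholder for the participant's report of whether a sensation
    # was felt; in the study this came from the amputees themselves.
    def perceived(amplitude_uA, frequency_hz):
        return amplitude_uA * frequency_hz >= 2000

    for amp in amplitudes_uA:
        for freq in frequencies_hz:
            felt = perceived(amp, freq)
            print(f"{amp:>3} uA @ {freq:>3} Hz -> felt: {felt}")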

Neural activity was recorded during intended movements of the subjects’ phantom fingers, and 13 specific movements were decoded as the subjects controlled the individual fingers of a virtual robotic hand.
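
A decoder in this spirit, classifying one of 13 intended movements from per-electrode firing rates, could be prototyped as follows. The synthetic data and the choice of linear discriminant analysis are assumptions for the sketch, not the study’s actual method:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n_trials, n_electrodes, n_classes = 650, 96, 13

    # Synthetic firing-rate features: each of the 13 movement classes
    # is given its own mean activity pattern across the 96 electrodes.
    class_means = rng.normal(0.0, 1.0, (n_classes, n_electrodes))
    labels = rng.integers(0, n_classes, n_trials)
    rates = class_means[labels] + rng.normal(0.0, 0.8, (n_trials, n_electrodes))

    # Train on the first 500 trials, test on the remaining 150.
    decoder = LinearDiscriminantAnalysis().fit(rates[:500], labels[:500])
    accuracy = decoder.score(rates[500:], labels[500:])
    print(f"held-out accuracy over 13 movements: {accuracy:.2f}")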

The motor and sensory information provided by the implanted microelectrode arrays indicates that patients outfitted with a highly dexterous prosthetic limb, controlled through a similar bidirectional peripheral nerve interface, might begin to think of the prosthesis as an extension of themselves rather than as a piece of hardware, explains Greger.

Fig. 2 – Denise Oswalt, a bioengineering doctoral student in the Neural Engineering Lab, demonstrates how patients will use the Oculus Rift headset to learn controlled movement of phantom fingers. (Credit: Kevin O’Neill)

“We’re now at the stage in this process where we ask patients to mirror movements between hands,” explains Greger. “We can’t record what the amputated hand is doing, but we can record what a healthy hand is doing.” (See Figure 1) Asking the patient to wave both hands simultaneously, for example, or to point at an object with both hands, will be integral to the feedback loop’s latest addition: an Oculus Rift virtual reality headset. The advantage of the virtual reality headset is that the patient can interact directly with his or her virtual limb rather than watching it on a screen. (See Figure 2)
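
A simplified sketch of that mirroring step might look like this; the joint names, units, and reflect-across-the-midline convention are hypothetical stand-ins for the team’s marker pipeline (Fig. 1):

    # Hypothetical marker-derived posture of the healthy hand.
    healthy_hand = {
        "index_flexion_deg": 35.0,
        "thumb_abduction_deg": 20.0,
        "wrist_position_m": (0.30, 0.10, 0.95),  # x, y, z in headset frame
    }

    # Reflect the wrist across the body midline (x = 0 by assumption);
    # joint angles carry over unchanged to the mirrored virtual hand.
    def mirror_posture(posture):
        mirrored = dict(posture)
        x, y, z = posture["wrist_position_m"]
        mirrored["wrist_position_m"] = (-x, y, z)
        return mirrored

    virtual_hand = mirror_posture(healthy_hand)
    print(virtual_hand)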

Kevin O’Neill, who was a bioengineering undergraduate on the research team at Utah and is now a doctoral student at ASU, is developing the technology that not only allows the patient to see what his or her virtual limb is doing, but also “decodes” the neural messages that enable the motion to happen.

“At first, when patients are learning to manipulate their virtual hands, they will be asked to strictly mirror movements of a healthy hand,” explains O’Neill. “Once we have learned what information the signals contain, we can build a neural decoding system and have patients drive the virtual representation of a missing limb independently of a healthy hand.”
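
That two-phase workflow could be sketched as below; the linear model and the synthetic neural and kinematic signals are illustrative assumptions, not the team’s decoding system:

    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, n_electrodes, n_joints = 2000, 96, 5

    # Synthetic stand-ins: neural features paired with the healthy-hand
    # joint angles recorded while the patient mirrors movements.
    true_map = rng.normal(0.0, 0.1, (n_electrodes, n_joints))
    features = rng.normal(0.0, 1.0, (n_samples, n_electrodes))
    joint_angles = features @ true_map + rng.normal(0.0, 0.05, (n_samples, n_joints))

    # Phase 1: fit a linear decoder by least squares on the mirrored data.
    weights, *_ = np.linalg.lstsq(features, joint_angles, rcond=None)

    # Phase 2: drive the virtual hand from new neural activity alone,
    # with no healthy hand to mirror.
    new_activity = rng.normal(0.0, 1.0, (1, n_electrodes))
    print("decoded joint angles:", (new_activity @ weights).round(3))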

In the video, an amputee controls a virtual prosthetic hand by thinking about moving the amputated hand, while the nerve signals are recorded by microelectrodes. A computer algorithm decodes the signals and controls the virtual prosthetic hand. Sensations of touch on the amputated hand were also generated by electrically stimulating sensory nerve fibers through an implanted microelectrode array.
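
One way to picture the sensory half of that loop is a mapping from virtual-hand contact to a stimulation command. The electrode assignments and parameter scaling below are invented for illustration, not the study’s settings:

    # Hypothetical finger-to-electrode map; not the study's assignments.
    ELECTRODE_FOR_FINGER = {"thumb": 4, "index": 17, "middle": 32}

    # Convert a contact event on the virtual hand into a stimulation
    # command, clamping amplitude to an assumed safe ceiling.
    def touch_to_stimulation(finger, contact_force_n):
        if contact_force_n <= 0:
            return None  # no contact, no stimulation
        amplitude_uA = min(80.0, 20.0 + 60.0 * contact_force_n)
        return {"electrode": ELECTRODE_FOR_FINGER[finger],
                "amplitude_uA": amplitude_uA,
                "frequency_hz": 100}

    print(touch_to_stimulation("index", 0.5))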

The next steps are getting the technology into human trials and then creating effective limbs that are available to patients at an affordable price. “There are prosthetic limbs that are amazing, but the costs can be in the hundred-thousand-dollar range,” says Greger. “We’re working toward limbs that are accessible both financially and in terms of usability. We want to create limbs that patients will use as true extensions of themselves.”

For more information, visit http://engineering.asu.edu.