BIOMEDICINE
Giving Prosthetics a Sense of Touch
A study offers the first demonstration that a brain-machine interface can include touch feedback.
Brain-machine interfaces have made it possible for monkeys and some humans to control robotic limbs using just their thoughts. But ideally, a person using an artificial limb or other device would not only be able to control the device, but also feel what it's touching.
A new study from the lab of Miguel Nicolelis at Duke University Medical Center takes a first step toward such an interface. In a paper published today in Nature, his team reports that monkeys can learn to operate a virtual-reality hand that incorporates tactile feedback.
Nicolelis says that brain-machine interfaces will only be clinically useful if they use bidirectional signals, with both sensory feedback from the device and motor commands from the user. "It's not enough to just provide motion," he says. "You need to sense what you're touching."
As a first experiment, monkeys used a joystick to control a virtual "avatar" (a monkey arm and hand) on a computer screen, and were encouraged to use the avatar to grab objects on the screen. The virtual objects had textures, conveyed by electrical stimulation delivered through microwire arrays implanted in a part of the brain's cortex responsible for sensing touch. The monkeys learned to hold the avatar's hand over objects with a particular texture, signaled by the frequency of stimulation, in order to be rewarded with food.
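To make the encoding concrete, here is a minimal sketch in Python of how a frequency-coded texture signal and a hold-to-reward rule could work. Everything in it is hypothetical: the frequencies, hold time, and function names are invented for illustration and are not the parameters used in the study.

```python
# Hypothetical sketch of frequency-coded texture feedback in a virtual
# reaching task. Frequencies, hold time, and names are invented for
# illustration; they are not the study's parameters.

TEXTURE_FREQ_HZ = {
    "rewarded": 200,   # target texture signaled at a high pulse rate
    "distractor": 50,  # other textures signaled at a lower rate
}

HOLD_TIME_S = 1.0      # how long the hand must rest on the target
STEP_S = 0.1           # seconds per simulation step


def stimulation_frequency(texture):
    """Map the texture under the avatar's hand to a pulse rate in Hz."""
    return TEXTURE_FREQ_HZ.get(texture, 0)  # no texture -> no stimulation


def run_trial(positions, texture_at):
    """Step through avatar positions; reward a sustained hold on target.

    positions  -- sequence of (x, y) avatar positions, one per step
    texture_at -- function mapping an (x, y) position to a texture name
    """
    held = 0.0
    for pos in positions:
        texture = texture_at(pos)
        freq = stimulation_frequency(texture)  # would drive the stimulator
        if texture == "rewarded":
            held += STEP_S
            if held >= HOLD_TIME_S:
                return True  # criterion met: dispense the food reward
        else:
            held = 0.0       # the hold must be continuous
    return False
```

The key design point is that only the frequency of stimulation distinguishes one texture from another, which is the signal the monkeys evidently learned to read.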
In another experiment, the monkeys received the same tactile feedback but controlled the virtual hand using just their thoughts, via microwire arrays implanted in the motor cortex. Although their performance on the task was less accurate, the monkeys improved over time.
Nicolelis says the successful use of a "brain-machine-brain interface" demonstrates that the processes of sensing and responding to tactile sensations can be combined. "We are decoding motor intentions and tactile messages simultaneously," he says. "That's never been done before." Although the stimulation the monkeys receive is artificial, he says, they seem to learn to associate it with tactile information.
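A rough sketch of what such a bidirectional loop might look like in code follows, decoding a motor command and encoding tactile feedback within the same cycle. The linear decoder, the 96-channel count, and the frequency map are placeholders for illustration, not the algorithms used in the study.

```python
# Hypothetical bidirectional ("brain-machine-brain") control cycle:
# motor intent is decoded and tactile feedback is encoded in the same
# step. The linear decoder and the 96-channel count are placeholders.

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, 96))  # placeholder decoder weights


def decode_velocity(spike_counts):
    """Map per-channel spike counts (shape (96,)) to a 2-D velocity."""
    return W @ spike_counts


def encode_texture(texture):
    """Map the texture under the hand to a stimulation frequency (Hz)."""
    return {"rewarded": 200, "distractor": 50}.get(texture, 0)


def control_step(spike_counts, position, texture_at):
    """One cycle: decode movement, move the hand, compute feedback."""
    position = position + decode_velocity(spike_counts)
    freq = encode_texture(texture_at(position))  # drives the stimulator
    return position, freq
```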
The next step is to incorporate the sense of touch into real prosthetics, using pressure sensors to generate similar tactile feedback about real-world objects. Nicolelis says his group hopes to build a simulator to test this approach in humans, and then to incorporate touch sensation into the prosthetics it is creating for people with reduced mobility.
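For a real prosthetic, the analogous step would be mapping a continuous pressure reading onto a stimulation frequency. A minimal sketch, with invented calibration values:

```python
# Hypothetical mapping from a prosthetic fingertip pressure reading to
# a stimulation frequency, clamped to a safe range. Calibration values
# are invented for illustration.

def pressure_to_frequency(pressure_kpa, max_pressure_kpa=100.0,
                          max_freq_hz=300):
    """Scale a pressure reading linearly onto a pulse rate in Hz."""
    fraction = min(max(pressure_kpa / max_pressure_kpa, 0.0), 1.0)
    return int(fraction * max_freq_hz)
```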
Nitish Thakor, a biomedical engineer at Johns Hopkins University, says that adding sensory information "is absolutely the next logical step" in brain-machine interface design. Thakor says the experiment not only demonstrates the feasibility of adding touch but also shows that the monkeys can learn a task using these coupled signals. The caveat, he adds, is that textures in the real world are much more complex, as are body movements, and "whether this is scalable remains to be seen."