Two trained monkeys used a brain-machine-brain interface to move an avatar hand and detect the texture of virtual objects - without using any part of their real bodies - scientists from the Duke University Center for Neuroengineering reported in the journal Nature. The authors added that this technology could eventually help quadriplegic patients walk again, use their hands, and sense the texture of objects with their fingers.
A quadriplegic patient is paralyzed in all four limbs - both arms and both legs - as may occur after a spinal cord injury. A paraplegic patient is paralyzed in the lower part of the body, including the legs.
Study leader Miguel Nicolelis, MD, PhD, said:
"Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton."
In this experiment, the monkeys used electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects. By touching the objects, the monkeys could distinguish between them according to their textures. They managed to do this without moving any part of their real bodies.
In computing, an avatar is an object representing a user, the graphical representation of the user.
The virtual objects in this study all looked the same, but they were designed to have different artificial textures that could only be identified if the animals touched their surfaces with their virtual hands - hands controlled solely by the animals' electrical brain activity.
Tiny electrical signals sent to the monkeys' brains were perceived as varying textures, the authors explained. There were three perceived object textures, or three different electrical patterns.
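The encoding described above - one distinct electrical pattern per virtual texture - can be pictured with a small sketch. This is purely illustrative: the pulse parameters below are hypothetical placeholders, not the values used in the experiment.

```python
# Illustrative sketch only: the study encoded three artificial textures as
# three distinct microstimulation patterns. The exact pulse parameters here
# are hypothetical, not those reported by the Duke team.

# Map each virtual texture to a hypothetical stimulation pattern
# (pulse frequency in Hz, pulse-train duration in ms).
TEXTURE_PATTERNS = {
    "texture_a": {"pulse_hz": 100, "train_ms": 50},
    "texture_b": {"pulse_hz": 200, "train_ms": 50},
    "texture_c": {"pulse_hz": 400, "train_ms": 50},
}

def stimulation_for(texture: str) -> dict:
    """Return the stimulation pattern that encodes a given virtual texture."""
    return TEXTURE_PATTERNS[texture]

# Each texture maps to a different pattern, so the brain can learn to
# treat each pattern as a distinct "feel".
print(stimulation_for("texture_b"))  # prints {'pulse_hz': 200, 'train_ms': 50}
```

The key property, as the article describes, is simply that the three patterns are distinguishable from one another.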
The scientists say that this technology - which involves no part of the animal's real body - could be used effectively by paralyzed patients if they learned to operate the brain-machine-brain interface. Not only would they be able to move about in a robotic exoskeleton suit, they would also have some form of the sense of touch restored.
Nicolelis, who was also senior author, said:
"This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body.
"In this BMBI, the virtual body is controlled directly by the animal's brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal's cortex.
"We hope that in the next few years this technology could help to restore a more autonomous life to many patients who are currently locked in without being able to move or experience any tactile sensation of the surrounding world.
"This is also the first time we've observed a brain controlling a virtual arm that explores objects while the brain simultaneously receives electrical feedback signals that describe the fine texture of objects 'touched' by the monkey's newly acquired virtual hand.
"Such an interaction between the brain and a virtual avatar was totally independent of the animal's real body, because the animals did not move their real arms and hands, nor did they use their real skin to touch the objects and identify their texture.
"It's almost like creating a new sensory channel through which the brain can resume processing information that can no longer reach it through the real body and peripheral nerves."
The avatar arm was steered by the electrical activity of between 50 and 200 neurons in the monkey's motor cortex. At the same time, thousands of neurons in the primary somatosensory (tactile) cortex received electrical feedback from the virtual hand's palm, allowing the monkey to tell the textures apart.
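The bidirectional loop described above - motor cortex activity decoded into avatar movement, with touch events encoded back as stimulation - can be sketched as a toy simulation. This is an assumption-laden illustration, not the Duke team's actual decoder: the linear decoder, neuron count, and pattern names are all hypothetical.

```python
import random

# Toy closed-loop sketch (not the actual study decoder): a population of
# hypothetical motor-cortex firing rates is linearly decoded into a hand
# velocity, and touching a virtual object triggers a feedback stimulation
# code sent back to sensory cortex within the same cycle.

random.seed(0)

N_NEURONS = 100  # the study recorded roughly 50-200 motor cortex neurons
weights_x = [random.uniform(-1, 1) for _ in range(N_NEURONS)]

def decode_velocity(firing_rates):
    """Linear decoder: weighted sum of firing rates -> hand velocity (x axis only)."""
    return sum(w * r for w, r in zip(weights_x, firing_rates))

def feedback_pattern(object_id):
    """Return the microstimulation code encoding the touched object's texture."""
    return {"obj1": "pattern_A", "obj2": "pattern_B", "obj3": "pattern_C"}[object_id]

# One iteration of the loop: decode movement, then (if touching) stimulate.
rates = [random.random() for _ in range(N_NEURONS)]
hand_x = decode_velocity(rates)      # motor cortex -> avatar hand movement
touched = "obj2"                     # suppose the avatar hand contacts obj2
print(feedback_pattern(touched))     # prints pattern_B (sensory feedback code)
```

The design point the article emphasizes is that both halves of the loop run simultaneously: decoding and feedback happen within the same cycle, with no involvement of the real limb or skin.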
"The remarkable success with non-human primates is what makes us believe that humans could accomplish the same task much more easily in the near future."
The animals learned the task quickly. One monkey learned to select the correct object after just four attempts, while the other needed nine. Several of the tests clearly showed that the monkeys were genuinely sensing the objects and not choosing them at random.
Nicolelis is convinced that his findings provide compelling evidence that a robotic exoskeleton enabling severely paralyzed patients to move about and explore objects is practicable.
An exoskeleton could be controlled entirely by the paralyzed person's voluntary brain activity. At the same time, sensors throughout the suit would generate tactile feedback, allowing the wearer's brain to identify and distinguish the textures, shapes, and even temperatures of various objects. The authors believe wearers would even be aware of the varying texture of the ground they walked on.
The Walk Again Project, a non-profit consortium established by a team of German, Brazilian, Swiss and American scientists, has selected this overall therapeutic approach for restoring full body mobility to quadriplegic individuals through a BMBI implemented in conjunction with a full-body robotic exoskeleton.
The scientific team would like to demonstrate an autonomous exoskeleton during the opening ceremony or opening game of the 2014 FIFA World Cup in Brazil.
Written by Christian Nordqvist