Avatar becomes reality: Hi-tech head cap lets disabled man 'move' a robot slave 62 miles away using just his brain power


Baku, May 11 (AZERTAC). Disabled people could one day be waited on hand and foot by a robot servant - or even send the robot out into society in their place. It sounds like the plot of James Cameron's sci-fi hit Avatar, but Swiss scientists showed off today how a partially paralyzed person can control a robot by thought alone. Simply by thinking about lifting his fingers, a patient was able to 'move' a robot 100 kilometers away.

The scientists hope the technology will one day allow immobile people to interact with their surroundings through so-called avatars. Similar experiments have taken place in the United States and Germany, but they involved either able-bodied subjects or invasive brain implants.

On Tuesday, a team at Switzerland's Federal Institute of Technology in Lausanne used only a simple head cap to record the brain signals of Mark-Andre Duc, who was at a hospital in the southern Swiss town of Sion, 100 kilometers (62 miles) away. Duc's thoughts - or rather, the electrical signals emitted by his brain when he imagined lifting his paralyzed fingers - were decoded almost instantly by a laptop at the hospital. The resulting instructions - left or right - were then transmitted to a foot-tall robot scooting around the Lausanne lab.

Duc lost control of his legs and fingers in a fall and is now considered partially quadriplegic. He said controlling the robot wasn't hard on a good day. "But when I'm in pain it becomes more difficult," he said through a video link on a second laptop attached to the robot.

Background noise caused by pain or even a wandering mind has emerged as a major challenge in the research of so-called brain-computer interfaces since they first began to be tested on humans more than a decade ago, said Jose Millan, who led the Swiss team. While the human brain is perfectly capable of performing several tasks at once, a paralyzed person would have to concentrate the entire time they are directing the device.
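The pipeline described above - a head cap records brain signals, a laptop decodes them into left/right instructions, and the instructions are relayed to the distant robot - can be sketched as follows. This is an illustrative toy only: the function names and the simple two-channel threshold "classifier" are assumptions, not the Lausanne team's actual decoding method, which would rely on a trained classifier over many EEG channels.

```python
def decode_command(left_channel_power: float, right_channel_power: float) -> str:
    """Map imagined-movement signal strength to a steering command.

    Hypothetical stand-in for the real EEG decoder: whichever side shows
    stronger imagined-movement activity wins.
    """
    return "left" if left_channel_power > right_channel_power else "right"


def transmit(command: str) -> str:
    """Stand-in for sending the decoded instruction to the remote robot."""
    return f"robot <- {command}"


# Example: stronger activity on the left-hand channel yields a "left" command.
print(transmit(decode_command(0.8, 0.3)))  # robot <- left
```

In the real system the decoding happened "almost instantly" on a laptop at the hospital, with only the resulting left/right instruction sent over the 100-kilometer link - keeping the latency-sensitive signal processing local to the patient.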
"Sooner or later your attention will drop and this will degrade the signal," Millan said.

To get around this problem, his team programmed the computer that decodes the signal to work in a way similar to the brain's subconscious. Once a command such as "walk forward" has been sent, the computer will execute it until it receives a command to stop or the robot encounters an obstacle.

The robot itself is an advance on a previous project that let patients control an electric wheelchair. By using a robot complete with a camera and screen, users can extend their virtual presence to places their bodies cannot take them.

Rajesh Rao, an associate professor at the University of Washington, Seattle, who has tested similar systems with able-bodied subjects, said the Lausanne team's research appeared to mark an advance in the field. "Especially if the system can be used by the paraplegic person outside the laboratory," he said in an email.

Millan said that although the device has already been tested in patients' homes, it isn't yet as easy to use as some commercially available gadgets that employ brain signals to control simple toys, such as Mattel's popular MindFlex headset. "But this will come in a matter of years," Millan said.
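The "subconscious-like" shared control Millan describes amounts to latching a command: once issued, it repeats on every control cycle without further attention from the user, until a stop command arrives or an obstacle overrides it. A minimal sketch of that idea, with class and method names that are illustrative assumptions rather than the team's actual software:

```python
class SharedController:
    """Toy model of command latching: the robot keeps executing the last
    thought-decoded command so the user's attention can safely wander."""

    def __init__(self):
        self.current = None  # the latched command, e.g. "walk forward"

    def on_brain_command(self, command: str) -> None:
        # A newly decoded command replaces the latched one; "stop" clears it.
        self.current = None if command == "stop" else command

    def step(self, obstacle_ahead: bool) -> str:
        # Safety override: an obstacle cancels the latched command,
        # mirroring the behavior described in the article.
        if obstacle_ahead:
            self.current = None
        return self.current or "halt"


ctrl = SharedController()
ctrl.on_brain_command("walk forward")
print(ctrl.step(obstacle_ahead=False))  # walk forward (repeats each cycle)
print(ctrl.step(obstacle_ahead=True))   # halt (obstacle overrides the command)
```

The design choice matters for exactly the reason Millan gives: without latching, any lapse in concentration degrades the decoded signal and the robot stalls; with it, the user only needs to think hard at decision points.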

© Content from this site must be hyperlinked when used.

