Thursday, 3 January 2013

Thoughts Control Avatar-Like Robot


We've reported on robots that recognize human gestures and a robotic exohand that gives a human hand superhuman strength. Researchers have taken these abilities a few steps further using functional magnetic resonance imaging (fMRI) to enable thought control of a robot thousands of miles away.

In a scenario reminiscent of the movie Avatar, the researchers used an fMRI machine to scan the brain activity of Tirosh Shapira, a university student in Israel, in real time. By imagining moving different parts of his body, Shapira controlled a small, humanoid robot located in France at the Béziers Technology Institute. The commands were sent over an Internet connection. A camera mounted on the robot's head let Shapira view its environment.

Before the experiment, Shapira was trained by researchers at Bar-Ilan University in Israel. During training, he tried to direct a virtual avatar by imagining moving one of his legs or hands. Using changes in blood flow to the brain's primary motor cortex, measured by the fMRI scanner, the team developed an algorithm to distinguish thoughts about moving different body parts.
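The article does not describe the decoding pipeline itself, but the core idea of telling imagined movements apart from motor-cortex activity can be sketched with a standard pattern classifier. In the sketch below, the file names, array shapes, labels, and the choice of a linear support-vector machine are illustrative assumptions, not the Bar-Ilan team's actual method.

```python
# Hypothetical sketch: train a classifier to tell apart motor-imagery patterns
# recorded from primary-motor-cortex voxels. The data files, labels, and the
# linear SVM are assumptions for illustration, not the published pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: one row per training trial, one column per motor-cortex voxel
#    (BOLD signal change relative to rest); y: which body part was imagined.
X = np.load("motor_cortex_trials.npy")   # shape: (n_trials, n_voxels), hypothetical file
y = np.load("trial_labels.npy")          # values: "left_hand", "right_hand", "legs"

clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=5)          # estimate decoding accuracy offline
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")

clf.fit(X, y)                                      # final model for the live session
```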

When Shapira imagined moving his legs, the robot walked forward. Thinking about moving either his left or right hand made the robot turn 30 degrees to the left or right. Because time elapses between the onset of neural activity and the point at which an intended movement can be classified, there was a short communication delay. (Watch a video here of the robot being controlled by thought to follow another person.)
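As a rough sketch of how a decoded intention could drive the robot, the mapping above (legs for walking forward, left or right hand for a 30-degree turn) might be translated into commands sent over the network. Only that mapping comes from the article; the command format, the send_command() helper, and the connection details below are assumptions made for illustration.

```python
# Minimal sketch of the intention-to-command mapping described in the article.
# The JSON command format, helper function, and host/port are hypothetical;
# only the mapping itself (legs -> walk forward, hands -> 30-degree turns)
# is taken from the text.
import json
import socket

COMMANDS = {
    "legs": {"action": "walk_forward"},
    "left_hand": {"action": "turn", "degrees": -30},
    "right_hand": {"action": "turn", "degrees": 30},
}

def send_command(sock: socket.socket, intention: str) -> None:
    """Translate a decoded intention into a robot command and send it."""
    command = COMMANDS.get(intention)
    if command is None:
        return                       # unrecognized intention: send nothing
    sock.sendall((json.dumps(command) + "\n").encode())

# Hypothetical usage with the classifier from the earlier sketch:
# with socket.create_connection(("robot.example.org", 9000)) as sock:
#     intention = clf.predict(latest_scan.reshape(1, -1))[0]
#     send_command(sock, intention)
```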

The experiment's long-term goal is to create a surrogate that works like the one in the movie, said Abderrahmane Kheddar, director of the joint robotics laboratory at the National Institute of Advanced Industrial Science and Technology in Japan, in an article in New Scientist detailing the research. The Institute is part of the international Virtual Embodiment and Robotic Re-embodiment (VERE) project.

The VERE project is focused on two goals: Avatar-like embodiment in a remote, surrogate robotic body controlled via a brain-computer interface, and virtual reality embodiment with a virtual representation. Applications include rehabilitation and training, through virtual embodiment, for people confined to a bed or wheelchair, and physical robotic embodiment for people who are immobilized.

The project's fundamental research areas include neuroscience; constructing embodiment machines that deliver virtual sensory data to participants and read signals from them; monitoring brain and physiological signals to recognize participants' intentions; translating those intentions into actions performed by a physical robot; the technology underlying virtual embodiment; and a software development platform.

The researchers say the next step is to improve the robot surrogate by upgrading to the HRP-4, made by Kawada Industries in Japan. This robot has a more stable and dynamic walk and is almost the height of a human, which will increase the feeling of actual embodiment, said Kheddar.
