Thursday, 3 January 2013
Humans, Do You Speak !~+V•&T1F0()?
Software that will let people and robots communicate and plan difficult, complex tasks, such as dismantling a nuclear power plant, is under development at the University of Aberdeen, Scotland. It will translate symbols of mathematical logic into text and vice versa, so humans and robots can share two-way communication in their own respective language.
Human-robot communication has been the subject of extensive research, for robotic tasks ranging from assisting in brain surgery to helping humans assemble products on the factory floor.
Researchers at the university's School of Natural and Computing Sciences expect their technology to be used in several industries. These include unmanned exploration of hostile environments, such as the deep sea or the Martian surface, as well as more mundane tasks, such as maintaining and repairing railway lines.
Software that will let people and robots communicate to plan difficult and complex tasks, such as dismantling a nuclear power plant, is being developed at a Scottish university.
(Source: Wikimedia Commons/Stefan Kühn)
In these situations, robots could become more autonomous if they could operate for long periods without continuous guidance from humans and make their own decisions after processing data. The problem is that, as things stand, robots can make mistakes that are apparent neither to humans nor to themselves, or do things that humans don't understand. In an operation as dangerous and complex as decommissioning a nuclear power plant, the results could be disastrous.
"Evidence shows there may be mistrust when there are no provisions to help a human to understand why an autonomous system has decided to perform a specific task, at a particular time, and in a certain way," said Dr. Wamberto Vasconcelos, senior lecturer in the Department of Computing Science, in a press release. "What we are creating is a new generation of autonomous systems, which are able to carry out a two-way communication with humans. The ability to converse with such systems will provide us with a novel tool to quickly understand, and if necessary correct, the actions of an automated system, increasing our confidence in, and the usefulness of, such systems."
To develop the autonomous robotics systems, the project will use Natural Language Generation (NLG), which translates complex information and data into simple text summaries. The university's School of Natural and Computing Sciences staff includes several NLG researchers.
In NLG, the information and data begin as symbols of mathematical logic. (Some representative logic symbols are shown in this article's headline, in no particular order.) They are automatically transformed into simple text, so that humans and robots can discuss and plan a set of tasks before the robot carries them out.
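To give a rough sense of the template-based end of this idea, here is a minimal sketch of turning a machine-readable plan step into plain English. The predicate names, templates, and plan are invented for illustration; the article does not describe the Aberdeen system's actual logic representation or grammar.

```python
# Toy illustration of NLG: render a symbolic plan step (a predicate tuple
# such as ("grasp", "fuel rod")) as a plain-English sentence a human can read.
# All predicate names and templates here are hypothetical examples.

TEMPLATES = {
    "move": "I will move to the {0}.",
    "grasp": "I will pick up the {0}.",
    "inspect": "I will inspect the {0} before touching it.",
}

def realize(step):
    """Render one plan step as English text."""
    action, *args = step
    template = TEMPLATES.get(action)
    if template is None:
        # Fall back gracefully on actions with no known template.
        return f"I will perform an action I cannot yet explain: {action}."
    return template.format(*args)

plan = [("move", "reactor hall"), ("inspect", "fuel rod"), ("grasp", "fuel rod")]
for step in plan:
    print(realize(step))
```

A real system would work in the other direction as well, parsing the human's typed replies back into the same symbolic form so the plan can be revised before the robot acts.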
Later, when the robot is engaged in a task, the human can communicate with it using a keyboard. Humans can ask the robot questions about why it's taking certain actions or making specific decisions, and request justifications for them. Humans can also provide the robot with additional information it can integrate into its plans, suggest alternatives, and point out problems with the robot's chosen course of action.
Vasconcelos said his team hopes the systems they are developing will be applicable not only to robots, but also to mobile phones, "which can interact with a human in useful ways, which up until now haven't been explored."
The research is funded by a £1.1 million (US$1.7 million) grant from the UK's Engineering and Physical Sciences Research Council, a government agency that funds research and training.