Robots are one step closer to communicating “more seamlessly” with humans, as computer scientists have developed a system that allows them to better understand instructions.

Current robot language models cannot fully understand complex commands.

While they can pick up cues from words and sentence structure to carry out an action, the machines struggle with commands that involve multiple steps.

A personal companion robot (Aaron Chown/PA)

For example, a robotic forklift might understand a single command such as “tilt the forks back a little”. However, it might not understand “grab the pallet”, as that instruction involves several smaller steps.

The gap in understanding can lead to a robot responding incorrectly.

“The issue we’re addressing is language grounding, which means having a robot take natural language commands and generate behaviours that successfully complete a task,” researcher Dilip Arumugam said.

“The problem is that commands can have different levels of abstraction, and that can cause a robot to plan its actions inefficiently or fail to complete the task at all.”

The new system, developed by students at Brown University, does not just infer the desired task, but analyses the language to identify the different levels of abstraction in an instruction.

To test this, the developers used the online crowdsourcing marketplace Mechanical Turk, along with a robotic agent – in this case, a virtual chair – which could be moved between different coloured rooms on screen.

The system makes it easier for robots to follow instructions (Tellex Lab/Brown University)

Volunteers from the Mechanical Turk marketplace gave commands to the chair on three different levels of specificity. Firstly, they would ask the chair to move from the red room to the blue room.

Then, they would command the chair to move to a different room via a series of steps, such as: “Take five steps north, turn right…turn left, take five steps south.”

A third form of instruction was a combination of the above two.

The volunteers’ commands trained the computer system to understand more complex demands, by helping it to interpret commands at different levels of abstraction, rather than just step-by-step instructions.

The combination of high-level and step-by-step commands proved effective.

When the students tested the system on a physical “Roomba-like robot”, the machine responded to instructions within just one second most of the time, compared with a delay of around 20 seconds without the new approach.