Charles the robot head and the face of human emotion

PUBLISHED: 16:33 04 March 2018

Faraday Institute scientists and robots talk at Emmanuel College, University of Cambridge, St Andrew's St, Cambridge, from left Prof Peter Robinson, Beth Singler and Prof John Wyatt. Picture: Keith Heppell

“People are fascinated by things that look like people,” says University of Cambridge professor of computer technology Peter Robinson.

He’s talking about Charles, a robotic head that has been programmed to mimic the expressions of the person looking at it.

The head itself is built like an animatronic model from a 90s film. It’s the software that’s cutting-edge, and in Charles’ case the result of more than a decade of research.

“It says something about people, not really anything about robots,” he continues. “The robots are not coming, and if they do they won’t look like that.”

Dr Ian Davies, one of the Cambridge scientists who wrote the software, explains: “A camera is watching the person and working out what the facial expression is. It then transfers that onto the robot that mimics, and he’s in a mirroring mode so if you tilt your head, he will tilt his.”
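The mirroring loop Dr Davies describes can be sketched in a few lines. This is a minimal illustration assuming a hypothetical face tracker and servo interface – the names, fields and ranges are invented for the example, not taken from the Cambridge system.

```python
# Illustrative sketch of a mirroring mode: one camera reading of the
# person's face is translated into actuator targets for the robot head.
# All names and ranges here are hypothetical.

from dataclasses import dataclass


@dataclass
class FaceReading:
    head_roll_deg: float   # tilt of the person's head, in degrees
    smile: float           # 0.0 (neutral) to 1.0 (broad smile)
    brow_raise: float      # 0.0 (relaxed) to 1.0 (fully raised)


def _clamp(value, lo, hi):
    """Keep a command within the actuator's safe range."""
    return max(lo, min(hi, value))


def mirror_pose(reading: FaceReading) -> dict:
    """Map one camera reading onto robot actuator targets.

    In mirroring mode the robot copies the person, so a head tilt maps
    directly onto the neck servo, and expression intensities map onto
    the face motors.
    """
    return {
        "neck_roll_deg": _clamp(reading.head_roll_deg, -30.0, 30.0),
        "mouth_smile": _clamp(reading.smile, 0.0, 1.0),
        "brow_raise": _clamp(reading.brow_raise, 0.0, 1.0),
    }


# One frame: the person tilts their head 12 degrees and smiles slightly.
pose = mirror_pose(FaceReading(head_roll_deg=12.0, smile=0.4, brow_raise=0.1))
```

A real system would run this per video frame, with the expression estimate coming from a trained facial-analysis model rather than hand-set numbers.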

Prof Robinson’s team have been analysing not only people’s facial expressions, but their tone of voice, body posture and gestures, to understand what a person is doing. It’s a step beyond systems like Amazon’s Alexa, which relies on verbal signal processing alone. The aim is to improve the way people and computers communicate.
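Combining several channels like this is often framed as multimodal fusion. The sketch below shows the simplest version – a weighted average of per-modality scores – with modality names and weights that are purely illustrative, not the team's published model.

```python
# Toy multimodal fusion: combine per-modality estimates (face, voice,
# posture, gesture), each in [0, 1], into a single score. The weights
# express how much each channel is trusted; the values are invented.

def fuse_modalities(scores, weights=None):
    """Weighted average of per-modality scores keyed by modality name."""
    if weights is None:
        weights = {modality: 1.0 for modality in scores}
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total


# Hypothetical frame: the face channel suggests strong frustration,
# the other channels agree less strongly.
frustration = fuse_modalities(
    {"face": 0.7, "voice": 0.6, "posture": 0.5, "gesture": 0.4},
    weights={"face": 2.0, "voice": 1.5, "posture": 1.0, "gesture": 1.0},
)
```

Real systems typically learn the fusion (and handle missing channels) rather than fixing weights by hand, but the principle of pooling evidence across modalities is the same.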

“If you are conversing with a computer system it tends to ignore social signals. A person who did that would be considered as being on the autistic spectrum,” Prof Robinson says.

In this respect the system is also helping reveal more about the way humans work.

Already, robots like Charles have been used to help children with autism develop their communication skills, in some cases speaking for the first time – to a robot.

Prof Robinson continues: “In that sense computers are autistic, so we’ve been trying to do something about that: to give computers some sort of emotional awareness, because that’s part of communication.

“If you want to communicate with machines, understanding people’s social signals – as well as the specific content of what they’re saying – is important.

“You might be driving through a busy city centre that’s unfamiliar – you’re late on your way to a meeting, not quite sure where you are – and your satellite navigation system tells you to make a U-turn, or that you need an oil change in 5,000 miles. That’s not a good time to do it, and the person sitting next to you would know better than to do that.

“A system that had more social awareness would realise that it should wait until a reasonable time.”
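The deferral behaviour Prof Robinson describes can be reduced to a simple rule: urgent messages go through, routine ones wait until the driver seems able to take them in. The function below is a toy version with invented thresholds, not any real sat-nav's logic.

```python
# Toy "socially aware" message scheduler for a sat-nav: decide whether
# to speak a message now or hold it back, given rough estimates of the
# driver's stress and the traffic complexity (both in [0, 1]).
# Thresholds and inputs are illustrative assumptions.

def should_deliver(priority, driver_stress, traffic_load):
    """Return True if a message should be spoken now rather than queued."""
    if priority == "urgent":   # e.g. an imminent turn instruction
        return True
    # Defer routine messages (like an oil-change reminder) until both
    # the driver's estimated stress and the traffic load are low.
    return driver_stress < 0.4 and traffic_load < 0.5


# Busy, unfamiliar city centre: the routine reminder is held back.
deliver = should_deliver("routine", driver_stress=0.8, traffic_load=0.9)
```

In practice the stress and load estimates would come from exactly the kind of social-signal analysis the article describes: face, voice and posture read from in-car sensors.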

The question, then, is whether robots should ever be allowed to decide when to make a U-turn, or which patients should undergo a risky operation. When considering this, Prof Robinson asks, does the robot making the decision look like a human? Because robots that don’t look like humans are already making big decisions.

“Most robots aren’t like humans,” he says, “they’re the systems that do automatic trading in the city, and they are making crucial, very high-value decisions in a tiny fraction of a second without human supervision. This is not a good thing. Those are the robots to be worried about, but nobody knows they’re there.”
