
Play the fake smile game from the University of Cambridge and UCL and expose flaws in AI-powered emotion recognition




Did you know that artificial intelligence is being used to read facial expressions and detect emotions?

A citizen science project involving the University of Cambridge and UCL aims to open up a conversation on this burgeoning industry - and demonstrate its flaws - by inviting members of the public to try out the technology.

Researchers at the University of Cambridge and UCL are asking people to try out emotion recognition technology

A website - https://emojify.info/ - has been set up to encourage people to pull faces at their webcams and smartphones, see how AI emotion recognition technology works, and then optionally share their thoughts on its potential impacts.

Companies use the technology to test consumer reaction to products, from cereal to video games. But it is also known to have been used in situations such as airport security, courtroom trials, medical care and job interviews.

It is often used without public knowledge or consent, raising questions about ethics, privacy and its use in policing. A coalition of more than 40 civil society organisations has called for a ban on the technology in the EU.

“Many people are surprised to learn that emotion recognition technology exists and is already in use. Our project gives people a chance to experience these systems for themselves and get a better idea of how powerful they are, but also how flawed,” said Dr Alexa Hagerty, project lead and researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence.

“The science behind emotion recognition is shaky. It assumes that our facial expressions perfectly mirror our inner feelings. If you’ve ever faked a smile, you know that it isn’t always the case.

“And emotion recognition also has the same worrying potential for discrimination and surveillance as other forms of facial recognition.”

One study found that such systems consistently read black faces as angrier than white faces, regardless of their expression.

The award-winning documentary film Coded Bias highlighted how researchers at MIT showed that facial recognition technology is most accurate for white men.

Dr Igor Rubinov of Dovetail Labs, a consultancy specialising in technology ethics, who directed the design of the interactive research website, said: “Our research is designed to involve people in a very engaging way. We want people to interact with an emotion recognition system and see how AI scans their faces and what it might get wrong.”

Head designer Juweek Adolphe added: “It is meant to be fun but also to make you think about the stakes of this technology.”

A number of major companies, including Microsoft, Amazon and IBM, have halted sales of facial recognition technology, and this month the team managing ImageNet, one of the largest datasets used to train facial recognition, blurred 1.5 million images in response to privacy concerns.

Dr Alexandra Albert, of the Extreme Citizen Science (ExCiteS) research group at University College London, said: “There hasn’t been real public input or deliberation about these technologies. They scan your face, but it is tech companies who make the decisions about how they are used. We need a more democratic approach.”

Visitors to the website can activate a secure and private emotion recognition system that scans their face and assigns a confidence level to emotions.
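For readers curious about the mechanics: systems of this kind typically run a small neural network that outputs a confidence score for each emotion label it knows. The Python sketch below is a rough illustration only - not the project's own code - using the open-source fer package (pip install fer opencv-python); the image file name is a placeholder.

# Rough illustration of per-emotion confidence scoring - not Emojify's code.
import cv2
from fer import FER

detector = FER()                    # default face detector + emotion classifier
frame = cv2.imread("face.jpg")      # placeholder file; could be a webcam frame

for face in detector.detect_emotions(frame):
    # face["emotions"] maps labels to confidences,
    # e.g. {"angry": 0.02, "happy": 0.91, "sad": 0.01, ...}
    scores = face["emotions"]
    top = max(scores, key=scores.get)
    print(f"Top emotion: {top} ({scores[top]:.0%} confidence)")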

Researchers at the University of Cambridge and UCL are asking people to try out emotion recognition technology

They can play a game to ‘beat the machine’ by changing their facial expressions to register six emotions. The system does not collect or save images or data from the expressions, but participants can share their thoughts on the technology afterwards.
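A hypothetical version of that ‘beat the machine’ loop - again using the open-source fer package rather than the site's own, unpublished implementation - might poll webcam frames until each of the six emotions has been registered above an assumed confidence threshold, without saving anything:

# Hypothetical "beat the machine" loop in the spirit of the site's game.
import cv2
from fer import FER

TARGETS = {"angry", "disgust", "fear", "happy", "sad", "surprise"}
THRESHOLD = 0.6                     # assumed cut-off; the real game may differ

detector = FER()
remaining = set(TARGETS)
cam = cv2.VideoCapture(0)           # default webcam; frames are never saved

while remaining:
    ok, frame = cam.read()
    if not ok:
        break
    for face in detector.detect_emotions(frame):
        for emotion, score in face["emotions"].items():
            if emotion in remaining and score >= THRESHOLD:
                remaining.discard(emotion)
                print(f"Registered {emotion} ({score:.0%})")

cam.release()
print("You beat the machine!" if not remaining else f"Still missing: {sorted(remaining)}")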

“Everyone has an important perspective to share that can help us better understand emotion recognition technology, open a public conversation about its use, and shape future citizen-led research. Citizen science projects like ours have an important role to play in ensuring that everyone has a say in how tech companies develop these technologies, how businesses use them, and how government monitors and controls them,” said Dr Albert.

A 2019 review by the Association for Psychological Science (APS) concluded there was no scientific basis for the common assumption “that a person’s emotional state can be readily inferred from his or her facial movements”.

“Technologies as powerful, flawed and far-reaching as emotion recognition require input from everyone whose lives they touch,” said Dr Hagerty. “We have to be sure technology benefits society.”

The project is supported by a Nesta Collective Intelligence grant, which explores how citizen science, crowdsourcing and other forms of collective intelligence can be mobilised to tackle complex social problems, including addressing bias in data and auditing AI systems.

Referencing her first name, Dr Hagerty joked: “Maybe it is my fate to research how our lives are shaped by technology, since Amazon stole my name...”

