Artificial intelligence (AI) systems are being developed to infer people’s intentions and reactions from their facial expressions. But a new study suggests that such inferences may not be reliable. The study analysed photographs of actors to examine the relationship between facial expressions and human emotions, and found that people can use similar expressions to portray different emotions, while the same emotion can be expressed in different ways. Much of the inference also depended on context. Judging people’s inner thoughts simply by running their facial expressions through an algorithm can therefore be a flawed method.

Researchers defined 13 emotion categories under which they analysed facial expressions in 604 photographs of professional actors. The actors had been given emotion-evoking scenarios to react to, though the descriptions did not suggest in any way how they should feel about these scenarios.

The study was published in Nature Communications. The 13 categories were derived from the judgements of 839 volunteers and from the Facial Action Coding System, which relates specific “action units” to specific movements of facial muscles. Machine learning (ML) analyses revealed that actors portrayed the same emotion category by contorting their faces in different ways, while similar expressions did not always convey the same emotion.
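To make that many-to-many finding concrete, here is a minimal sketch in Python. The action units, emotion labels, and photo records below are invented toy data, not the study’s dataset or analysis pipeline; the sketch only illustrates how one emotion category can map to several facial configurations while one configuration can appear under several categories.

```python
# Toy illustration (hypothetical data): a many-to-many mapping between
# emotion labels and facial configurations, encoded as frozensets of
# FACS-style action units (e.g. AU4 = brow lowerer, AU12 = lip corner puller).
from collections import defaultdict

# Hypothetical (photo_id, emotion_label, active_action_units) records.
photos = [
    ("p1", "anger",     frozenset({"AU4", "AU7", "AU23"})),
    ("p2", "anger",     frozenset({"AU4", "AU5"})),   # same emotion, different face
    ("p3", "fear",      frozenset({"AU4", "AU5"})),   # same face, different emotion
    ("p4", "happiness", frozenset({"AU6", "AU12"})),
    ("p5", "happiness", frozenset({"AU12", "AU25"})),
]

configs_per_emotion = defaultdict(set)
emotions_per_config = defaultdict(set)
for _, emotion, aus in photos:
    configs_per_emotion[emotion].add(aus)
    emotions_per_config[aus].add(emotion)

# One emotion, several distinct facial configurations.
for emotion, configs in configs_per_emotion.items():
    print(f"{emotion}: {len(configs)} distinct configuration(s)")

# One configuration, several emotions.
for aus, emotions in emotions_per_config.items():
    if len(emotions) > 1:
        print(f"{sorted(aus)} appears under: {sorted(emotions)}")
```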

The study was run with two groups of raters. In one, 842 people each rated roughly 30 faces under the 13 emotion categories; in the other, 845 people each rated roughly 30 face-and-scenario pairs. The results from the two groups differed in most cases, which led to the conclusion that analysing facial expressions out of context can produce misleading judgements. Context, in other words, is important to understanding a person’s emotional intent.
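As a rough illustration of that two-condition comparison (again with invented toy ratings, not the study’s data), the following sketch checks whether a face’s most common label changes once raters also see the accompanying scenario:

```python
# Hypothetical sketch: compare the modal label from raters who saw the face
# alone against the modal label from raters who saw the face with a scenario.
from collections import Counter

# face_id -> list of labels from each rater group (toy examples).
face_only = {
    "f1": ["anger", "anger", "disgust"],
    "f2": ["fear", "surprise", "fear"],
}
face_with_scenario = {
    "f1": ["determination", "anger", "determination"],
    "f2": ["fear", "fear", "fear"],
}

def modal(labels):
    # Most common label among the raters for one face.
    return Counter(labels).most_common(1)[0][0]

changed = sum(
    modal(face_only[f]) != modal(face_with_scenario[f]) for f in face_only
)
print(f"{changed}/{len(face_only)} faces change modal label with context")
```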

“Our research directly counters the traditional emotional AI approach,” Lisa Feldman Barrett, professor of psychology at Northeastern University College of Science and one of the seven researchers behind the study, said.

The researchers also wrote that these findings “join other recent summaries of the empirical evidence to suggest that scowls, smiles, and other facial configurations belong to a larger, more variable repertoire of the meaningful ways in which people move their faces to express emotion.”

A few months ago, a researcher sought regulations on AI tools being pushed in schools and workplaces to interpret human emotions. Kate Crawford, academic researcher and author of the book “The Atlas of AI,” said that “unverified systems” were being “used to interpret inner states,” and added that such technology needs to be regulated for better policy-making and public trust.

