More and more, computers are showing their superiority over humans in a multitude of tasks. A new study reveals that a computer system is able to detect – with better accuracy than a human – whether our expressions of pain are genuine or phony.
The researchers, from the University of California-San Diego (UCSD) and the University of Toronto in Canada, have published their findings in the journal Current Biology.
They note that in social species such as humans, faces have evolved to convey valuable social information, including expressions of emotion and pain.
However, “humans can simulate facial expressions and fake emotions well enough to deceive most observers,” says Prof. Kang Lee, senior author from the University of Toronto.
According to the study, there are two motor pathways in the brain that control facial movement:
- Subcortical extrapyramidal motor system – which drives spontaneous facial expressions of genuinely felt emotions
- Cortical pyramidal motor system – which controls voluntary (faked) facial expressions
While humans are unable to consistently spot the subtle differences between the two, the team says a computer can.
“The computer system managed to detect distinctive dynamic features of facial expressions that people missed,” says Marian Bartlett, lead author from UCSD. “Human observers just aren’t very good at telling real from faked expressions of pain.”
In their experiment, the researchers found that humans were not able to tell the difference between real and fake expressions of pain better than random chance. Even after training, they only improved their accuracy to a mere 55%.
But the team says computer vision may be able to identify the subtle difference between pyramidally and extrapyramidally driven movements.
The system, which automatically measures facial movements and performs pattern recognition on them, was able to distinguish between real and faked expressions of pain with 85% accuracy.
The study suggests that the single most predictive feature of faked expressions is the mouth: when people fake pain, their mouths open too regularly, with less variation in timing than genuine expressions show.
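To make the "over-regularity" idea concrete, here is a toy sketch of how one might quantify it. This is purely illustrative and not the researchers' actual method: the function name, the coefficient-of-variation metric, and the interval data are all invented for the example. The intuition is simply that faked expressions produce mouth openings at suspiciously even intervals, so a lower variability score would flag a possible fake.

```python
import statistics

def opening_regularity(intervals):
    """Coefficient of variation of the gaps (in seconds) between
    successive mouth openings. Lower values mean more regular,
    machine-like timing -- the pattern the study associated with
    faked expressions. (Hypothetical metric, for illustration only.)"""
    mean = statistics.mean(intervals)
    return statistics.stdev(intervals) / mean

# Invented interval data (seconds between successive mouth openings):
genuine = [0.8, 1.9, 0.5, 2.4, 1.1, 3.0]   # irregular, spontaneous timing
faked   = [1.0, 1.1, 0.9, 1.0, 1.1, 1.0]   # suspiciously even timing

# The genuine sequence varies far more than the faked one:
print(opening_regularity(genuine) > opening_regularity(faked))  # True
```

A real system like the one in the study would extract many such dynamic features from video and feed them into a trained classifier; this sketch shows only the flavor of one candidate feature.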
Prof. Lee says that “because of the way our brains are built, people can simulate emotions they’re not actually experiencing – so successfully that they fool other people.”
“The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements,” he adds.
Prof. Bartlett says that by using computer-vision systems to analyze the dynamics of facial action, their technique “has the potential to elucidate ‘behavioral fingerprints’ of the neural-control systems involved in emotional signaling.”
The researchers say future research will investigate whether “over-regularity” is a normal feature of false expressions, and they note that the computer-vision system could also be used to identify other deceptive actions in the areas of homeland security, psychopathology, job screening, medicine and law.
“As with causes of pain,” Prof. Bartlett says, “these scenarios also generate strong emotions, along with attempts to minimize, mask and fake such emotions, which may involve ‘dual control’ of the face.”
“In addition, our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”
Medical News Today recently reported on a study that suggested our eyes widen in fear and narrow in disgust as a way of harnessing the properties of light most useful in those situations.