Is the patient cringing with neck pain telling the truth despite no evidence of trauma, or is he just looking to collect some insurance money? Thanks to a fully automated facial expression recognition system that operates in real time, the answer can now be discerned with a little more confidence.
A joint study from researchers at the University of California, San Diego, the State University of New York’s University at Buffalo and the University of Toronto has found that a computer vision system can distinguish between real and faked expressions of pain more accurately than humans do.
The study involved two experiments in which 205 human observers judged the authenticity of pained expressions shown in video clips; some of the people in the clips were undergoing genuine pain induction, while others were faking.
The researchers found that even after training, human observers could not tell the genuine from the faked expressions more than 55% of the time. In contrast, a computer vision system that automatically measures facial movements and performs pattern recognition on those movements reached 85% accuracy.
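As a rough illustration of that kind of pipeline, the hypothetical sketch below summarizes per-frame facial-movement measurements into temporal statistics and trains an off-the-shelf classifier on them. The data, feature choices, and classifier here are placeholder assumptions, not the study's actual method.

```python
# Hypothetical sketch (not the authors' code): given per-frame facial-action-unit
# intensities for each clip, summarize their temporal dynamics and train a
# classifier to separate genuine from faked pain. All data are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def summarize_clip(au_intensities):
    """Collapse a (frames x action_units) matrix into per-unit temporal statistics."""
    return np.concatenate([
        au_intensities.mean(axis=0),                            # average activation
        au_intensities.std(axis=0),                             # variability of activation
        np.abs(np.diff(au_intensities, axis=0)).mean(axis=0),   # frame-to-frame change
    ])

# Placeholder data: 60 clips, 90 frames each, 20 facial action units.
clips = rng.random((60, 90, 20))
labels = rng.integers(0, 2, size=60)  # 1 = genuine pain, 0 = faked (synthetic labels)

X = np.array([summarize_clip(c) for c in clips])
scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```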
The system’s advantage is attributed to its ability to pick up subtle differences between pyramidally driven (voluntary) and extrapyramidally driven (involuntary) facial movements that people cannot decode. The most telling feature of pretended pain turned out to be how and when the mouth opens and closes: fakers’ mouths open with less variability and too much regularity.
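To illustrate the "too regular" cue, the hypothetical sketch below measures how variable the durations of mouth-opening events are in a per-frame signal. The threshold, the signals, and the use of the coefficient of variation are assumptions made for demonstration, not the study's actual measurements.

```python
# Hypothetical illustration: from a per-frame mouth-opening signal (e.g., a
# lip-distance or action-unit intensity trace), measure how variable the
# opening durations are. Lower variability would suggest the clock-like
# regularity associated here with faked pain.
import numpy as np

def opening_durations(mouth_signal, threshold=0.5):
    """Lengths (in frames) of consecutive runs where the mouth counts as open."""
    open_mask = np.asarray(mouth_signal) > threshold
    durations, run = [], 0
    for is_open in open_mask:
        if is_open:
            run += 1
        elif run:
            durations.append(run)
            run = 0
    if run:
        durations.append(run)
    return np.array(durations)

def regularity(durations):
    """Coefficient of variation of opening durations; lower = more regular."""
    return durations.std() / durations.mean()

# Synthetic example: the "faker" opens the mouth at clock-like intervals,
# while the "genuine" subject's openings vary in length.
faked   = np.tile([0, 0, 1, 1, 1, 0, 0, 1, 1, 1], 6)
genuine = np.concatenate([[0]*3, [1]*2, [0]*5, [1]*7, [0]*2, [1]*4, [0]*6, [1]*1])
print("faked CV:  ", regularity(opening_durations(faked)))    # 0.0 (perfectly regular)
print("genuine CV:", regularity(opening_durations(genuine)))  # ~0.65 (more variable)
```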
The researchers published their findings in Current Biology (2014;24:738-743).