Think you could easily distinguish the expressions for “happily disgusted” or “sadly angry”? New computational mapping has made it possible for computers to do just that. Researchers at Ohio State University have cataloged an impressive 21 distinct, complex human facial expressions and built a model that lets computers recognize their subtle differences. The team hopes to use this study to track and map the origins of human emotions in the brain and even help diagnose conditions like autism and PTSD. From OSU.edu:
Until now, cognitive scientists have confined their studies to six basic emotions—happy, sad, fearful, angry, surprised and disgusted—mostly because the facial expressions for them were thought to be self-evident, Martinez explained.
But deciphering a person’s brain functioning with only six categories is like painting a portrait with only primary colors, Martinez said: it can provide an abstracted image of the person, but not a true-to-life one.
What Martinez and his team have done is more than triple the color palette—with a suite of emotional categories that can be measured by the proposed computational model and applied in rigorous scientific study.
“In cognitive science, we have this basic assumption that the brain is a computer. So we want to find the algorithm implemented in our brain that allows us to recognize emotion in facial expressions,” he said. “In the past, when we were trying to decode that algorithm using only those six basic emotion categories, we were having tremendous difficulty. Hopefully with the addition of more categories, we’ll now have a better way of decoding and analyzing the algorithm in the brain.”
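The jump from 6 to 21 categories has a tidy combinatorial reading: if each compound expression pairs two of the six basic emotions, six emotions yield 15 unordered pairs, and 15 compounds plus the 6 basics gives exactly 21. This is a back-of-the-envelope sketch, not the study's actual taxonomy (the researchers' category names and pairings may differ):

```python
from itertools import combinations

# The six basic emotion categories named in the article.
basic = ["happy", "sad", "fearful", "angry", "surprised", "disgusted"]

# Hypothetical pairing: a compound expression such as "happily disgusted"
# combines two basic emotions. Six emotions give C(6, 2) = 15 pairs.
compounds = [f"{a} + {b}" for a, b in combinations(basic, 2)]

print(len(compounds))               # 15 compound pairings
print(len(basic) + len(compounds))  # 21 total categories
```

Note this is only an illustration of the scale of the expansion; the published category set includes named compounds like “happily disgusted” but not every possible pairing.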