Video test measures subtleties of social cognition
Social cognition tests using videos of actors performing emotional expressions and scenes can measure subtle impairments characteristic of high-functioning people with autism, according to unpublished research presented at the Society for Neuroscience annual meeting in Washington, D.C.
Many social cognition tests ask subjects to interpret written stories or cartoon-like drawings that are much less vivid than real life. They are also frequently limited to basic emotions such as ‘happy’ or ‘angry,’ rather than the complex ones — ‘enthusiastic,’ for example, or ‘in love’ — that tend to flummox high-functioning people with autism.
Researchers led by Hauke Heekeren and Isabel Dziobek at Freie Universität Berlin in Germany filmed 60 actors expressing 40 different emotions. They also produced a series of 30-second scenes with two or three people interacting in different social situations.
Some of the scenes have a handful of alternate endings. The researchers ask participants to watch each scene and pick the ending that best reflects how they think the characters would react, given the emotions and relationships portrayed.
This is known as an implicit test of social cognition. “It’s not labeling the emotion, but getting a sense for it, a gut feeling,” says Dorit Kliemann, a graduate student who presented some of the work at a poster session on Tuesday.
Bhismadev Chakrabarti, lecturer in neuroscience at the University of Reading in the United Kingdom, is impressed with the group’s work. “This is one of the most ecologically valid tasks that I’ve seen in emotion research in autism. It’s very close to real life,” he says of the social scenes.
The researchers’ new battery also includes explicit tests of social cognition. For example, they ask participants to view a video of an actor making an expression, and then label it with the correct emotional term.
It’s rare for social cognition tests to include both explicit and implicit measures, says Chakrabarti. This battery is especially extensive: The social scenes and the videos of emotional expressions each serve as the basis of both explicit and implicit tasks.
Emotional activation:
All of these tests are part of a study aimed at improving social cognition in adults with high-functioning autism. Some of the videos are being used to benchmark the participants’ ability to interpret social situations at the beginning and end of the study. Others will be part of a 12-week training program to improve their social skills.
The researchers also developed versions of the tasks that could be performed while inside a functional magnetic resonance imaging scanner. So far, the team has scanned fewer than a dozen people with autism and a similar number of controls, but these preliminary data suggest that the tests can identify differences between the two groups.
“We saw very nicely in the controls the theory of mind network,” says Gabriela Rosenblau, a graduate student who presented results from tasks involving the videos of social scenes in a poster session on Sunday.
These videos trigger particularly strong activation in the superior temporal sulcus. In fact, the better a person is at predicting what comes next, the stronger the activation in this region, the researchers found.
This lends support to the view that this brain region is involved in processing complex social cues, not just basic information as was once thought, Chakrabarti says.
When they view the videos of emotional faces, people with autism show more activation than controls in the medial prefrontal cortex. That suggests they have to work harder to identify emotions, says Kliemann, whose poster focused on this work. “They have less of a gut feeling.”