Cranial connection
‘Hyperscanning,’ a set of techniques for simultaneously measuring brain activity in two people, is yielding insights into autism.
One of the challenges in studying autism is that social behavior involves interactions between brains, but traditional brain imaging provides a peek into only one brain at a time.
Enter ‘hyperscanning,’ a set of techniques for simultaneously measuring brain activity in two people as they interact in various ways. Researchers first tried hyperscanning about a decade ago, and as it becomes more popular, they are developing variations on the approach that are suited to answering different types of research questions.
For example, in a study published 7 November in the Journal of Neuroscience, researchers at Beijing Normal University conducted functional near-infrared spectroscopy in pairs of adults who were either facing each other or seated back to back.
With this method, both individuals wear a net of sensors on their heads, similar to the nets used in electroencephalography. The setup allows the participants to be together in an everyday setting and to communicate naturally, using gestures and facial expressions.
The researchers found that there is something special about face-to-face dialog that sets it apart from other forms of communication.
People show similar patterns of brain activity in the left inferior frontal cortex, where mirror neurons are located, when they are facing each other and having a conversation, the study found. This neural synchronization doesn’t occur when the participants are seated back to back, nor when one partner is delivering a monologue and the other is just listening.
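In these studies, 'synchronization' means statistical coupling between the two participants' recorded signals. As a rough illustration only, the Python sketch below computes a sliding-window correlation between two simulated time series; it is a simplified stand-in, not the analysis either team used (fNIRS hyperscanning papers typically rely on more elaborate measures, such as wavelet coherence), and the function name, window sizes and simulated data are invented for this example.

import numpy as np

def interbrain_synchrony(sig_a, sig_b, fs, window_s=30.0, step_s=5.0):
    """Sliding-window Pearson correlation between two brain-signal time series.

    sig_a, sig_b : 1-D arrays of equal length (e.g., one recording channel
                   per participant).
    fs           : sampling rate in Hz.
    Returns an array of correlation values, one per window.
    """
    win = int(window_s * fs)
    step = int(step_s * fs)
    n = min(len(sig_a), len(sig_b))
    corrs = []
    for start in range(0, n - win + 1, step):
        a = sig_a[start:start + win]
        b = sig_b[start:start + win]
        # np.corrcoef returns the 2x2 correlation matrix; take the off-diagonal.
        corrs.append(np.corrcoef(a, b)[0, 1])
    return np.array(corrs)

# Example: two noisy signals sharing a slow common component,
# standing in for recordings from two conversation partners.
fs = 10.0                                   # assumed sampling rate, in Hz
t = np.arange(0, 300, 1 / fs)               # five minutes of data
shared = np.sin(2 * np.pi * 0.05 * t)       # slow shared fluctuation
rng = np.random.default_rng(0)
partner_a = shared + 0.5 * rng.standard_normal(t.size)
partner_b = shared + 0.5 * rng.standard_normal(t.size)

sync = interbrain_synchrony(partner_a, partner_b, fs)
print(f"mean windowed correlation: {sync.mean():.2f}")

Two partners whose signals rise and fall together, as in the face-to-face condition, would yield high windowed correlations; back-to-back or monologue conditions would not.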
The results underscore how important it is for successful communication to pay attention to facial expressions and to take turns talking, both tasks that people with autism often struggle with.
Similarly, a group of Japanese researchers reported in September in Frontiers in Neuroscience that brain activity in the right inferior frontal gyrus aligns when two controls make eye contact, but is less synchronized when a control is paired with a high-functioning individual who has autism.
This second study used functional magnetic resonance imaging, with pairs of participants lying in separate scanners connected via the Internet. On a computer screen inside the scanner, each participant can see the partner’s eyes as well as a pair of colored balls. The researchers asked the participants to look at their partner’s eyes and then follow the partner’s gaze to one of the two balls.
Pairs that include one person with autism are not as good at this task as pairs of controls, the researchers found. When trying to follow the partner’s gaze, people with high-functioning autism show less activity than controls do in a brain region called the occipital pole, which is involved in the early stages of gaze processing.
In contrast, controls paired with an individual who has autism show higher activity in the occipital cortex and in the right prefrontal area, suggesting that their brains may be working harder to compensate for the reduced or atypical eye contact from their partner.
The evidence from these studies that social interaction involves neural synchronization seems to me to be profound. We are used to thinking of our thoughts as our own, but sometimes what’s most important may be that they are shared — and with little conscious effort on our part.