Little learners: Neurotypical infants watch the mouths of adults as they speak, picking up visual clues about the sounds they’re hearing.
JGI/Jamie Grill / Getty Images

Infant siblings of autistic children miss language-learning clues

So-called ‘baby sibs’ watch adults’ faces just as much as children without autistic siblings do, but they don’t understand spoken language as well.

By Jaclyn Jeffrey-Wilensky
6 May 2021 | 3 min read

Babies with autistic older siblings pay close attention to adults’ mouths during speech, but they don’t reap the language-learning benefits typically associated with that attention, according to new unpublished research. By contrast, the more that children without autistic siblings watch speakers’ mouths as babies, the better they tend to understand spoken language as toddlers.

The results suggest that children with an increased likelihood of having autism, including ‘baby sibs,’ may need extra support to build language skills.

“Attention is not enough,” says Hannah Feiner, lead investigator and fellow at Yale University. “There’s some downstream component [of language development] that we have to be researching and targeting and assessing.”

Feiner presented the findings virtually yesterday at the 2021 International Society for Autism Research annual meeting. (Links to abstracts may work only for registered conference attendees.)

Autistic people often look at different parts of a scene or image than non-autistic people do, eye-tracking studies show. And in babies, these distinctive gaze patterns can often predict autism diagnoses, researchers have found. Where babies look is also connected to their language learning: Non-autistic infants watch the mouths of adults as they speak, picking up visual clues about the sounds they’re hearing.

For the new work, the researchers recruited 90 baby sibs and 61 babies without an autistic sibling, all aged 12 months. They tracked the infants’ gazes as they watched a video of an actor speaking while surrounded by toys. The team calculated how much time the babies spent looking at the screen in total, and what proportion of that time they spent looking at the actor’s face and mouth. The researchers also tested the babies’ expressive and receptive language abilities at ages 12 and 18 months using the Mullen Scales of Early Learning.

The baby sibs and the control babies spent about the same amount of time gazing at the actor’s face, the team found. But the baby sibs had poorer receptive language skills at the 18-month mark than the other infants did. And though time spent gazing at the face and mouth correlated with receptive language skills among the controls, there was no such association for the baby sibs.

Baby sibs may observe, but not internalize, the linguistic clues that help neurotypical infants master speaking and understanding language, hypothesizes Katarzyna Chawarska, co-investigator and professor of child psychiatry at Yale University.

“Attention feeds learning,” she says. “But attention in and of itself does not guarantee that the information about the stimuli — their structure, their value — is going to be abstracted, remembered and used.”

If that theory is confirmed, Chawarska says, it could lead to tailored language interventions for children who have an increased chance of being diagnosed with autism.

Read more reports from the 2021 International Society for Autism Research annual meeting.
