Large language models
Recent articles
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.
‘Digital humans’ in a virtual world
By combining large language models with a modular cognitive-control architecture, Robert Yang and his collaborators have built agents capable of grounded reasoning at the linguistic level. Striking collective behaviors have emerged.
Are brains and AI converging?—an excerpt from ‘ChatGPT and the Future of AI: The Deep Language Revolution’
In his new book, to be published next week, computational neuroscience pioneer Terrence Sejnowski tackles debates about AI’s capacity to mirror cognitive processes.
Explore more from The Transmitter
The Transmitter’s most-read neuroscience book excerpts of 2025
Books by Nachum Ulanovsky, Nicole Rust, and Andrew Iwaniuk and Georg Striedter were among the year's most engaging neuroscience titles.
Neuroscience’s leaders, legacies and rising stars of 2025
Here are seven stories from the past year about some of the field’s most engaging figures.
The Transmitter’s top news articles of 2025
Check out some of our most-read stories, covering neuroscience funding and policy changes in the United States, and methodological issues in high-profile neuroscience papers.