Large language models
Recent articles
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.
‘Digital humans’ in a virtual world
By combining large language models with a modular cognitive control architecture, Robert Yang and his collaborators have built agents that are capable of grounded reasoning at a linguistic level. Striking collective behaviors have emerged.
Are brains and AI converging?—an excerpt from ‘ChatGPT and the Future of AI: The Deep Language Revolution’
In his new book, to be published next week, computational neuroscience pioneer Terrence Sejnowski tackles debates about AI’s capacity to mirror cognitive processes.
Explore more from The Transmitter
Tatiana Engel explains how to connect high-dimensional neural circuitry with low-dimensional cognitive functions
Neuroscientists have long sought to understand the relationship between structure and function in the vast connectivity and activity patterns in the brain. Engel discusses her modeling approach to discovering the hidden patterns that connect the two.
Beyond the algorithmic oracle: Rethinking machine learning in behavioral neuroscience
Machine learning should not replace human judgment; rather, it should help us embrace the various assumptions and interpretations that shape behavioral research.
‘Wired for Words: The Neural Architecture of Language,’ an excerpt
In his new book, Hickok provides a detailed overview of the research into the circuits that control speech and language. In this excerpt from Chapter 5, he shares how meeting his colleague David Poeppel led them to develop their theory of bilateral speech perception.