Noam Chomsky: Language, Cognition, and Deep Learning | Lex Fridman Podcast #53
At a glance
WHAT IT’S REALLY ABOUT
Noam Chomsky on Language, Human Limits, and AI’s Blind Spots
- Lex Fridman interviews Noam Chomsky about the nature of language, cognition, and the limits of human and machine intelligence. Chomsky explains language as an internal mental faculty that underlies thought, with external speech being a secondary use, and argues that language is central to human reasoning and creativity. He contends that biological systems, including our cognitive capacities, have built‑in scope and limits, illustrated historically by physics’ acceptance of unintelligible phenomena. Chomsky is skeptical that deep learning reveals much about human language or mind, seeing it as useful engineering but weak science, and closes with reflections on human nature, institutions, mortality, and life’s self-created meaning.
IDEAS WORTH REMEMBERING
5 ideas

Language is primarily an internal system that underlies thought.
Chomsky argues that what we call language is chiefly a mental faculty localized in the brain, which generates structured thoughts; speech and writing are secondary ‘externalizations’ of this inner system.
Language and arithmetic-like structure may be universal enough for alien communication.
Referencing Marvin Minsky’s work with simple Turing machines, Chomsky suggests that any sufficiently intelligent species would likely have arithmetic, and that human language’s core operations approximate arithmetic, offering a possible shared basis for communication.
Human cognition has both rich scope and hard biological limits.
Just as our genetic endowment allows complex mammalian vision but precludes insect eyes or wings, our cognitive endowment enables powerful reasoning but likely blocks certain forms of understanding, challenging the assumption that humans can, in principle, comprehend everything.
A key property of language is structure dependence, not simple word order.
Interpretation often relies on hierarchical structure rather than linear proximity (e.g., where “carefully” attaches in a sentence), showing that speakers unconsciously use complex structural computations on representations they never directly hear.
Deep learning is useful engineering but currently weak cognitive science.
Chomsky sees systems like Google Translate as valuable tools that find patterns in huge corpora, but notes they are optimized for coverage of random ‘experiments’ (sentences), not for testing theoretical questions, and they can perform just as well on languages that violate known principles of human grammar.
WORDS WORTH SAVING
5 quotes

Most of your use of language is thought, internal thought.
— Noam Chomsky
There may be limits to understanding. We understand the theories, but the world that they describe doesn’t make any sense.
— Noam Chomsky
Science tries to find critical experiments… It doesn’t care about coverage of millions of experiments.
— Noam Chomsky
A book expands your cognitive capacity. This [brain–machine interface] could expand it too. But it’s not a fundamental expansion.
— Noam Chomsky
The significance of your life is something you create.
— Noam Chomsky
High-quality AI-generated summary created from a speaker-labeled transcript.