Gary Marcus: Toward a Hybrid of Deep Learning and Symbolic AI | Lex Fridman Podcast #43
At a glance
WHAT IT’S REALLY ABOUT
Gary Marcus calls for hybrid AI: deep learning plus symbols
- Gary Marcus argues that current deep learning systems are powerful but fundamentally limited, especially in common sense, abstraction, language understanding, and flexible reasoning. He contrasts narrow successes in games and perception with the broader, general intelligence humans display across diverse domains. Marcus advocates a hybrid approach that combines symbolic, rule-based reasoning and explicit knowledge with learning-based methods, inspired by human cognition and evolution. He also emphasizes the need for AI systems we can trust, capable of representing concepts like harm and ethics explicitly, and calls for more realistic public expectations and better benchmarks for true understanding.
IDEAS WORTH REMEMBERING
5 ideas
Deep learning’s strengths are narrow and correlation-based.
It excels at pattern recognition tasks like image classification and certain games, but struggles with abstraction, variable-based reasoning, causal understanding, and flexible transfer to new situations.
Common sense is the core missing ingredient for current AI.
Machines lack basic physical and psychological knowledge (e.g., containers, goals, frustration, harm), which is essential for reading, language understanding, robotics, and real-world decision-making.
A hybrid of symbolic AI and learning is likely required.
Marcus argues neither expert systems (all symbols, no learning) nor pure deep learning (all learning, no explicit structure) are sufficient; future systems must integrate explicit rules, variables, and logic with powerful learning components.
General intelligence involves flexible transfer across domains.
Humans can apply knowledge from movies, life, or one game to novel variants and contexts; most current systems cannot even adapt a Go player to a slightly different board without full retraining.
True language understanding goes far beyond fluent text generation.
Models like GPT can produce grammatical output yet fail basic tests of narrative comprehension, character motivation, and consistency, revealing that surface fluency is not the same as deep understanding.
WORDS WORTH SAVING
5 quotes
Just because you can build a better ladder doesn’t mean you can build a ladder to the moon.
— Gary Marcus
We have to replace deep learning with deep understanding.
— Gary Marcus
Right now we don’t have a way to translate ‘harm’ into something we can execute in Python or TensorFlow.
— Gary Marcus
Intelligence is a multi-dimensional variable… machines are superhuman in some facets and far behind my five-year-old in others.
— Gary Marcus
People have this fantasy that you can machine learn anything. There are some things you would never want to machine learn.
— Gary Marcus