Vladimir Vapnik: Statistical Learning | Lex Fridman Podcast #5
Episode Details
EPISODE INFO
- Released
- November 16, 2018
- Duration
- 54m
- Channel
- Lex Fridman Podcast
- Watch on YouTube
SPEAKERS
Lex Fridman (host)
Vladimir Vapnik (guest)
Narrator (other)
EPISODE SUMMARY
In this episode, Vladimir Vapnik discusses the philosophical and mathematical foundations of statistical learning, contrasting instrumentalism (prediction) with realism (understanding "God's laws"). He argues that modern machine learning overemphasizes brute-force prediction and deep learning while neglecting conditional probabilities, invariants, and the role of a "teacher" who supplies powerful predicates. Vapnik describes two mechanisms of learning, strong and weak convergence, with weak convergence relying on high-level predicates such as "swims like a duck" that dramatically reduce data requirements. He sees the central open problem as understanding intelligence itself: how good teachers generate such predicates, and how to formalize that process to achieve learning with far fewer examples.