Jeremy Howard: fast.ai Deep Learning Courses and Research | Lex Fridman Podcast #35

Lex Fridman Podcast · Aug 27, 2019 · 1h 44m

Lex Fridman (host), Jeremy Howard (guest)

- Jeremy Howard’s background, early programming, and love of data-centric tools
- Programming language paradigms (APL/J, K, Delphi, Perl, Python, Swift) and their impact on deep learning research
- Limits of current deep learning tooling (Python, CUDA, TensorFlow) and the promise of Swift + MLIR
- fast.ai’s mission: democratizing deep learning via practical courses, libraries, and cloud accessibility
- Applied deep learning in medicine and global healthcare (triage, diagnostics, regulation, data privacy)
- Research vs. practice: transfer learning, active learning, optimization, and DAWNBench performance work
- Learning, teaching, and careers: how to become effective in deep learning and build meaningful startups
- Ethics, labor displacement, and responsibilities of data scientists

Jeremy Howard on democratizing deep learning, tools, and real impact

Jeremy Howard discusses his path from early programming and music to founding fast.ai, emphasizing a lifelong focus on practical data work and useful tools. He contrasts different programming languages and paradigms, arguing current Python-based deep learning stacks are powerful but fundamentally constrained for real innovation, and outlines why he’s betting on Swift and MLIR-style compilers. A large portion of the conversation centers on making deep learning accessible—through fast.ai courses, better tooling, and cloud setups—so domain experts (especially in areas like medicine) can solve real problems with limited data and compute. He also critiques academic incentives, over-reliance on big data and big compute, and stresses the ethical responsibilities and labor implications of AI.

Key Takeaways

Practical problems and domain expertise should drive deep learning work.

Howard argues the most valuable AI work comes from domain experts applying deep learning to real problems (e.g., …)

You rarely need “big data” or massive compute to get state-of-the-art results.

Through transfer learning and careful experimentation, fast.ai …

Current Python/CUDA stacks limit innovation; better compiler and language design are crucial.

Because Python is slow and forces critical loops into low-level CUDA C, many ideas (e.g., …)

Research incentives push academics toward tiny gains on trendy problems, not impactful ideas.

He notes that fields like transfer learning and active learning, which radically reduce data and labeling needs, were largely ignored until someone showed dramatic results, while papers that improve well-studied benchmarks by small margins are rewarded.

The fastest path to competence in deep learning is to train many models on your own data.

Howard stresses that learners should quickly fine-tune pretrained models on personally relevant datasets, inspect inputs/outputs and errors, and iterate; this builds intuition far more effectively than reading theory alone.
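
This fine-tune-and-inspect loop is easy to try. Below is a minimal PyTorch sketch of the pattern (not fast.ai's actual API): the tiny two-layer backbone is a hypothetical stand-in for a real pretrained network such as a ResNet, and the random tensors stand in for your own dataset.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained backbone; in real use you would
# load actual pretrained weights (e.g., a ResNet) instead.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU())

# Freeze the "pretrained" layers so only the new head learns.
for p in backbone.parameters():
    p.requires_grad = False

head = nn.Linear(64, 3)  # new head for your own 3-class task
model = nn.Sequential(backbone, head)

optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for a small, personally relevant dataset.
x = torch.randn(32, 1, 28, 28)
y = torch.randint(0, 3, (32,))

for _ in range(5):  # a few quick fine-tuning steps
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Only the head's 64*3 + 3 = 195 parameters were trainable.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)
```

In practice you would load real pretrained weights (for instance via fastai's `vision_learner` or `torchvision.models`), then inspect the model's predictions and errors on your own data and iterate; the freeze-backbone, train-new-head mechanics stay the same.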

Startups and careers benefit more from tenacity and frugality than from venture capital.

He recommends bootstrapping, keeping costs minimal, charging for small early projects, and avoiding VC pressure to overgrow; persistence and solving a problem you truly understand matter more than big funding.

AI practitioners have ethical duties around bias, feedback loops, and human oversight.

Given the leverage of modern models, Howard says data scientists must design for appeals processes, transparency, human-in-the-loop control, and be conscious of broader social impacts like labor displacement and inequality.

Notable Quotes

Most of the research in the deep learning world is a total waste of time.

Jeremy Howard

I think all the major breakthroughs in AI in the next 20 years will be doable on a single GPU.

Jeremy Howard

We don’t need more experts creating slightly evolutionary research in areas that everybody is studying. We need experts using deep learning to solve real problems.

Jeremy Howard

The key differentiator between people that succeed and people that fail is tenacity.

Jeremy Howard

I don’t think we need a lot of new technological breakthroughs to do a lot of great work right now.

Jeremy Howard

Questions Answered in This Episode

How would deep learning research priorities change if academic incentives directly rewarded real-world impact over benchmark improvements?

What concrete steps could hospitals and regulators take today to unlock medical data for AI while preserving patient privacy and trust?

If Swift + MLIR-based stacks mature, how might that alter who can innovate at the algorithmic level compared to today’s Python/CUDA ecosystem?

How should societies prepare for and cushion the labor displacement that Howard believes AI will accelerate, especially for the middle class?

For a domain expert with weak coding skills, what is the most effective path to becoming self-sufficient with deep learning tools like fast.ai?

Transcript Preview

Lex Fridman

The following is a conversation with Jeremy Howard. He's the founder of fast.ai, a research institute dedicated to making deep learning more accessible. He's also a distinguished research scientist at the University of San Francisco, a former president of Kaggle, as well as a top-ranking competitor there. And in general, he's a successful entrepreneur, educator, researcher and an inspiring personality in the AI community. When someone asks me, "How do I get started with deep learning?" fast.ai is one of the top places I point them to. It's free. It's easy to get started. It's insightful and accessible. And if I may say so, it has very little BS that can sometimes dilute the value of educational content on popular topics like deep learning. Fast.AI has a focus on practical application of deep learning and hands-on exploration of the cutting edge that is incredibly both accessible to beginners and useful to experts. This is the Artificial Intelligence podcast. If you enjoy it, subscribe on YouTube, give it five stars on iTunes, support it on Patreon, or simply connect with me on Twitter, @lexfridman, spelled F-R-I-D-M-A-N. And now here's my conversation with Jeremy Howard. What's the first program you've ever written?

Jeremy Howard

First program I wrote, that I remember, would be at high school. Um, (sighs) I did an assignment where I decided to try to find out if there were some, like, better musical scales than the normal 12 tone, 12 s- interval scale. So I wrote a program on my Commodore 64 in Basic that searched through other scale sizes to see if it could find one where there were, uh, more accurate, you know, uh, harmonies.

Lex Fridman

Like mid-tone, like finding-

Jeremy Howard

Like, like you want an actual exactly three to two ratio, whereas with a 12 interval scale it's not exactly three to two, for example. So that's-

Lex Fridman

And, and the Common-

Jeremy Howard

... well tempered as they say in the... (laughs)

Lex Fridman

In Basic on a Commodore 64?

Jeremy Howard

Yeah.
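
The scale search Jeremy describes translates into a few lines of modern Python. This is an illustrative reconstruction, not his original BASIC: for an n-note equal-temperament scale, it measures how far the closest available interval lands from a pure 3:2 fifth, in cents (1200 per octave). The function name and the candidate scale sizes are choices made here for the sketch.

```python
import math

def best_fifth_error(n):
    """Deviation (in cents) of the closest interval in an n-note
    equal-temperament scale from a pure 3:2 fifth."""
    target = math.log2(3 / 2)  # a pure fifth, measured in octaves
    return min(abs(k / n - target) * 1200 for k in range(1, n + 1))

# The familiar 12-tone scale: 2**(7/12) ≈ 1.4983, not exactly 3/2.
print(round(best_fifth_error(12), 2))  # → 1.96 (about two cents flat)

# Search other scale sizes, as the Commodore 64 program did.
for n in (19, 31, 41, 53):
    print(n, round(best_fifth_error(n), 2))
```

A 53-note scale, for example, gets within about a tenth of a cent of a pure fifth, which is why 53-tone equal temperament appears in tuning theory.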

Lex Fridman

Where was the interest in music from? Or is it just technical?

Jeremy Howard

I did music all my life, so I played saxophone and clarinet and piano and guitar and drums and whatever, so...

Lex Fridman

How does that thread go through your life? Where's music today? Is it-

Jeremy Howard

Uh, it's not where I wish it was. I, for various reasons, couldn't really keep it going, particularly because I had a lot of problems with RSI with my fingers, and so I had to kind of like cut back anything that used hands and fingers.

Lex Fridman

Mm-hmm.

Jeremy Howard

Um, I hope one day I'll be able to get back to it healthwise.

Lex Fridman

So there's a love for music underlying it all?

Jeremy Howard

For sure, yeah.

Lex Fridman

What's your favorite instrument?

Jeremy Howard

Uh, saxophone.

Lex Fridman

Sax.

Jeremy Howard

Baritone saxophone. Well, probably bass saxophone but they're awkward.

Lex Fridman

Well, um, I always love it when, uh, music is coupled with programming.

Jeremy Howard

Mm-hmm.

Lex Fridman

There's something about a brain that utilizes those, that, uh, emerges with creative ideas. So you've used and studied quite a few programming languages.
