No Priors Ep. 79 | With Magic.dev CEO and Co-Founder Eric Steinberger
Episode Details
EPISODE INFO
- Released
- August 30, 2024
- Duration
- 37m
- Channel
- No Priors
- Watch on YouTube
EPISODE DESCRIPTION
Today on No Priors, Sarah Guo and Elad Gil are joined by Eric Steinberger, the co-founder and CEO of Magic.dev. His team is developing a software engineer co-pilot that will act more like a colleague than a tool. They discussed what makes Magic stand out from the crowd of AI co-pilots, the evaluation bar for a truly great AI assistant, and their predictions on what a post-AGI world could look like if the transition is managed with care.

Sign up for new podcasts every week. Email feedback to show@no-priors.com

Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @EricSteinb

Show Notes:
- 0:00 Introduction
- 0:45 Eric's journey to founding Magic.dev
- 4:01 Long context windows for more accurate outcomes
- 10:53 Building a path toward AGI
- 15:18 Defining what is enough compute for AGI
- 17:34 Achieving Magic's final UX
- 20:03 What makes a good AI assistant
- 22:09 Hiring at Magic
- 27:10 Impact of AGI
- 32:44 Eric's north star for Magic
- 36:09 How Magic will interact in other tools
SPEAKERS
- Sarah Guo (host)
- Eric Steinberger (guest)
- Elad Gil (host)
EPISODE SUMMARY
In this episode of No Priors, Eric Steinberger, CEO and co-founder of Magic.dev, discusses building an AI software engineer that functions as a true colleague rather than a lightweight coding assistant. He explains why Magic focuses on code generation as a pathway to AGI, emphasizing long-context models, test-time compute, and an architecture tuned for continual learning from large histories of work. The conversation explores how to allocate compute between training and inference, why long context can outperform retrieval, and how recursive self-improvement could be bounded by safety-focused iterations. Steinberger also reflects on the societal impacts of AGI, from automation of most digital work to questions of meaning, power, and how capitalism and competition might shape outcomes.