No Priors Ep. 3 | With Stability AI’s Emad Mostaque
Episode Details
EPISODE INFO
- Released: May 3, 2023
- Duration: 45m
- Channel: No Priors
- Watch on YouTube
EPISODE DESCRIPTION
AI-generated images have been everywhere over the past year, but one company has fueled an explosive developer ecosystem around large image models: Stability AI. Stability builds open AI tools with a mission to improve humanity. Stability AI is best known for Stable Diffusion, the model that generates images from natural-language prompts, but the company is also advancing models in natural language, voice, video, and biology. This week on the podcast, Emad Mostaque joins Sarah Guo and Elad Gil to talk about how this barely one-year-old, London-based company has changed the AI landscape, scaling laws, progress in different modalities, frameworks for AI safety, and why the future of AI is open.

- 00:00 - Introduction
- 02:00 - Emad’s background as one of the largest investors in video games and artificial intelligence
- 07:24 - Open-source efforts in AI
- 13:09 - Stability AI as the only independent multimodal AI company in the world
- 15:28 - Computational biology, medical information and medical models
- 23:29 - Pace of adoption
- 26:31 - AGI versus intelligence augmentation
- 31:38 - Stability AI’s business model
- 37:44 - AI safety
SPEAKERS
- Sarah Guo (host)
- Emad Mostaque (guest)
- Elad Gil (host)
EPISODE SUMMARY
In this episode of No Priors, Emad Mostaque traces his path from hedge funds to founding Stability AI, driven by personal experiences with autism, COVID research, and a belief that AI should be open infrastructure available to everyone. He explains how Stability catalyzed the open-source ecosystem in image, language, and biology models, and why he thinks foundation models will ultimately be open while private firms focus on instruction-tuning and fine-tuning. Mostaque outlines Stability’s multimodal work across media and computational biology, its close cooperation with governments and academia, and its focus on deployable, customizable models rather than ever-larger ones. He also discusses global adoption dynamics, democratic implications, and why we need regulation and standards around large models, data usage, and AI-driven manipulation, while prioritizing “intelligence augmentation” over the pursuit of AGI.