No Priors

AI for Atoms: How Periodic Labs is Revolutionizing Materials Engineering with Co-Founder Liam Fedus

What happens when you apply the scaling laws of large language models to the physical world of atoms? Elad Gil sits down with Liam Fedus, co-founder of Periodic Labs, which is pioneering an AI foundation lab for atoms. Liam discusses how he pivoted from dark matter physics research to the front lines of artificial intelligence, including stints at Google Brain and working on ChatGPT at OpenAI. He talks about how Periodic is connecting massive language models to the physical world to overcome data bottlenecks in materials science. Liam also shares how they use language models as an orchestration layer operating alongside specialized neural nets to run closed-loop physical experiments. They also explore the future of AGI and ASI, as well as the role of robotics in lab automation.

Sign up for new podcasts every week. Email feedback to show@no-priors.com

Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @LiamFedus | @periodiclabs

Chapters:

00:00 – Cold Open
00:05 – Liam Fedus Introduction
00:39 – Liam’s Background at Google Brain, OpenAI
05:14 – From ChatGPT to Materials and Atoms
06:34 – Training Data in the Physical World
09:52 – Generalization Across Domains
11:31 – Models as an Orchestration Layer
12:48 – Commercialization and Business Model
16:10 – How Periodic’s Success May Shape the Future
17:45 – Multidisciplinary Scaling
19:41 – Capital and Compute
21:12 – Hiring at Periodic
21:44 – Thoughts on AGI and ASI
23:30 – Timeline for Machine-Directed Self-Improvement
25:39 – Automation and Data Generation
27:59 – Why Liam is Excited About the Future of Robotics
29:25 – Conclusion

Elad Gil, host
Apr 3, 2026 · 29m

CHAPTERS

  1. Why AI Needs the Physical World: Periodic Labs’ “Foundation Model for Atoms”

    Elad Gil introduces Liam Fedus and frames Periodic Labs’ core thesis: AI won’t meaningfully accelerate science without being connected to real experiments and physical reality. The conversation sets up materials engineering as the next major frontier beyond language and code.

  2. From Physics to AI: Dark Matter Research and Why Physicists Flood Into ML

    Liam recounts his physics background, including dark matter research, and discusses why physics training maps well to modern AI work. They connect this to a broader migration of high-energy physicists into machine learning after major milestones like the Higgs discovery.

  3. Google Brain in 2016–2017: The Cambrian Era of Scaling, Sparsity, and Transformers

    Liam describes joining Google Brain during a formative period when key ideas like distributed training and early scaling approaches were emerging. He highlights the research culture of small teams pushing boundaries with limited hardware compared to today’s industrial-scale AI labs.

  4. OpenAI and the Road to ChatGPT: Turning GPT-4 Into a Product

    Liam explains the practical challenge of productizing GPT-4 and how internal debates shaped the initial product direction. He notes that the “general chatbot” approach won out over narrower ideas, catalyzing mainstream awareness of AI capabilities.

  5. From ChatGPT to Atoms: Why Materials Needed Better Reasoning and Tool Use

Liam outlines why 2022-era models were insufficient for serious physical-world science and why subsequent improvements made it feasible. He emphasizes reasoning, test-time inference, error correction, and tool use as prerequisites for closed-loop experimentation.

  6. The Data Problem in Science: Simulation vs. Experiment and the Need for Ground Truth

They unpack how data scarcity and noisiness in materials science differ from internet-scale language data. Liam describes the limits of literature-derived values and why experiments are required to ground models in reality.

  7. Closed-Loop Discovery: Turning Experiments Into an Active Learning Engine

    Liam argues the breakthrough is not just collecting a dataset but building an interactive loop that decides what to test next. The system uses experimental results to find inconsistencies, reconcile sources, and guide subsequent experiments for faster discovery.

  8. Generalization Boundaries: When Quantum-Driven Models Transfer—and When They Don’t

    The conversation explores how far learned representations can generalize across scientific domains. Liam notes meaningful transfer within quantum-mechanical regimes, but limited transfer to domains governed by different abstractions like fluid dynamics.

  9. System Architecture: LLMs as Orchestration Layer + Specialized Atomic Models

    Liam describes Periodic’s architecture: language models provide a natural interface and planning layer, while specialized neural nets handle atomic symmetry and fast, low-latency predictions. The overall system routes between literature, experimental data, and domain tools.

  10. Commercialization Strategy: Intelligence Layer for Advanced Manufacturing

    Elad presses on go-to-market: why materials isn’t as universally “plug-and-play” as language. Liam positions Periodic initially as a software/intelligence layer for companies bottlenecked by materials and process engineering, with potential to evolve toward a discovery-style value capture model.

  11. If Periodic Works: “Generating Matter,” Faster Physical Progress, and a New Industrial Revolution

    Using sci-fi as a prompt (The Diamond Age), they discuss what a successful “atomic rearrangement” platform could mean for industries and daily life. Liam predicts meaningful speedups despite physical constraints, creating a step-change in pace for semiconductors, energy, aerospace, and manufacturing.

  12. Scaling the Lab: Multidisciplinary Teams, Capital/Compute, Hiring, and Robotics as the Next Interface

    Liam highlights multidisciplinary collaboration as essential, then details practical constraints: compute is expensive and often dominates cost, while physical automation has lead times and reliability challenges. They close on hiring needs and why better robotics would dramatically accelerate closed-loop experimentation and broader physical-world AI.
