The Twenty Minute VC

Eiso Kant, CTO @Poolside: Raising $600M To Compete in the Race for AGI | E1211

Eiso Kant is the Co-Founder and CTO of Poolside, building next-generation AI for software engineering. Just last week, Poolside announced their $500M Series B, valuing the company at $3BN. Prior to Poolside, Eiso founded Athenian, a data-enabled engineering platform. Before that, he built source{d}, the world’s first company dedicated to applying AI to code and software.

-----------------------------------------------

Timestamps:

(00:00) Intro
(00:53) What is Poolside?
(04:42) Capturing Iterative Thinking in Uncharted Data
(08:58) The Biggest Bottleneck in AI Progress: Compute, Data, or Models?
(12:49) The Value of Synthetic Data
(15:45) Scaling Laws in AI
(18:21) Projecting Model Costs Over the Next 12-24 Months
(22:10) Future of Model Distillation
(29:36) Does Cash Directly Correlate to Compute Access?
(31:35) Eiso’s Perspective on Larry Ellison’s $100B Foundation Model Entry Point
(36:50) Eiso’s Outlook on Nvidia's Dominance and the Future of Compute
(38:51) Has Innovation Stalled Awaiting Nvidia's Blackwell?
(46:06) OpenAI, Anthropic, or X.ai — Which to Buy and Why?
(51:00) Comparing Crypto & AI: Decentralization vs. Centralization
(55:23) The Decision to Stay Europe-Based
(59:01) Work Ethic & Work-Life Balance
(01:04:53) Is China 2 Years Behind Europe?
(01:06:48) Quick-Fire Round

-----------------------------------------------

In Today’s Episode with Eiso Kant We Discuss:

1. Raising $600M to Compete in the AGI Race:

What is Poolside? How does Poolside differentiate from other general-purpose LLMs?
How much of Poolside’s latest raise will be spent on compute?
How does Eiso feel about large corporates being a large part of startup LLM providers’ funding rounds? Why did Poolside choose to only accept investment from Nvidia?
Is $600M really enough to compete with the mega war chests of other LLMs?

2. The Big Questions in AI:

Will scaling laws continue? Have we reached a stage of diminishing returns in model performance for LLMs?
What is the biggest barrier to the continued improvement in model performance: data, algorithms, or compute?
To what extent will Nvidia’s Blackwell chip create a step-function improvement in performance?
What will OpenAI’s GPT-5 need to have to be a game-changer once again?

3. Compute, Chips and Cash:

Does Eiso agree with Larry Ellison: “you need $100BN to play the foundation model game”? What does Eiso believe is the minimum entry price?
Will we see the continuing monopoly of Nvidia? How does Eiso expect the compute landscape to evolve?
Why are Amazon and Google best placed when it comes to reducing cost through their own chip manufacturing?
Does Eiso agree with David Cahn @ Sequoia: “you will never train a frontier model on the same data centre twice”?
Can the speed of data centre establishment and development keep up with the speed of foundation model development?

4. WTF Happens to The Model Layer: OpenAI and Anthropic…

Does Eiso agree we are seeing foundation models become commoditised?
What would Eiso do if he were Sam Altman today? Is $6.6BN really enough for OpenAI to compete against Google, Meta etc.?
OpenAI at $150BN, Anthropic at $40BN, and X.ai at $24BN: which would Eiso choose to buy, and why?

-----------------------------------------------

Subscribe on Spotify: https://open.spotify.com/show/3j2KMcZTtgTNBKwtZBMHvl?si=85bc9196860e4466
Subscribe on Apple Podcasts: https://podcasts.apple.com/us/podcast/the-twenty-minute-vc-20vc-venture-capital-startup/id958230465
Follow Harry Stebbings on Twitter: https://twitter.com/HarryStebbings
Follow Eiso Kant on Twitter: https://twitter.com/eisokant
Follow 20VC on Instagram: https://www.instagram.com/20vchq
Follow 20VC on TikTok: https://www.tiktok.com/@20vc_tok
Visit our Website: https://www.20vc.com
Subscribe to our Newsletter: https://www.thetwentyminutevc.com/contact

-----------------------------------------------

#20vc #harrystebbings #eisokant #poolside #ai #software #venturecapital #founder #nvidia #openai #anthropic #meta

Eiso Kant (guest) · Harry Stebbings (host)
Oct 6, 2024 · 1h 19m · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Poolside’s $600M Bet: Synthetic Code Data And Compute For AGI

  1. Eiso Kant, co-founder and CTO of Poolside, explains how the company is using $600M in funding and a 10,000‑GPU cluster to compete in the global race toward AGI, starting with AI for software development.
  2. Poolside’s core thesis is that the main frontier advantage now is not algorithms but data—especially synthetic, reinforcement‑learning data generated from code execution in largely deterministic environments.
  3. Kant argues that compute scale is the entry ticket, but real differentiation comes from proprietary data, applied research, and talent, and that software development will be the first major economically valuable domain to reach near‑human‑level AI capability.
  4. He also discusses the broader AI landscape—hyperscalers, chip ecosystems, China, regulation, and consolidation—while framing AGI as a multi‑decade race where missteps in capabilities or go‑to‑market can permanently knock a company out.

IDEAS WORTH REMEMBERING

5 ideas

Focus on domains where you can simulate feedback to generate massive high‑quality data.

Poolside targets coding because code execution is near‑deterministic; they can run models in huge codebases, execute outputs, and use test results as an objective signal to create synthetic data for both answers and intermediate reasoning.
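Poolside’s actual pipeline is not public, but the idea described above can be sketched in a few lines: generate candidate solutions with a model, execute each candidate against its tests, and keep only the ones execution confirms. Everything here (`passes_tests`, `build_synthetic_dataset`, the task/candidate shapes) is an illustrative assumption, not Poolside’s method.

```python
import os
import subprocess
import sys
import tempfile


def passes_tests(candidate_code: str, test_code: str) -> bool:
    """Run the candidate plus its tests in a subprocess.

    The pass/fail exit code is the objective signal that code
    execution provides "for free" in a near-deterministic domain.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(candidate_code + "\n" + test_code + "\n")
        path = f.name
    try:
        result = subprocess.run([sys.executable, path], capture_output=True, timeout=10)
        return result.returncode == 0
    finally:
        os.remove(path)


def build_synthetic_dataset(tasks, generate_candidates):
    """Keep only model outputs that execution confirms are correct."""
    dataset = []
    for task in tasks:
        for candidate in generate_candidates(task["prompt"]):
            if passes_tests(candidate, task["tests"]):
                dataset.append({"prompt": task["prompt"], "solution": candidate})
    return dataset
```

In a real system the `generate_candidates` function would be a model sampling many solutions per prompt; the filter turns cheap, noisy generations into a verified training set.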

Compute is table stakes; differentiation comes from proprietary data and applied research.

Everyone is improving algorithms and hardware efficiency, but Kant argues the real moat is in unique datasets (especially synthetic) plus specialized reinforcement‑learning methods, built and iterated by top talent.

Large models are trained for capability; smaller models are distilled for economics.

Frontier labs increasingly train very large, expensive models to reach new capability frontiers, then distill their behavior into smaller, cheaper models that are actually deployed at scale to customers.
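The episode does not detail how any lab implements distillation; a minimal NumPy sketch of the standard knowledge-distillation objective (KL divergence between temperature-softened teacher and student distributions, after Hinton et al.) shows the mechanism being described:

```python
import numpy as np


def softmax(logits, temperature=1.0):
    """Temperature-softened softmax; higher T spreads probability mass."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the student's.

    The student learns to match the teacher's full output distribution,
    which carries more signal per example than hard labels alone.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    # Scale by T^2 so gradient magnitude stays comparable across temperatures.
    return float(np.mean(kl) * temperature**2)
```

The economics follow from this: the expensive frontier model is run once as the teacher, and the cheap student that matched its distribution is what serves customer traffic.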

Synthetic data only works when paired with a reliable “oracle of truth.”

Having models generate their own training data is useless unless there’s an external signal—like code execution results or human preference labels—to tell the system which outputs are better or correct.

The early AI era is a true race; missteps in capability or GTM can be fatal.

Kant frames AGI as unlike most startups: if Poolside stumbles on model capabilities or go‑to‑market while others advance, they can fall irrecoverably behind, so sustained intensity and focus are non‑negotiable.

WORDS WORTH SAVING

5 quotes

If you can simulate it, you can actually build an extremely large dataset.

Eiso Kant

If you don’t have the compute, you’re not in the race.

Eiso Kant

Most startups are against yourself. But AGI is a race.

Eiso Kant

The world has far more demand for GPU‑like compute than supply that’s available.

Eiso Kant

We are not building the Terminator; we’re building tools that are closing this gap between human capabilities and machine intelligence.

Eiso Kant

Poolside’s mission: AGI via best-in-class AI for software development
Synthetic data and reinforcement learning from code execution feedback
Compute, data, algorithms, and talent as the four pillars of the capabilities race
Scaling laws, model size, and distillation economics
Global compute supply, chips (NVIDIA, TPUs, Trainium, Blackwell) and data centers
Market structure: hyperscalers, frontier labs, consolidation, and China’s position
Talent strategy, European footprint, and cultural expectations around “race” intensity
Regulation, centralization vs decentralization, and societal implications of AGI

High quality AI-generated summary created from speaker-labeled transcript.
