The Twenty Minute VC | Groq Founder, Jonathan Ross: OpenAI & Anthropic Will Build Their Own Chips & Will NVIDIA Hit $10TRN
At a glance
WHAT IT’S REALLY ABOUT
Groq’s Jonathan Ross: Compute, Chips, and Energy Will Dictate AI’s Future
- Jonathan Ross argues that AI is still in its early, value-creating phase, with insatiable demand for compute and a structural shortage of GPUs and energy. He believes hyperscalers and AI labs are rationally overspending because compute directly translates into better products, more revenue, and long‑term strategic survival. Ross predicts NVIDIA will remain dominant and could exceed a $10T valuation, even as OpenAI, Anthropic, and others build their own chips and alternative inference architectures like Groq’s emerge. He warns that nations and companies that fail to secure cheap energy and abundant compute—especially Europe—risk becoming economic backwaters in an AI-driven global economy.
IDEAS WORTH REMEMBERING
5 ideas
Stop asking if AI is a bubble; follow where committed capital and hyperscalers are going.
Ross says major tech firms and nations are repeatedly increasing AI CapEx and already seeing real returns (e.g., Microsoft earning more by using GPUs internally than renting them out), suggesting durable value despite lumpy outcomes and speculative noise.
Compute is the binding constraint for leading AI labs, not demand.
He claims that if OpenAI or Anthropic doubled their inference compute, their revenue would almost double within a month, because today they are limited by rate caps, latency management, and lack of capacity—not by user willingness to pay.
Speed and low latency are deep economic moats, not UX niceties.
Drawing on consumer packaged goods (CPG) and web conversion metrics, Ross notes that faster responses correlate with higher margins and conversions; making models instant or near-instant increases engagement, brand affinity, and competitive differentiation.
Vertical integration into chips is about supply security more than beating NVIDIA on performance.
OpenAI, Anthropic, and the hyperscalers want their own chips largely to control allocation and to de-risk NVIDIA's HBM bottleneck and its buying power over memory suppliers. Even if their chips are slightly worse or more expensive, predictability of supply is strategically invaluable.
Energy policy will determine which countries lead in AI.
Ross insists that ‘countries that control compute will control AI’ and compute is impossible without abundant, cheap energy; he argues allies could out‑energy China by siting data centers where renewables and nuclear are plentiful, but Europe in particular is moving too slowly.
WORDS WORTH SAVING
5 quotes
The countries that control compute will control AI, and you cannot have compute without energy.
— Jonathan Ross
I would be surprised if in five years NVIDIA wasn’t worth 10 trillion.
— Jonathan Ross
If OpenAI or Anthropic were given twice the inference compute they have today, within one month their revenue would almost double.
— Jonathan Ross
CUDA lock‑in is bullshit. It’s true for training, but it’s not true for inference.
— Jonathan Ross
LLMs are the telescope of the mind… right now they make us feel really small, but we’ll eventually realize intelligence is more vast than we imagined—and think that’s beautiful.
— Jonathan Ross