
Groq Founder, Jonathan Ross: OpenAI & Anthropic Will Build Their Own Chips & Will NVIDIA Hit $10TRN
Jonathan Ross (guest), Harry Stebbings (host)
In this episode of The Twenty Minute VC, host Harry Stebbings sits down with Groq founder and CEO Jonathan Ross to discuss whether AI is a bubble, why NVIDIA could reach a $10T valuation, and why OpenAI and Anthropic will build their own chips.
Groq’s Jonathan Ross: Compute, Chips, and Energy Will Dictate AI’s Future
Jonathan Ross argues that AI is still in its early, value-creating phase, with insatiable demand for compute and a structural shortage of GPUs and energy. He believes hyperscalers and AI labs are rationally overspending because compute directly translates into better products, more revenue, and long‑term strategic survival. Ross predicts NVIDIA will remain dominant and could exceed a $10T valuation, even as OpenAI, Anthropic, and others build their own chips and alternative inference architectures like Groq’s emerge. He warns that nations and companies that fail to secure cheap energy and abundant compute—especially Europe—risk becoming economic backwaters in an AI-driven global economy.
Key Takeaways
Stop asking if AI is a bubble; follow where committed capital and hyperscalers are going.
Ross says major tech firms and nations are repeatedly increasing AI CapEx and already seeing real returns (e.g., …)
Compute is the binding constraint for leading AI labs, not demand.
He claims that if OpenAI or Anthropic doubled their inference compute, their revenue would almost double within a month, because today they are limited by rate caps, latency management, and lack of capacity—not by user willingness to pay.
Speed and low latency are deep economic moats, not UX niceties.
Drawing on CPG and web metrics, Ross notes that faster responses correlate with higher margins and conversions; making models instant or near-instant increases engagement, brand affinity, and competitive differentiation.
Vertical integration into chips is about supply security more than beating NVIDIA on performance.
OpenAI, Anthropic, and hyperscalers want their own chips largely to control allocation and de-risk NVIDIA’s HBM bottleneck and monopsony, even if their chips are slightly worse or more expensive, because predictability of supply is strategically invaluable.
Energy policy will determine which countries lead in AI.
Ross insists that ‘countries that control compute will control AI’ and compute is impossible without abundant, cheap energy; he argues allies could out‑energy China by siting data centers where renewables and nuclear are plentiful, but Europe in particular is moving too slowly.
Inference‑optimized architectures like Groq’s can expand total demand, not replace GPUs.
He believes Groq and other inference specialists will take large volume share on cost and availability, but this will likely increase, not reduce, demand for GPUs for training, reinforcing a virtuous cycle between training and inference.
AI will likely create deflation, new industries, and labor shortages—not mass permanent unemployment.
Ross expects AI to make goods and services cheaper, allow people to work less, and simultaneously spawn whole new categories of work, analogizing modern AI to the shift from 98% of US workers in agriculture to a diversified economy.
Notable Quotes
“The countries that control compute will control AI, and you cannot have compute without energy.”
— Jonathan Ross
“I would be surprised if in five years NVIDIA wasn’t worth 10 trillion.”
— Jonathan Ross
“If OpenAI or Anthropic were given twice the inference compute they have today, within one month their revenue would almost double.”
— Jonathan Ross
“CUDA lock‑in is bullshit. It’s true for training, but it’s not true for inference.”
— Jonathan Ross
“LLMs are the telescope of the mind… right now they make us feel really small, but we’ll eventually realize intelligence is more vast than we imagined—and think that’s beautiful.”
— Jonathan Ross
Questions Answered in This Episode
If compute is the primary bottleneck today, what specific breakthrough or policy change could most rapidly unlock additional capacity?
How realistic is Ross’s view that AI will cause labor shortages rather than structural unemployment, especially for non‑technical or low‑skill workers?
Can Europe practically overcome its permitting, energy, and political constraints fast enough to avoid becoming a ‘tourist economy’ in an AI-first world?
At what point do alternative inference architectures meaningfully shift pricing power away from NVIDIA, despite its entrenched brand and HBM supply advantage?
How should investors differentiate between durable, moat‑building AI infrastructure plays and short‑lived ‘vibe coding’ or application layer experiments that are easily subsumed by the labs?
Transcript Preview
The countries that control compute will control AI, and you cannot have compute without energy.
So I'm thrilled to welcome Jonathan Ross, founder and CEO at Groq, back to the hot seat.
And now we're going to be able to add more labor to the economy by producing more compute and better AI. That has never happened in the history of the economy before. What is that gonna do? I personally would be surprised if in five years NVIDIA wasn't worth 10 trillion, but I can't predict the outcome. The demand for compute is insatiable. If OpenAI were given twice the inference compute that they have today, if Anthropic was given twice the inference compute that they have today, within one month from now, their revenue would almost double.
I'm sorry, can you unpack that for me? (dramatic music) Ready to go? Jonathan, you've just been told by our team that our last show was the most successful of, uh, the year when it came out, so there's no pressure at all that this is gonna be the most successful of this year. But welcome to the studio, man. (laughs)
Thank you.
It's great to have you here, dude. Now, I, I wanted to start with an understanding of where we are. It seems the world moves faster than ever before, and honestly, I think a lot of us are trying to understand where everyone lies in a new market. If we look at the current state of the market today, how do you analyze it?
Are you asking is there a bubble?
Relatively.
Okay. So, (laughs) um, in terms of whether or not there's a bubble, uh, my answer is if you ask a question and you keep not getting an answer, maybe you should ask a different question. And so instead of asking, "Is there a bubble?" you should ask, "What is the smart money doing?" So what is Google doing? What is Microsoft doing? Amazon? What are some nations doing? And they're all doubling down on AI. They're spending more. Um, y- like, every time they make an announcement on how much they're spending, it goes up the next time. And one of the best examples of the value that's coming from this spend, Microsoft in one quarter deployed a bunch of GPUs and then announced that they weren't going to make them available in Azure because they made more money using them themselves than renting them out. So there's real money in the market, and the best way that I, I think to explain this market is like the early days of oil drilling, a lot of dry holes and a couple of gushers. I think the stat that I heard was, um, 35, uh, companies or 36 companies are responsible for 99% of the revenue, uh, or at least the token spend, um, in AI right now. Yeah. It's very lumpy. And so-
I'm surprised it's not less.