Cerebras CEO, Andrew Feldman on Why Raise $1BN and Delay the IPO & Why NVIDIA’s Worried About Growth

The Twenty Minute VC · Oct 6, 2025 · 1h 20m

Andrew Feldman (guest), Harry Stebbings (host)

Topics covered:

- Cerebras’ $1B pre-IPO raise and financing strategy with Fidelity and others
- Unprecedented AI demand, capacity planning under extreme uncertainty, and ‘options on the future’
- NVIDIA’s dominance, growth limits, pricing power, and incumbent defensive tactics
- Chip economics: depreciation cycles, performance plateaus, memory bottlenecks, and wafer-scale SRAM
- Bottlenecks in AI: talent, fabs (TSMC/Samsung), data centers, power infrastructure, and capital
- Vertical vs horizontal AI stacks, custom chips for model labs, and why software firms struggle at silicon
- Geopolitics and policy: US–China AI race, Gulf-region buildout, immigration, and university compute
- Labor, education, and productivity: how AI will reshape entry-level work and learning over time

In this episode of The Twenty Minute VC, Harry Stebbings speaks with Andrew Feldman, CEO of Cerebras, about the company’s $1B pre-IPO raise, the decision to delay going public, and why NVIDIA’s growth has limits.

Cerebras CEO: Beating NVIDIA, Billion-Dollar Bets, And AI’s Future

Andrew Feldman, CEO of Cerebras, explains why the company raised a $1B pre-IPO round, emphasizing capital to scale manufacturing, data centers, and ambitious AI hardware R&D while delaying going public. He argues that AI demand is vastly underestimated, with customers unable to forecast usage even within an order of magnitude, making capacity ‘options on the future.’

Feldman offers a critical view of NVIDIA’s dominance, predicting limits to its growth and describing classic incumbent behaviors like buying business with investments and using pre-announcements to slow competitors. He dives into chip economics—depreciation, memory bottlenecks, wafer-scale design, and margins—framing Cerebras as faster on both training and inference, but acknowledging software and adoption frictions.

Beyond chips, he explores systemic constraints in AI: talent shortages, fabrication and data center bottlenecks, power-location mismatches, and underinvestment in unglamorous areas like data pipelines. He also reflects on geopolitical dynamics (US–China, Gulf states), immigration, and how AI will diffuse slowly but deeply into productivity, education, and white-collar work.

Throughout, Feldman underscores the value of extraordinary talent, the risks of mispriced concentration in Mag 7 stocks, and the reality that building AI hardware is a long-horizon, brutally hard, experience-driven game where most newcomers and naïve capital will be wiped out.

Key Takeaways

Use late-stage capital to buy strategic flexibility, not just runway.

Cerebras’ $1B raise, led by Fidelity, is about scaling manufacturing, adding data centers, and funding non-incremental R&D while retaining the option to IPO later; securing blue-chip public investors pre-IPO also sends a strong signal to future public-market investors.

Treat AI infrastructure commitments as options on an uncertain future.

Customers requesting anywhere from 5–40M queries per second show that demand forecasts are off by orders of magnitude; large ‘up to’ multi-year deals are less firm orders and more capacity options in a rapidly shifting environment.

Depreciation of AI chips hinges on solution-level speedups, not marketing specs.

Feldman stresses that useful chip life depends on how much faster complete systems (including memory bandwidth and power efficiency) become; modest 2–2. ...
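The depreciation argument above can be made concrete with a toy model (my illustration, not from the episode): if each new chip generation delivers a uniform system-level speedup at similar cost, an older part’s relative cost-performance value falls in proportion to how many generations it trails. The function name and the specific speedup figures below are assumptions for the sketch.

```python
# Toy depreciation model: a chip's economic value tracks its cost-performance
# relative to the newest generation. If each product cycle delivers a uniform
# `speedup`x system-level improvement (memory bandwidth and power included),
# the old part retains 1 / speedup**n of its relative value after n cycles.

def relative_value(generations_behind: int, speedup: float) -> float:
    """Fraction of original cost-performance value retained after
    `generations_behind` product cycles, assuming each cycle is a
    uniform `speedup`x system-level improvement at similar cost."""
    return 1.0 / (speedup ** generations_behind)

# With modest ~1.5x generational gains, a two-generation-old chip
# retains ~44% of its relative value; with 2.5x gains, only 16%.
print(relative_value(2, 1.5))  # ≈ 0.444
print(relative_value(2, 2.5))  # = 0.16
```

The point of the sketch is Feldman’s: the steeper the generational speedup, the faster installed hardware loses value, so depreciation schedules hinge on real system-level gains rather than headline specs.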

Memory, not raw FLOPS, is the key bottleneck in AI hardware.

GPU architectures are constrained by slow high-capacity HBM, so Cerebras built wafer-scale chips packed with fast SRAM to keep models on-chip; the move sounds obvious but had been technically impossible for 75 years and required solving a fundamental manufacturing problem.

Expect incumbent behavior from NVIDIA: buying demand and pre-announcing roadmaps.

Feldman interprets NVIDIA’s massive strategic investments (e. ...

The biggest AI constraints are human expertise, fabrication capacity, and power siting.

There aren’t enough skilled AI practitioners or data engineers, fabs like TSMC and Samsung can’t build plants fast enough, and the US has plenty of power but in the wrong places relative to fiber and people; each of these creates friction in scaling AI.

Extraordinary technical talent is worth extreme compensation; mediocrity is what kills companies.

Feldman argues that no company goes bankrupt overpaying truly exceptional people, but many die overpaying mediocre ones; in AI, a single world-class scientist or architect can create value that thousands of ordinary engineers cannot match.

Software-first AI labs will struggle to build competitive chips from scratch.

History shows many software giants fail at silicon because chip development requires multi-year horizons, ‘measure twice, cut once’ culture, and deep hardware experience that conflicts with agile software norms; Apple, Amazon, and Google succeeded mainly via acquisitions and dedicated, insulated teams.

AI’s economic impact will be slow-burn but profound, especially in education and entry-level work.

Like electrification and early computers, AI will take years before major productivity jumps show up in statistics; Feldman expects big changes in how we teach (error-driven, personalized curricula) and in what entry-level white-collar roles look like as rote spreadsheet and synthesis work is automated.

Mag 7 concentration turns ‘diversified’ indices into hidden sector bets.

With a huge share of the S&P 500 now concentrated in a handful of AI-levered tech giants, investors who think they hold broad market exposure may actually be overexposed to one sector; the real risk is misperception and mispriced concentration, not the absolute value of those firms.

Notable Quotes

There is unbelievable demand and nobody knows where it will go in the future.

Andrew Feldman

If NVIDIA keeps growing at the rate they're currently growing, 11 years from now, everybody on Earth works for them.

Andrew Feldman

The question of depreciation is how much faster are future generations than the current generation? That's the actual question on depreciation.

Andrew Feldman

No company ever went bankrupt by paying extraordinary people too much. If you want to go bankrupt, pay mediocre people too much.

Andrew Feldman

We have just solved the problem that for 75 years the smartest people in our industry had been unable to solve. And we have done it.

Andrew Feldman

Questions Answered in This Episode

How sustainable is NVIDIA’s current pricing power and margin structure as more players like Cerebras, AMD, and hyperscalers ramp their own silicon?

In an environment where demand is so uncertain, how should enterprises decide how much ‘optionality’ to buy in AI infrastructure versus waiting for cheaper, better generations?

What specific policy changes around immigration, university compute, and power infrastructure would most effectively unlock AI innovation in the US?

Given the difficulty software companies have historically had building chips, what realistic models exist for OpenAI or Anthropic to secure long-term hardware sovereignty?

Which ‘unsexy’ parts of the AI stack—like data cleaning and pipelines—are most likely to become major value pools or moats over the next five years?

Transcript Preview

Andrew Feldman

Things are moving at a rate that six, eight, 12 months out, everybody's unsure. It's so fast, it's so big. There is unbelievable demand and nobody knows where it will go in the future. The question of depreciation is how much faster are future generations than the current generation? That's the actual question on depreciation. People often say we don't have enough power in the US, and this is strictly wrong. We have plenty of power. It's in the wrong places. Risk comes in financial markets where people fundamentally underestimate risk. No company ever went bankrupt by paying extraordinary people too much.

Harry Stebbings

Ready to go? Andrew, dude, it is so lovely to have you back on. I so enjoyed our first show. You put up with my naive questions enough to agree to do a round two. Man, I must be charming. (laughs)

Andrew Feldman

H- h- Harry, I'm, uh, uh, I'm okay with any questions, naive or otherwise. So I'm, uh, ha-happy to do it anytime. I, I read your, your, uh, your LinkedIn posts, your Twitter posts. Uh, I, uh, I'm rooting for your mom. I mean, it... All good. All good.

Harry Stebbings

Dude, you are too kind. Uh, listen, I wanna start with the billion dollar raise that you just announced yesterday. Um, can you just talk to me about the billion dollar raise, why it's important, why now, and what it means for the company?

Andrew Feldman

Look, it was the largest raise ever done in, in our category. Uh, it was done at the highest valuation and with the, the premier investors. So, uh, at, at late stage investing, you're looking for, uh, the likes of Fidelity. They are the, uh... What would the English call it? The sort of Oxford or Cambridge of investing, right? (laughs) I mean, they, they are the, uh, the premier, uh, public market investors. And when they choose to lead a round, uh, it, it brings the, uh, Wall Street a great deal of confidence. And so, uh, we were really happy to partner with them and with Atreides to lead the round, and then we, uh, were able to get enormous participation from Tiger Global, from Valor, from 1789. So that's point one. I think point two is that, um, w- we, we now have sort of the dry powder to, uh, to really push and to take the opportunities in front of us, uh, to build out our manufacturing to, to the scale and scope we want, to add new data centers. We added five this year in the US to add more data centers. And we have more big ideas, right? I think incremental improvements, uh, uh, m- make-believe gains achieved by dropping from, from, you know, 8-bit to 4-bit, uh, th- those aren't gonna get us to, uh, to the promised land in AI. We, we need... We've got real work to do as a community. And I, I, I think this, this funding puts us in the catbird seat for that.
