Elon Musk – "In 36 months, the cheapest place to put AI will be space"

Dwarkesh Podcast · Feb 5, 2026 · 2h 49m

Elon Musk (guest), Dwarkesh Patel (host), John (host)

Orbital data centers and space solar economics
Electricity as the near-term AI bottleneck
Turbine supply chain limits (vanes/blades casting)
Solar tariffs, permitting, and domestic manufacturing ramp
Terafab: logic, memory, and packaging at extreme scale
Grok alignment: truth-seeking, anti-lying, interpretability/debuggers
Optimus: dexterous hands, custom actuators, recursive manufacturing
China’s manufacturing advantage and US competitiveness
SpaceX scaling lessons: bottlenecks, urgency, engineering reviews
DOGE, government fraud, and governance risks from AI

In this episode of the Dwarkesh Podcast, Elon Musk argues that AI's coming bottleneck is power rather than chips, and that space is where it gets solved. The conversation spans orbital data centers, the terrestrial energy supply chain, chip manufacturing at extreme scale, alignment, humanoid robotics, and geopolitics.

Musk claims AI’s future bottleneck is power, solved in space

Musk’s central claim is that AI scaling will soon be constrained less by GPUs and more by electricity generation, grid interconnects, and physical infrastructure—especially outside China where power growth is flat. He predicts that within ~30–36 months, the cheapest place to run AI will be in space, citing always-on solar, no batteries, and higher solar effectiveness, and he sketches a path that ultimately involves lunar manufacturing and mass drivers.

He connects this to a broader “hardware wall” thesis: turbines are backlogged, solar is slowed by tariffs/permitting, and utilities move too slowly, so AI labs will hit a point where chips “pile up” but can’t be powered. In that world, competitive advantage shifts to whoever can scale hardware fastest—power plants, solar, rockets, chips, and factories.

The conversation then moves to alignment and product strategy: Musk frames xAI’s mission as “understand the universe,” emphasizing truth-seeking and interpretability-style ‘debuggers’ to detect errors or deception. He predicts near-term ‘digital human emulation’ (agents that can do anything a human at a computer can), and argues humanoid robots (Optimus) are the next super-exponential step—an ‘infinite money glitch’ via recursive manufacturing.

Finally, he discusses geopolitics and governance: China’s manufacturing/energy scale is portrayed as dominant absent US breakthroughs (robots + space scaling). He defends DOGE-style efforts as buying time against debt via cutting waste/fraud, and warns that government misuse is a major AI risk, advocating limited government as a key safeguard.

Key Takeaways

AI scaling will hit a power wall before it hits a chip wall (near-term).

Musk repeatedly argues that concentrated data-center compute is constrained by generation, cooling, margins, and slow utility processes; he predicts that by late this year, many large clusters will struggle simply to "turn the chips on." ...

Space becomes economically compelling when launch costs fall enough.

He claims space solar is ~5× more effective than ground solar (no atmosphere/clouds/seasons) and can be ~10× cheaper system-wide by eliminating batteries. ...
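The ~5× figure can be sanity-checked with rough, publicly known numbers. The sketch below uses illustrative assumptions (a 25% ground capacity factor, near-continuous sunlight in a suitable orbit, and the above-atmosphere vs. standard-test-condition irradiance ratio), not figures taken from the episode:

```python
# Rough sanity check of the "~5x more effective" space-solar claim.
# All constants are illustrative assumptions, not figures from the episode.

GROUND_CF = 0.25          # capacity factor of good terrestrial solar (night, clouds, seasons)
SPACE_CF = 0.99           # near-continuous sunlight in a suitable orbit
AM0_IRRADIANCE = 1361.0   # W/m^2 solar constant above the atmosphere
STC_IRRADIANCE = 1000.0   # W/m^2 panel rating condition at ground level

availability_gain = SPACE_CF / GROUND_CF            # ~4.0x: always-on vs. intermittent
irradiance_gain = AM0_IRRADIANCE / STC_IRRADIANCE   # ~1.36x: no atmospheric losses

effectiveness_ratio = availability_gain * irradiance_gain
print(f"space panel vs. ground panel: {effectiveness_ratio:.1f}x")  # ~5.4x
```

Under these assumptions the energy-per-panel ratio lands near 5×, consistent with the claim; the separate ~10× system-cost claim additionally depends on eliminating storage, which this sketch does not model.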

Terrestrial scaling is slowed by permitting, tariffs, and industrial backlogs.

He points to difficulty permitting massive solar deployments, large US solar import tariffs, and gas turbine shortages—especially specialized turbine vanes/blades casting capacity, with only a few global suppliers and backlogs to ~2030.

If space unlocks power, the next binding constraint becomes chips—especially memory.

Musk says once power is abundant (via space solar), logic and memory production become the limiter; he highlights memory as the harder path and cites soaring DDR prices as a signal. ...

The business endgame is ‘digital human emulation’ and then robots.

He predicts “digital human emulation” could be solved soon, enabling AI to replace/augment any computer-based job (e. ...

Optimus’ hardest problems are intelligence, the hand, and scaling manufacturing.

Musk claims the hand is harder than the rest of the robot combined and says Optimus requires custom actuators, motors, gears, electronics, and sensors with little existing supply chain. ...

Alignment approach: avoid ‘AI lying,’ prioritize truth-seeking, and build debuggers.

Musk argues “political correctness” can force models into contradictions that resemble ‘lying,’ risking instability; he cites HAL in *2001* as a cautionary tale. ...

Geopolitics: China dominates manufacturing scale unless the US gets a robotics breakthrough.

He uses electricity output as a proxy for industrial capacity and claims China’s is rapidly rising (including dominance in refining inputs such as gallium). ...

Management philosophy: speed comes from attacking the limiting factor with high urgency.

Musk describes weekly/twice-weekly deep engineering reviews, skip-level reporting, and open-ended problem-solving when needed. ...

Notable Quotes

My prediction is that… it will be by far the cheapest place to put AI… will be space, in thirty-six months or less, maybe thirty months.

Elon Musk

People are going to hit the wall big time on power generation. They already are.

Elon Musk

Reality… is the best verifier.

Elon Musk

Optimus [is] the infinite money glitch.

Elon Musk

I think, in the absence of… breakthrough innovations… China will… utterly dominate.

Elon Musk

Questions Answered in This Episode

What specific launch-cost threshold makes “space AI” cheaper than Earth-based power + cooling + land, and what assumptions (capex, radiator mass, servicing) are most uncertain?

Musk claims space solar is ~5× more effective and ~10× cheaper without batteries—what’s the explicit LCOE model behind that, including radiators, shielding, and downlink/latency constraints?

He predicts 10,000 Starship launches/year (≈1/hour) for 100 GW/year in space—what are the gating constraints: range safety, propellant production, pad turnaround, or vehicle lifetime?
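The figures in this question imply some fixed arithmetic worth making explicit. A minimal sketch, assuming a ~100-tonne Starship payload to LEO (an assumption, not a number stated in the question):

```python
# Arithmetic behind the question's figures: 10,000 launches/year
# delivering 100 GW/year of space solar capacity. Illustrative only.

LAUNCHES_PER_YEAR = 10_000
HOURS_PER_YEAR = 365 * 24                      # 8760
POWER_PER_YEAR_GW = 100.0
PAYLOAD_KG = 100_000.0                         # assumed Starship payload to LEO

launches_per_hour = LAUNCHES_PER_YEAR / HOURS_PER_YEAR        # ~1.14 per hour
mw_per_launch = POWER_PER_YEAR_GW * 1000 / LAUNCHES_PER_YEAR  # 10 MW per flight
w_per_kg = mw_per_launch * 1e6 / PAYLOAD_KG                   # 100 W/kg system-level

print(f"{launches_per_hour:.2f} launches/hour, "
      f"{mw_per_launch:.0f} MW/launch, {w_per_kg:.0f} W/kg")
```

Note that ~100 W/kg would have to cover the complete orbital system (panels, compute, radiators, structure), which is a demanding specific-power target and one of the assumptions the question's "gating constraints" probe at.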

If heat-shield reusability is Starship’s biggest bottleneck, what measurable milestone would convince you it’s solved (tile loss rate, turnaround hours, reflight count)?

On turbines: if vanes/blades casting is the choke point, what would an “internal blade program” actually require (metallurgy, single-crystal casting, QA), and what’s the realistic timeline?

Transcript Preview

Elon Musk

So are, are there really three hours of questions, or, or how's it-

John

Yeah.

Elon Musk

Are you fucking serious?

John

Yeah. [laughing] You don't think there's a lot to talk about, Elon?

Elon Musk

Holy shit, man. [laughing]

John

I mean, it's the most interesting point. All the storylines are kind of converging-

Elon Musk

Yeah

John

-right now, so we'll, we'll see how much-

Elon Musk

It's almost like I planned it.

John

Exactly. [laughing] Well, we're getting there.

Elon Musk

I would never do such a thing. [laughing]

Dwarkesh Patel

So as you know better than anybody else, uh, the total cost of ownership of a data center, only ten to fifteen percent is energy, and that's the part you're presumably saving by moving this into space. Most of it's the GPUs. If they're in space, it's harder to service them or you can't service them, and so the depreciation cycle goes down on them. So like, it's just way more expensive to have the GPUs in space, pr- presumably. What's the reason to put them in space?

Elon Musk

Um, well, the availability of energy is the issue. Um, so, uh, I mean, if you look at, at electrical output, um, outside of China, everywhere outside of China, it's more or less flat. It's very, you know, maybe a slight increase, but pr- pretty much flat. China has a rapid increase in el- in electrical output. But if you're putting data centers anywhere except China, where are you going to get your electricity, um, especially as you scale? Uh, the output of chips is growing, um, pretty much exponentially, but the output of electricity is flat. So where- how are you going to turn the chips on?

Dwarkesh Patel

Um, uh, you know-

Elon Musk

Magical power sources? Magical electricity fairies? [laughing]

Dwarkesh Patel

You, I mean, you're famously, you're, you're famously a big fan of solar. One terawatt of solar power, so with a twenty-five percent capacity factor, like four terawatts of solar panels, it's like one percent of the land area of the United States, and that's like far... In the- we're in the singularity when we've got one terawatt of data centers, right? Um, so what are we running out of exactly?

Elon Musk

How far into the singularity are you, though? [laughing]

Dwarkesh Patel

You tell me.

Elon Musk

Yeah, exactly. So, so I think, I think we'll, [chuckles] we'll find we're in the singularity and like, "Oh, okay, we've still got a long way to go." [laughing]

Dwarkesh Patel

But is this like a- is the plan to, like, put it into space after we've covered Nevada in solar panels?

Elon Musk

I think it's pretty hard to cover Nevada in solar panels. You have to get, like, permits from, like, the permits for... Try getting the permits for that. See what happens. [laughing]

Dwarkesh Patel

So space is really a reg- it's really a regulatory play. It's, like, harder to, harder to build on land than it is in space.

Elon Musk

It's, it's harder to scale, um, on the ground than it is to scale in space. Uh, but, but also, the, the- y- you're going to get about five times the, um, effectiveness of solar panels in space versus the ground, and you don't need batteries. Um, I almost wore my other shirt, which says, "It's always sunny in space," which it is. [laughing] So, um, because you don't have a day/night cycle or, uh, seasonality, uh, clouds, uh, or, or an atmosphere in space, uh, because the atmosphere alone, um, uh, results in about a thirty percent, uh, lo- loss of energy. Um, so, uh, so you can- for any given, uh, solar panels can do about five times more, uh, power in space than on the ground, and you avoid the cost of having batteries to carry you through the night. Uh, so it's, it's actually much cheaper to do it in space. And I, I- my prediction is that, um, it will be by far the cheapest place to put, uh, AI, will be space, in thirty-six months or less, maybe thirty months.
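Musk's "much cheaper" conclusion hinges on dropping batteries, not just on panel effectiveness. A minimal cost sketch using hypothetical round-number prices (none of these figures come from the episode), which deliberately ignores launch, radiators, and servicing — the costs Dwarkesh's total-cost-of-ownership objection targets:

```python
# Minimal sketch of why eliminating storage changes the cost per average
# delivered watt. All prices are hypothetical round numbers.

PANEL_COST_PER_W = 0.30     # $/peak-watt, assumed panel + structure cost
BATTERY_COST_PER_WH = 0.15  # $/Wh, assumed pack-level storage cost
STORAGE_HOURS = 16          # to carry a ground site through the night

GROUND_CF = 0.25            # ground capacity factor (night, clouds, seasons)
SPACE_CF = 0.99             # near-continuous sunlight in orbit

def cost_per_avg_watt(cf, storage_hours=0.0):
    """$ per watt of average (not peak) delivered power."""
    capex = PANEL_COST_PER_W + storage_hours * BATTERY_COST_PER_WH
    return capex / cf

ground = cost_per_avg_watt(GROUND_CF, STORAGE_HOURS)  # ~$10.80/W_avg
space = cost_per_avg_watt(SPACE_CF)                   # ~$0.30/W_avg
print(f"ground ${ground:.2f}/W_avg vs. space ${space:.2f}/W_avg")
```

Under these assumptions the panels-plus-storage comparison favors space by a wide margin; whether that survives adding launch, radiator mass, and the shortened GPU service life raised earlier in the conversation is exactly what the claim leaves unmodeled.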
