a16z

Marc Andreessen's 2026 Outlook: AI Timelines, US vs. China, and The Price of AI

a16z cofounder Marc Andreessen joins an AMA-style conversation to explain why AI is the largest technology shift he has experienced, how the cost of intelligence is collapsing, and why the market still feels early despite rapid adoption. The discussion covers how falling model costs and fast capability gains are reshaping pricing, distribution, and competition across the AI stack, why usage-based and value-based pricing are becoming standard, and how startups and incumbents are navigating big versus small models and open versus closed systems. Marc also addresses China’s progress, regulatory fragmentation, lessons from Europe, and why venture portfolios are designed to back multiple, conflicting outcomes at once.

Timestamps:

0:00 — Introduction
1:51 — What Inning Are We In? How Early the AI Shift Really Is
9:11 — Revenue Growth vs. Burn: Can AI Companies Scale Profitably?
15:52 — GPUs, Compute & Infrastructure: Shelf Life and Bottlenecks
24:23 — China, Open Source & the Global AI Race
32:46 — Policy & Regulation: State vs. Federal Dynamics
41:54 — AI Pricing Models: Usage-Based vs. Value-Based
47:10 — Open vs. Closed Models: Tradeoffs and Long-Term Winners
50:42 — Incumbents vs. Startups: Who Has the Advantage?
58:39 — a16z AMA: Disagree & Commit, Org Design, and Scaling Teams
1:08:44 — Jobs, Labor & How Society Adopts AI at Scale
1:15:50 — Lightning Round: Rapid-Fire & Fun Questions

Resources:

Follow Marc Andreessen on X: https://twitter.com/pmarca
Follow Jen Kha on X: https://twitter.com/jkhamehl

Stay Updated: If you enjoyed this episode, be sure to like, subscribe, and share with your friends!
Find a16z on X: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Listen to the a16z Podcast on Spotify: https://open.spotify.com/show/5bC65RDvs3oxnLyqqvkUYX
Listen to the a16z Podcast on Apple Podcasts: https://podcasts.apple.com/us/podcast/a16z-podcast/id842818711
Follow our host: https://twitter.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details, please see a16z.com/disclosures.

Marc Andreessen (guest) · Erik Torenberg (host)
Jan 7, 2026 · 1h 21m · Watch on YouTube ↗

CHAPTERS

  1. Why this AI wave is historically different (and still early)

    Marc frames AI as the biggest technological revolution of his lifetime—larger than the internet—with comparisons to electricity and the microprocessor. He argues we’re only a few years into the payoff of an 80-year arc of ideas (neural networks) that are finally working at scale, so today’s products are likely primitive compared to what’s coming.

  2. Revenue vs. burn: how AI companies can scale profitably

    The discussion tackles skepticism that AI revenue growth is offset by equally fast-growing costs. Marc splits the landscape into consumer and enterprise/infrastructure models, arguing both have unusually rapid adoption because distribution is piggybacking on the already-built internet.

  3. “Tokens by the drink” and AI’s deflationary cost curve

    Marc explains why usage-based pricing (tokens) has worked well for infrastructure providers and startups: it is low-friction and scales naturally with consumption. He emphasizes that unit costs are collapsing faster than Moore’s Law, suggesting expanding demand and improving economics as compute becomes cheaper and more abundant.

  4. GPUs, chips, and the infrastructure bottleneck (and why it won’t last)

    The conversation turns to GPU longevity, data center buildout, and the chip roadmap. Marc argues that current reliance on GPUs is partly historical accident; purpose-built AI chips, hyperscaler silicon, and new competition should reshape pricing and supply in coming years.

  5. Big models vs. small models: the cascading “pyramid” structure

    Marc outlines a dual-track future: a few “God models” at the top, with a large volume of smaller models cascading down into devices and embedded systems. He highlights how small models often catch up to big-model capability with a time lag, changing deployment economics.

  6. US vs. China in AI: open source, chips, and the two-horse race

    The episode frames AI as a geopolitical and economic contest primarily between the US and China, with global proliferation at stake. Marc discusses China’s model ecosystem (DeepSeek, Qwen, Kimi/Moonshot, others), chip catch-up efforts, and how this competition reshapes Washington’s policy posture.

  7. Policy & regulation: federal vs. 50-state chaos

    Marc argues the federal outlook has improved because policymakers don’t want to handicap the US against China, but states are introducing a flood of bills. He explains why fragmented state regulation is mismatched to interstate AI markets and discusses attempts (so far unsuccessful) to preempt state action.

  8. What “draconian” AI bills look like: EU AI Act and California SB 1047

    Marc uses Europe’s AI Act as a cautionary tale, arguing it has slowed deployment and withheld features from European users. He then describes California’s SB 1047 (vetoed) and highlights provisions that would have imposed downstream liability on open-source developers, potentially freezing research and startup activity.

  9. AI pricing models: usage-based vs. value-based (the trillion-dollar question)

    Marc explains why tokens-by-usage is powerful for infrastructure and early-stage builders, but may not be optimal for applications. He emphasizes pricing discipline: avoid cost-plus pricing when possible, and instead price to capture a portion of the customer’s realized value (labor replacement or productivity uplift).

  10. Open vs. closed models: why both may win

    Marc argues the open/closed debate is still unresolved: proprietary labs keep advancing, while open source keeps rapidly matching capabilities and spreading know-how. He highlights education and skill diffusion as a strategic advantage of open models, accelerating the supply of AI talent and builders.

  11. Incumbents vs. startups: from “GPT wrappers” to full-stack AI companies

    The conversation reframes the startup debate: application-layer companies aren’t just thin wrappers if they orchestrate many models, customize stacks, and sometimes build their own models. Marc cites fast catch-up dynamics (xAI, and multiple Chinese players) as evidence that leads may not be durable.

  12. a16z AMA: strategy, org design, and the cost of being outspoken

    Marc discusses how he and Ben operate—frequent debate but generally converging—and where tension does exist: the firm’s public footprint. He argues that clear, sometimes controversial public positioning attracts founders and educates policymakers, but creates real externalities that must be managed.

  13. Jobs, labor, and adoption: panic in polls vs. revealed preferences

    Marc places current AI job fears in a long historical pattern of technology panics, arguing society ultimately adapts. He emphasizes the divergence between what people say (surveys) and what they do (rapid adoption), predicting widespread normalization as AI proves practically valuable.

  14. Lightning round: beliefs, cryonics, reality distortion, and Mars

    In closing rapid-fire questions, Marc reflects on continually updating beliefs, skepticism about current cryonics, and practical checks on ego via market feedback and public criticism. He also downplays personal interest in going to Mars while expressing confidence that routine trips may become plausible.
