a16z: The Future of Software Development - Vibe Coding, Prompt Engineering & AI Assistants
At a glance
WHAT IT’S REALLY ABOUT
AI models reshape infrastructure, developer tools, and software creation dynamics
- Infrastructure is defined as “the stuff you use to build the stuff,” bought by technical users (developers, IT, data scientists), spanning compute/networking/storage plus tools and now AI models.
- AI models function as a fourth infrastructure pillar because they impose new compute/data/latency requirements and—most importantly—shift software from programmer-specified logic to “abdicated logic” where systems ask models to produce answers.
- Past supercycles (cloud, mobile, COVID-driven PLG acceleration) show that lowering marginal costs expands TAM, creates new user behaviors, and opens whitespace for startups—dynamics the panel argues are happening again in AI.
- Defensibility in AI/infra is messy in the current “expansion/Brownian motion” phase, but the panel expects consolidation into durable oligopolies/monopolies by layer rather than total commoditization, with high switching costs persisting for infra.
- Near-term traction is strongest where error-correction loops exist (e.g., coding agents), while broader agents and synthetic data remain debated; “prompt engineering” is reframed as “context engineering,” creating new infra needs around data pipelines, observability, and guarantees.
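The error-correction loop mentioned above (generate, verify, feed failures back) can be sketched schematically. A minimal Python sketch, where `generate_candidate` is a hypothetical stub standing in for a real model call, not an actual LLM client:

```python
# Minimal sketch of the error-correction loop that makes coding agents
# tractable: generate a candidate, check it, feed failures back, retry.

def generate_candidate(task: str, feedback: str) -> str:
    # Hypothetical stub; a real agent would prompt an LLM with the
    # task description plus feedback from the previous failed attempt.
    if "previous attempt failed" in feedback:
        return "def add(a, b):\n    return a + b\n"
    return "def add(a, b):\n    return a - b\n"  # first try is buggy

def check(code: str) -> str:
    # The verifier: run the generated code against known tests.
    namespace: dict = {}
    exec(code, namespace)
    if namespace["add"](2, 3) != 5:
        return "previous attempt failed: add(2, 3) != 5"
    return ""  # empty feedback means success

def agent_loop(task: str, max_iters: int = 3) -> str:
    feedback = ""
    for _ in range(max_iters):
        code = generate_candidate(task, feedback)
        feedback = check(code)
        if not feedback:
            return code  # verified solution
    raise RuntimeError("no verified candidate within budget")
```

The key property is that the verifier (here, a unit test) gives the loop a ground truth to converge on, which is why coding shows near-term traction while open-ended agent tasks remain harder.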
IDEAS WORTH REMEMBERING
5 ideas
AI models qualify as infrastructure because they change the programming model.
The panel’s test for “new infra” is whether it forces rethinking how software is built (latency, memory, data center design, chips) and how developers program; models meet that bar by introducing non-determinism and new interface patterns.
The biggest shift is “abdicating logic” from applications to models.
Historically, developers delegated resources (compute/storage) but not the yes/no logic; with LLMs, software increasingly asks the model to generate answers, pushing teams to redefine what programming and specification mean.
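The contrast can be made concrete with a minimal Python sketch. The `ask_model` function below is a hypothetical stub standing in for any hosted LLM API call; the names and the refund example are illustrative assumptions, not from the source:

```python
# Traditional approach: the developer specifies the yes/no logic explicitly.
def is_refund_eligible_rules(days_since_purchase: int, opened: bool) -> bool:
    # Deterministic, auditable, fully programmer-specified.
    return days_since_purchase <= 30 and not opened

# "Abdicated logic": the application hands the decision to a model.
def ask_model(prompt: str) -> str:
    # Hypothetical stub standing in for a real LLM API call;
    # a production system would send `prompt` to a hosted model.
    return "yes"

def is_refund_eligible_llm(days_since_purchase: int, opened: bool) -> bool:
    prompt = (
        f"A customer bought an item {days_since_purchase} days ago "
        f"and the package {'was' if opened else 'was not'} opened. "
        "Should they get a refund? Answer yes or no."
    )
    # The decision now depends on the model's answer, which in practice
    # is non-deterministic and unspecified by the programmer.
    return ask_model(prompt).strip().lower().startswith("yes")
```

The first function delegates nothing; the second delegates the decision itself, which is the shift the panel calls abdicating logic.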
“Low-code” is arriving via natural language—expanding who can prototype software.
They argue AI finally fulfills the low-code promise by making natural language a practical interface for building and iterating, enabling more semi-technical users to prototype while still requiring professional rigor for reliable systems.
Don’t default to zero-sum thinking in infra during expansion phases.
In fast-growing supercycles, multiple layers can grow simultaneously (chips, clouds, models, apps); the panel expects later consolidation by layer into oligopolies/monopolies that still preserve margins rather than universal commoditization.
Infra defensibility often comes from integration and switching costs, not just “model uniqueness.”
Even “just an API” embeds logic throughout systems, making switching expensive; additionally, classic infra moats—deep expertise and years-long engineering (e.g., databases like DuckDB)—still matter in AI-era infra.
WORDS WORTH SAVING
5 quotes
Infrastructure never goes away, it just gets layered.
— Jennifer Li
A new piece of infrastructure changes the way that you program computers and it changes the stack that's around it. We're building systems to build other systems.
— Martin Casado
I don't remember ever in, like, the history of computer science where we've, like, from an application standpoint, we've abdicated logic.
— Martin Casado
One of the most exciting thing about the AI wave is like software's being disrupted, like we're being disrupted, right?
— Martin Casado
You can't anthropomorphize these models, right? A, a model is a file on a hard drive in a computer somewhere, and when you run a Python script, you can transform one piece of data into another piece of data.
— Matt Bornstein
High quality AI-generated summary created from speaker-labeled transcript.