
Aidan Gomez: What No One Understands About Foundation Models | E1191
Aidan Gomez (guest), Harry Stebbings (host), Narrator
In this episode of The Twenty Minute VC, host Harry Stebbings speaks with Aidan Gomez, cofounder and CEO of Cohere and coauthor of the Transformer paper, about the economics, technical progress, and product landscape of large AI foundation models.
Aidan Gomez Explains Foundation Models, Data, and AI’s Real Future
He argues that while scaling compute reliably improves models, the real frontier is data quality, new methods for reasoning, and efficient smaller models tailored to enterprises.
Gomez predicts a world of multiple horizontal and vertical models, falling inference costs, tight margins at the model layer, and major value capture at both the chip and application layers.
He is optimistic about AI’s role in productivity, agents, robotics, and copilots for workers, while dismissing doomsday scenarios and emphasizing trust, privacy, and deployment models for enterprises.
Key Takeaways
Scaling models with more compute works, but it’s inefficient and economically constrained.
Bigger models almost always perform better, but each incremental gain requires exponentially more compute and cost; this favors tech giants unless startups differentiate via data, algorithms, and efficiency.
High-quality and synthetic data are now the primary drivers of model improvement.
Open-source gains largely come from better data filtering, weighting, and synthetic generation; models are extremely sensitive to data quality, making curation and task-specific datasets a competitive edge.
We’re heading toward a multi-model world combining large general models and small specialized ones.
Teams prototype with powerful general models, then distill or fine-tune down to smaller, cheaper models optimized for specific tasks or verticals, creating an ecosystem rather than a single-model monopoly.
The model API layer is becoming commoditized, with margins squeezed by price cuts and open source.
With OpenAI cutting prices aggressively and Meta releasing strong open models for free, selling "just models" will be a low-margin business; durable value is more likely at the chip and application/product layers.
Enterprise adoption hinges on trust, privacy, and deployment flexibility—not just raw capability.
Large customers resist training on their data and fear IP leakage, so vendors must support private deployments (e.g., …).
Hallucinations are falling and can be sharply reduced via RAG, but will never reach zero.
Enterprises overweight the risk of model errors relative to how often humans err; techniques like Retrieval-Augmented Generation (RAG) dramatically lower hallucinations by grounding answers in citeable documents.
The near future will be defined by agents, robotics, and broad workplace augmentation, not replacement.
Gomez expects major advances in reasoning, planning, and robotics, enabling autonomous agents and humanoid robots, but believes the overall effect will be productivity gains and new roles, not mass unemployment.
Notable Quotes
“There’s no market for last year’s model.”
— Aidan Gomez
“It’s definitely true that if you throw more compute at the model, if you make the model bigger, it’ll get better. It’s also the dumbest way to improve models.”
— Aidan Gomez
“Pretty much all of the major gains that we’ve seen in the open source space have come from data improvements.”
— Aidan Gomez
“If you’re only selling models, it’s going to be a really tricky game… it’s going to be like a zero-margin business.”
— Aidan Gomez
“You might want your children to be speaking to an extremely empathetic, extraordinarily intelligent and knowledgeable, safe intelligence that can teach them things and doesn’t get tired of them.”
— Aidan Gomez
Questions Answered in This Episode
How can a new startup realistically differentiate at the model layer in a world of falling prices and strong open-source models?
What concrete techniques or workflows are most effective for generating high-value synthetic data without simply copying larger proprietary models?
How should an enterprise decide between building its own vertical model, fine-tuning an open model, or relying on a general-purpose API?
What specific reasoning and planning capabilities does Gomez expect to see in production models over the next 12–24 months, and how will they change products?
Where are the most promising opportunities for new consumer products that leverage agents and voice, beyond chat-style interfaces?
Transcript Preview
The reality of the matter is, there's no market for last year's model. It's definitely true that if you throw more compute at the model, if you make the model bigger, it'll get better. For folks who have a lot of money, that's a really compelling strategy. I think we'll continuously exist in a world of multiple models, some focused and verticalized, others completely horizontal. There's gonna be a, a consolidation in the space, for sure. It's really dangerous when you make yourself a subsidiary of your cloud provider.
Ready to go? (instrumental music plays) Aidan, I am so excited for this. So I was going through the prep, uh, first, before I was writing the schedule, and I was thinking, "Well, where do we start?" And then I saw one of the notes, and Aidan, it says that you grew up or were brought up in rural Ontario in a house your grandfather or father built by hand. What was that like as a starting point, and can you take me there?
Yeah. I, uh, I grew up in the middle of, you know, nowhere in Ontario. It was a big, uh, 100 acre lot which had a ... I- it's all forested, and it's a maple forest. And so it was super cool to grow up in, like, the most Canadian environment ever. But i- it was very distant from technology, for sure.
But you loved gaming, didn't you?
I did love gaming. I did love gaming. Um, so I love technology from, from scratch. It's just it was really hard to access it. Like, we couldn't get internet. Uh, we could do dial-up, but I had dial-up for years after people had gotten high-speed internet.
(laughs)
Um, and so all my friends, you know, they were online gaming, doing all this sort of stuff. And I was just so jealous. Or, or not jealous, but just, like, missing out on this wave of technology, the internet that was coming about and becoming popular. So it made me obsessed with tech. I would, like, sit at home with our computer with shitty, uh, dial-up internet, and I would just try to make it faster. I would try to make the most out of what I did have. And eventually, that led to wanting to learn how to code and understand how the web works and, you know, can I make this stuff faster? Can I load the internet faster? 'Cause I was watching pixels go line by line. And that's really what pushed me into CS, just sort of, like, forced to learn how this tech works so that I could get more out of it.
There's this weird understanding that I have now from meeting so many incredible founders, and it's this incredibly high correlation between those that gamed in their early years and those that achieved success. I just ... Why do you think gaming is such a contributor to successful founders?