The Twenty Minute VC | Sam Altman, Arthur Mensch and more discuss: Which Startups Are Threatened vs Enabled by OpenAI? | E1156
- 0:00 – 15:00
- Sam Altman
There was a time when there were, like, more than a hundred car companies in the US, I believe, or at least close to that. And if you go, like, look at some of the old media at the time, it was like, "No, there's this better car. No, there's this better one. No, there's this better one." I think that same thing holds true for most new industries. I think it's fine. I mean, I think it's probably good. But I don't think that's where the enduring value will be. I think eventually it will shake out. There will be a small number of providers, just a dozen, something like that, doing models at big scale, and it'll be extremely complex, extremely expensive. And I hope we all continue to push each other to make the models better, cheaper, faster, and commoditize in that sense. And the long-term differentiation will not be, I don't think, the base model. Like, that's just, you know, intelligence is just, like, some emergent property of matter or something. The long-term differentiation will be the model that's most personalized to you, that has your whole life context, that plugs into everything else you want to do, that's, like, well integrated into your life. But for now, the curve is just so steep that the right thing for us to focus on is just making that base model better and better.
- Harry Stebbings
Arthur at Mistral, I'm so intrigued to hear your thoughts. How do you think about the focus on just model improvement, where value accrues, and ultimately whether the foundation models commoditize in themselves?
- Arthur Mensch
It's two opposing directions. So the first is that the models are getting better and better. So it means that creating a verticalized application, as long as you have the data for it and a good understanding of the use case you're facing, is going to be easier and easier if you have access to the tools that facilitate it. Uh, so that's the first aspect, which would make me think that the application layer is going to grow thinner and thinner. But then there's also the fact that the models are getting cheaper and cheaper, because we manage to compress them, because we make a lot of improvements on their efficiency. And so that means that, effectively, the competitive pressure there is on the model layer means that the price around the model, the dollar per intelligence unit, let's say, is definitely going to reduce. So there's these two aspects of growing ability and compressed price, which on one side says that the application layer is going to grow thin, and on the other side says that the model part is going to grow thin. So for us, the approach that we are taking is that the model part is still going to be big enough and that we need to build this platform on top of that, uh, because that's where we are going to enable all of the vertical applications that will be interesting for humanity.
- Harry Stebbings
Tom Hulme at GV, I'd love to hear your thoughts. How do you think about that, the commoditization of the models, and whether there's money to be made investing in the foundation models themselves or actually in the application layer on top of them?
- Tom Hulme
My first observation would be that the technology is commoditizing incredibly quickly, which worries me a lot. When we talked the other day, I likened it to investing a few hundred million into a power station. That's the training time, and then you can turn it on and you've got inference coming out the side; that's your power. Now, the problem is this is an industry where it's gonna take you a few months to build your power station, and everyone else is building similar power stations next door with relatively little edge. They're all using the same GPUs; there are marginal improvements, but you've basically got to depreciate that asset in these foundation models over a few months. I just can't see it happening. And now we've got Meta coming into the market. I mean, Zuckerberg's done an amazing job. He'll have 350,000 H100s by the end of this year, that is 14% of the world's H100s, and he's gonna open-source the result. Llama 3, released last week, is already incredible. He's pledged that he's gonna invest another $100 billion or so. That team has already started to train Llama 4, and they're world class. They're formidable competitors. So to invest now in an asset that you think you're gonna have to depreciate, you know, over the space of weeks or months is very difficult to do. Now, we have made investments in GenAI, but more in infrastructure, more in the application layer, more in the sort of picks and shovels to support it, but we've not thus far invested in foundation models.
- Harry Stebbings
Is there money to be made investing in foundation models? When you look at the quantum of capital that is required to go in, there's obviously rumors of Mistral's new funding round. You see the amount of cash that's gone into OpenAI and everyone else, there's a dilution inherent within that. It's just gonna be monstrous. Is there money to be made investing in foundation models, do you think?
- Tom Hulme
There definitely has been because if you were to invest in OpenAI in the $10 billion round, there's liquidity in the market. You could sell that for a five X now and you could have done that over a year. So if you've got a momentum strategy and you believe that firm that you're investing in is gonna be at the front of the pack and continue to be, I suspect there's money to be made. But if you're investing in fundamentals, it's very difficult to invest in something that actually is gonna commoditize that quickly. In fact, I'd say the best teacher I ever had was Clay Christensen, just unbelievably smart human being. He wrote The Innovator's Solution.
- Harry Stebbings
Oh, yeah.
- Tom Hulme
We all know that. And he talked about sustaining and disruptive innovations. I think one of the frustrations with GenAI, as the technology is commoditizing so quickly, is that it's a sustaining innovation. It's actually gonna get sprinkled across all businesses to lower costs in call centers or to improve the product in personalization. It's not gonna... have a creative destruction effect like the internet did on many industries. And so as an investor, that's frustrating because you want to invest in stuff (laughs) that persists and completely rebuilds industries from scratch. But I can't really see it. I mean, we found some targets and we made quite a few investments, but it's not, for me, the sort of radical shift or opportunity, from an investment perspective, that we perhaps saw with the internet.
- Harry Stebbings
What do you think the end state then is for models? You know, I was with a friend, who will remain nameless 'cause he hates being publicly named anywhere, and he mentioned that, bluntly, cloud providers will be the cash cow business, and your Googles, your Amazons, and your Microsofts will basically buy the foundation model companies, acqui-hire them like Inflection, and then have cash cow businesses in the cloud providers, and then give away the foundation models for free.
- Tom Hulme
Yeah. That's, to date, been my thesis as well. It will look more like a utility, and the cloud providers rationally are saying, "We wanna provide that utility on our compute." And they're gonna charge on that basis. And they already are, whether you're on AWS, GCP, anywhere else.
- Harry Stebbings
Des Traynor at Intercom, I'd love to hear your thoughts. You use OpenAI to power Fin in many respects. Where do you think that the value is going today? How do you think about the commoditization of these models? I'd love to hear your thoughts.
- Des Traynor
Right now, a lot of value is going straight into the infra, like as in we're handing it all at the back door to OpenAI and, and...
- Harry Stebbings
Yeah.
- Des Traynor
We actually torture-test all of the LLMs. It's not yet the case that they're all equal. I'd be wary of Amazon. So, like, I could see Amazon just, like, flat out, like, buying Anthropic and being like, "Let's just make this part of the EC2 cluster." I think Apple will make massive, massive strides forward with AI. I feel like Bard, unfortunately, felt like, "We had to release this because ChatGPT was getting a lot of traction." They need to have that Jay-Z-like "allow me to reintroduce myself" moment. But I think it's a really important attribute to be able to transition from one LLM to another, because if somebody does unlock new power, you'll want to be able to use it very quickly. Winning involves more than just simply being agnostic about your LLM.
- Harry Stebbings
Speaking of being agnostic about your LLM, would you invest in OpenAI at $90 billion then, Des?
- Des Traynor
I don't think I would. And the reason, whilst I think they'll pass 90, the area I'd be wary of is Amazon. Amazon played this game well. So, like, I could see Amazon just flat out buying Anthropic and being like, "Let's just make this part of the EC2 cluster." And that's just a very easy route to market. And I think if OpenAI run out of new vectors of differentiation and the commoditization starts to kick in, even for basic stuff, it'll just become an easier, "Why wouldn't you just use Amazon? It's already in the cloud. It's already virtually private." You know, you can, like, leapfrog a lot of other adoption concerns. So I think they're one risk.
- Harry Stebbings
Tom Hulme, my friend. What about you? Would you invest in OpenAI at a $90 billion valuation?
- Tom Hulme
I would struggle to make that investment today. And it's not because I don't respect the team. My biggest concern at the moment: if I observe the emergence of what Meta is doing, if I look at the arms race of what the cloud providers are investing in, and the sort of Gemini, et cetera, any advantage is pretty ephemeral. And the consumer-facing product, which drives, I don't know, is it 50% of the revenue? Something like that, is not sticky. So to invest in a foundation model, what would I want to be true? I would want to believe that they had some unique approach that made them more defensible. So an obvious one is memory. Like, actually, none of these have cracked memory yet. But if you have a personal assistant, a ChatGPT equivalent, and it remembers, so that it can actually apply probabilities as to what you want going forward, then it's interesting. If it's unique in its ability to take agency, then it might be interesting. There are other orthogonal approaches that might be interesting. But if we're just talking about a foundation model where you're gonna throw huge amounts of data and hundreds of millions of dollars of compute at H100s like everyone else, it's very difficult to see a return on these investments.
- Harry Stebbings
Tom Tunguz, I know you've done some work around the analysis of where value was created in terms of application versus infrastructure layer for the cloud generation. I'd love to hear how you think about this moving forwards in the next few years with foundational models versus application layer, and what the analysis from the last generation could tell us about the next generation.
- Tomasz Tunguz
I ran this analysis. So in Web 2.0, if you take the top three clouds and you look at their market cap, so AWS, GCP, and Azure, it's about a $2.1 trillion market cap just for the cloud businesses. And then if you take the top 100 publicly traded cloud companies, on both the B2C and B2B sides, so Netflix and ServiceNow, they have an equivalent market cap, about $2.1 trillion for both. So one is at the infrastructure layer, one is at the application layer. Market cap is basically equivalent. The difference is, at the infrastructure layer there are three businesses, and at the application layer there are 100. If the analogy holds, as an investor, the odds of success (laughs) are going to be significantly higher at the application layer, because the diversity of needs there is greater.
- Harry Stebbings
Emad, formerly of Stability, what do you think? How do we see the end state for foundational model companies? What does that look like? Are there going to be many, many? Is it going to be a concentrated set of fewer players? How do you think about that?
- Emad Mostaque
I think that there's only going to be five or six foundation model companies in the world in three years, five years. I think it's going to be us, NVIDIA, Google, Microsoft, OpenAI, and, uh, Meta and Apple probably are the ones that train these models.
- Harry Stebbings
Is Anthropic good?
- Emad Mostaque
Anthropic are great. But from a business model perspective, you have Claude on Google API-
- Harry Stebbings
Yeah.
- Emad Mostaque
... and you have PaLM 2. How are they going to keep up with PaLM 2? They can raise billions, but Google spend $20 billion a year on AI. DeepMind's salary budget is $1.2 billion a year. They technically make a billion a year from their internal counter payments with Google as well. But again, Google, how much money do they have? $150 billion to win this.
- Harry Stebbings
Okay. So we've spent a good amount of time now trying to understand whether the model landscape will commoditize and where true value is built in that segment of the market. When we think about the consumer or the application layer, so to speak, the application layer that sits on top of these models, I'd love to start with Sam Altman at OpenAI. Who better, uh, to understand how we think about where value lies in terms of infra versus application layer, and the different strategies to build on top of AI right now?
- Sam Altman
There are two strategies to build on AI right now.
- Brad Lightcap
There's one strategy which is assume the model is not gonna get better, and then you kind of, like, build all these little things on top of it. There's another strategy which is build assuming that OpenAI is going to stay on the same rate of trajectory and the models are going to keep getting better at the same pace. It would seem to me that 95% of the world should be betting on the latter category, but a lot of the startups have been built in the former category. When we just do our fundamental job, we're gonna steamroll you.
- 15:00 – 18:19
- Tom Blomfield
…sort of work and behave and act, and tailor your software to fit into that industry in a way that's extremely deeply embedded. Most people building application-layer stuff in AI say it's 80 to 90% traditional software with 10% AI. It's working in construction, figuring out how Procore works or, you know, how the Salesforce CRM works or some Oracle database. And, like, I don't think OpenAI is going to come and steamroll the construction AI companies, because they're not going to deeply integrate into the processes and software that exist in each of those industries. So, I really believe everyone who works with a computer will have an AI copilot assistant thing in the next two or three years, whether you're, like, an oncologist or a law professor.
- Harry Stebbings
Tom, it's so interesting what you say there about the copilot strategy. It's the strategy that I see as the entry wedge for all startups today when I'm investing. Miles Grimshaw at Thrive, I know you have some thoughts on the copilot strategy. So, hit me: how do you think about the copilot strategy today for startups, and how do you feel about it?
- Miles Grimshaw
I think copilot is an incumbent's strategy. Incumbents own distribution, they own data, they own the UX, and they own a business model that all aligns to a copilot. Copilot as in GitHub Copilot, right? Like, inline suggestions. Go to any Microsoft product right now. Every Microsoft product now has a copilot experience. They just have a sidebar, an autofill, things like that, right? Where the core product is there and the copilot is a layer on top of it, right? It's immediately added in, which is also totally an incumbent strategy. And it's still about sort of supercharging that worker, but still where every user has a seat and every user is doing most of the work. And, you know, if you think about the evolution here, most of the models that have rolled out might not be good enough for some of this yet, right? But that's what will come around the corner. You know, if you think back to Salesforce disrupting Siebel, Salesforce launched, like, five years after Netscape launched. Like, it might take a moment for that to happen. But the copilot, this idea of "I'm still the pilot, I'm still the user controlling everything, and it's sort of, like, giving me assistive suggestions like GitHub Copilot," fits into the UX of incumbents. It fits into the business model of incumbents, and they already control all that distribution. The opportunity offered up to a startup, being a copilot for something else, like, probably won't be that amazing. And there might be pockets where it can really work, but the opportunity to disrupt is to be orthogonal to the incumbents.
- Harry Stebbings
It's so interesting to hear you say about kind of being orthogonal to the incumbents. My next question from that is, well, how does that then change the business model and the pricing model that SaaS providers use when thinking about selling to their customers? And Sarah Tavel, I'd love to hear your thoughts on this because I know you have a different take on this that I love. So, how do you think about that?
- Sarah Tavel
What AI enables is actually a very different unit of work that you sell, which is doing the work. And so you're almost a software company that looks like a services business, one that is able to sell, like, the full work product, the outcome, as opposed to selling software that an employee has to learn to use and then gets a productivity boost from. And this is very disruptive to incumbents, because incumbents are used to thinking about selling per seat and pricing per seat based on the cost of the headcount. But if instead you're selling something that doesn't require a seat, that is, like, a very disruptive opportunity for startups.
Episode duration: 18:19
Transcript of episode vwaWFU47ZNw