Y Combinator: Why the AI Bubble Misses Where Startups Actually Win
With models swapping in and out, the startup edge lies in the application layer; vibe coding and infrastructure bets are the actual durable advantages.
60 min read · 12,246 words
- 0:00 – 0:50
Intro
- Michael Seibel
I think perhaps the thing that most surprised me is the extent to which the AI economy stabilized. We have the model layer companies, the application layer companies, and the infrastructure layer companies. It seems like everyone is going to make a lot of money, and there's a relative playbook for how to build an AI-native company on top of the models.
- Harj Taggar
Many episodes ago, we talked about how it felt easier than ever to pivot and find a startup idea, 'cause if you could just survive, just wait a few months, there was likely going to be some big announcement that would make a whole new set of ideas possible. So finding ideas is sort of returning to normal levels of difficulty.
- Garry Tan
Welcome back to another episode of The Light Cone. Today, we're talking about the most surprising things that we saw this year in 2025.
- 0:50 – 5:36
Anthropic models are most preferred in the latest YC batch
- Garry Tan
Diana, you found a pretty crazy one. It's almost a changing of the guard in who the preferred LLM is during the YC batch.
- Diana Hu
Yes. We just wrapped up the Winter '26 selection cycle, and one of the questions we ask all the founders who apply to YC is, "What is your tech stack and model of choice?" For the longest time, OpenAI was the clear winner — all of last year, the last couple of batches — though that number has been coming down. And shockingly, in this batch, the number one API is actually Anthropic, which came out a bit ahead of OpenAI-
- Garry Tan
Yeah.
- Diana Hu
... which, who would have thought? When we started this podcast series, OpenAI was at 90-plus percent, and now it's Anthropic. Who would have thought?
- Garry Tan
Yeah. They'd been hovering around 20 to 25% for most of 2024 and early 2025, and only in the last three to six months did this changing of the guard actually happen.
- Diana Hu
They had that hockey stick, with growth to over 52%.
- Garry Tan
Why do you think that is?
- Diana Hu
I think there are a couple of things behind the tech stack selection. As we've seen this year, there have been a lot of wins in vibe coding tools and coding agents. There are so many categories here that this ended up being a bigger problem space that's creating a lot of value. And it turns out the models that perform best at it are the ones from Anthropic. That's not by accident. From the conversation we had with Tom Brown not too long ago, when he came and spoke: coding was one of their internal evals. They made it their North Star on purpose, and you can see it in the models' taste. As a result, the best choice of model for a lot of founders building products is Anthropic.
- Michael Seibel
The vast majority of the use cases people are using it for, though, is not coding. So I wonder if there's a bleed-through effect, where people are using Claude for their personal coding, and as a result they're more likely to choose it for their application, even if their application isn't doing coding at all.
- Garry Tan
'Cause you'd be very familiar-
- Michael Seibel
Yeah.
- Garry Tan
... with the personality of Claude Opus-
- Michael Seibel
Yeah.
- Garry Tan
... or whatever they're choosing.
- Michael Seibel
Yeah.
- Garry Tan
Sonnet, I suppose.
- Michael Seibel
How about Gemini? How's Gemini doing in those rankings?
- Diana Hu
Gemini has also been climbing up pretty high. Last year it was probably single-digit percent, maybe 2 or 3%, and now for Winter '26 it's about 23%. We've personally been using a lot of Gemini 3.0 too, and we've been impressed with the quality. It's really working.
- Michael Seibel
I mean, they all have different personalities, don't they?
- Diana Hu
That too. (laughs)
- Michael Seibel
Yeah.
- Diana Hu
It's kind of the classic: OpenAI sort of has the black cat energy.
- Michael Seibel
(laughs)
- Diana Hu
And Anthropic is more the happy-go-lucky, very helpful golden retriever. At least that's what I feel when I talk to them.
- Michael Seibel
Mm-hmm. And how about Gemini?
- Diana Hu
It's kind of in between.
- Michael Seibel
Harj, you prefer Gemini, actually.
- Harj Taggar
Yeah. I switched to Gemini this year as my go-to model — I think even before 2.5 Pro came out — and it just seemed better at reasoning for me. Increasingly, I replaced my Google searches with Gemini, and I trusted Google's grounding API and its ability to use the Google index to give you real-time information correctly. Personally, I found it better than all the other tools for that, and better than Perplexity too. Perplexity would be fast but not always accurate. Gemini was not quite as fast as Perplexity, but it was always pretty accurate if I asked about something that happened today, for example.
- Garry Tan
Even if you use Gemini as the reasoning engine in Perplexity?
- Harj Taggar
I have not done that. (laughs)
- 5:36 – 7:01
Why aren’t there more AI consumer apps?
- Harj Taggar
Something I'm still surprised about is why there just aren't more consumer apps around all the various things we do. One of the big changes for me this year is the amount of prompting and context engineering I do for my life. We bought a house recently, and I just had a really long-running ChatGPT conversation, stuffing it full of context — every inspection report — wanting it to level the playing field between me and the realtor so I could understand all the dynamics going on. And it just feels like there should be an app for that.
- Garry Tan
But simultaneously, I'm sure you took the PDFs and just dropped them into Gemini and said, "Summarize and tell me what's important for me."
- Harj Taggar
I still don't trust the models enough to be accurate without lots of prompting, and it's a high-value transaction, so you don't want to get incorrect data out of it. So I still feel like you need to put in the work, and it feels like there should be apps that just do all the work for you.
- Garry Tan
Did you see Karpathy release sort of an LLM arena? Which I do by hand right now using tabs: you have Claude open, you have Gemini open, you have ChatGPT open, and you give each the same task. Then you take the output from each, and I usually go to Claude at that point and say, "All right, Claude, this is what the other ones said. What do you think?" And they check each other's work. (laughs)
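The tab-juggling arena Garry describes — the same task fanned out to several models, then one model critiquing the rest — can be sketched as a small fan-out-and-adjudicate loop. This is a hypothetical illustration only: the model callables below are stand-ins for real API clients, and the judge is simply whichever model you pick for the cross-check step.

```python
from typing import Callable

# A "model" here is anything that maps a prompt string to an answer string.
# In practice each would wrap a real API client; these are stand-ins.
Model = Callable[[str], str]

def fan_out(models: dict[str, Model], task: str) -> dict[str, str]:
    """Give every open 'tab' the same task; collect answers by model name."""
    return {name: model(task) for name, model in models.items()}

def adjudicate(judge: Model, task: str, answers: dict[str, str]) -> str:
    """The 'this is what the other ones said, what do you think?' step:
    one model reviews everyone's output."""
    listing = "\n".join(f"[{name}] {ans}" for name, ans in answers.items())
    return judge(
        f"Task: {task}\nAnswers from each model:\n{listing}\n"
        "Which is best, and what did the others miss?"
    )

# Toy usage with stub models in place of real clients.
tabs: dict[str, Model] = {
    "claude": lambda p: "answer A",
    "gemini": lambda p: "answer B",
    "chatgpt": lambda p: "answer C",
}
answers = fan_out(tabs, "Review this inspection report.")
verdict = adjudicate(lambda p: f"Reviewed {len(answers)} answers.", "Review", answers)
```

The pattern is deliberately symmetric: any of the models can serve as the judge, which is exactly why the manual version works across whatever tabs happen to be open.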
- Diana Hu
I sure think
- 7:01 – 9:08
Swapping models in and out is becoming the norm
- Diana Hu
that the behavior we're doing at the consumer level, startups are doing as well. They're actually arbitraging a lot of the models. I've had conversations with a number of founders who before might have been loyalists to, say, OpenAI's models or Anthropic's — founders running larger companies, Series B-level AI companies. They're abstracting all of that away and building an orchestration layer, so that as each new model release comes out, they can swap models in and out, or use specific models that are better at certain things just for those things. For example, I heard from one startup that uses Gemini 3 to do the context engineering, which they then feed into OpenAI models to execute. They keep swapping as new models come out, and the winner for each category or type of agent work is different. Ultimately they can do this because it's all grounded in evals — and the evals are proprietary to them, because they're a vertical AI agent in a very regulated industry and they have a dataset that works best for them. I think this is the new normal: the model companies are spending all this money making intelligence faster and better, and we can all benefit — just use the best. It's almost like the era of Intel and AMD: a new architecture would come out, and people could just swap them, right?
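The orchestration layer Diana describes — route each pipeline stage to whichever model currently wins that stage's eval, and swap models as new releases ship — can be sketched roughly like this. All names are hypothetical, the model callables are stand-ins for real API clients, and a production version would be driven by the startup's own proprietary evals.

```python
from typing import Callable

Model = Callable[[str], str]

class Orchestrator:
    """Routes each pipeline stage to its current best model, so models
    can be swapped in and out without touching the rest of the app."""

    def __init__(self) -> None:
        self.routes: dict[str, Model] = {}  # stage name -> current model

    def assign(self, stage: str, model: Model) -> None:
        """Swap a model in for a stage (e.g. after it wins your eval)."""
        self.routes[stage] = model

    def run(self, stage: str, prompt: str) -> str:
        return self.routes[stage](prompt)

# Toy usage mirroring the example in the conversation: one model does the
# context engineering, a different one executes on that context.
orch = Orchestrator()
orch.assign("context", lambda p: f"<context for: {p}>")  # "Gemini 3" stand-in
orch.assign("execute", lambda p: f"done({p})")           # "OpenAI" stand-in

ctx = orch.run("context", "file a claim")
result = orch.run("execute", ctx)

# A new release wins the context eval, so it simply gets swapped in:
orch.assign("context", lambda p: f"<better context for: {p}>")
```

The point of the indirection is that nothing downstream names a vendor: swapping a model is one `assign` call, gated on an eval score rather than loyalty.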
- Harj Taggar
Yeah, at the highest level, the angst around where the value is going to accrue — the model companies or the application layer, i.e. the startups — feels like it ebbs and flows in either direction throughout the year. There were moments like the Claude Code launch — amazing launch — where it was like, oh, okay, the model companies are actually going to play at the application layer. But then, to me at least, it was all vibes-based. The Gemini surge, especially over the last few months, feels like it returns us to a world where exactly that happens: the models are all essentially commoditizing each other, and the application layer and the startups are set up to have another fantastic year if that continues.
- 9:08 – 14:37
The big AI bubble question
- Diana Hu
I'm curious what you think, Jared, with a lot of the negative comments on Twitter: is this a bit of a bubble? An AI bubble?
- Harj Taggar
(laughs)
- Michael Seibel
Yeah. When I talk to undergrads, this is a common question I get: "I heard it's a big AI bubble, because there's all this crazy round-tripping going on between NVIDIA and OpenAI-"
- Garry Tan
No, this is great for you.
- Michael Seibel
... "is it all fake?" (laughs)
- Garry Tan
No, this is fantastic, right? People look at the telecom bubble: there were tens, hundreds of billions of dollars just sitting in telecom back in the 90s. Actually, that's why YouTube was able to exist. If you have a whole bunch of extra bandwidth that isn't being used and it's relatively cheap, the cost is low enough for something like YouTube to exist. If there hadn't been a glut of telecom, maybe YouTube still would have happened — it just would have happened later. Isn't that sort of what we're talking about here? We have to accelerate, right? We have the age of intelligence: the rocks can talk, they can think, they can do work, and you just have to zap them more (laughs) and you get smarter and smarter stuff. The argument to college students is actually that because there will be a glut, there is an opportunity for you. If there were no glut, there would be less competition, prices would be higher, and the margins lower in the stack would be higher, right? And what's one of the big stories this year? NVIDIA suddenly is on the outs. I think their stock today is around the 170s or something. I'm still a long-term buy-and-hold, honestly, but for the moment people are saying, "Oh, well, Gemini's so good..." Nobody seems to be NVIDIA-only now, everyone's buying AMD, and TPUs are working. So what does that mean? There's competition, which means there will be more compute, not less, and that probably means slightly better things for all the big LLM companies, the AI labs.
They get a little bit of power, but they too are in competition with one another. So then what does that mean? Go up another level in the stack. As long as there are a great many AI labs in deep competition with one another, that's even better for the college student who's about to start a company at the application level.
- Michael Seibel
Yeah, I think that's exactly right. People are asking, "Is it a bubble?" That's maybe a question that's really relevant if you're the equivalent of Comcast. If you're NVIDIA, that's a very relevant question: "Are people overbuilding GPU capacity?" But the college students aren't Comcast — they're YouTube. If you're doing a startup in your dorm room, it's the AI equivalent of YouTube, and it kind of doesn't matter that much. Maybe NVIDIA's stock will go down next year, I don't know. But even if it does, that doesn't mean it's a bad time to be working on an AI startup.
- Harj Taggar
Yeah, it's what Zuck said on a podcast earlier this year, right? Meta may end up overinvesting a significant amount in CapEx and infrastructure, but the big companies essentially have to do it, because they can't just sit on the sidelines. And in the case that demand falls off a cliff for some reason-
- Diana Hu
Mm-hmm.
- Harj Taggar
... it's their CapEx, not the startups' CapEx. And there will still be tons of infrastructure, and ideas to keep building.
- Diana Hu
There's a book by the economist Carlota Perez, who studied a lot of technology revolutions. She summarizes them as having two phases. There's the installation phase, where a lot of the very heavy CapEx investment comes in, and then the deployment phase, where it really rips and everything explodes in terms of abundance. The installation phase is where it feels like a bubble. There's a bit of a frenzy, because it starts with "there's this new technology that's amazing" — which happened with the ChatGPT moment. Everyone got super excited about the tech, got super hyped, and invested in a lot of the infrastructure: buying a lot of GPUs, the giant gigawatt data center build-out. And then people say, "But what is the demand? What are the applications that are going to be built?" I think right now we're in that transition, which is actually really good news for startup founders, because they're not involved in building the data centers — they're going to build the next generation of applications in the deployment phase, when it really proliferates. Going back to the analogy with the internet era: before 2000, there was a lot of heavy CapEx investment in the telcos. Those were giant projects that college students weren't involved in, and in some cases they were overinvested — there's a whole thing with dark fiber and pipes that never got used — and that's fine. The internet still ended up being a giant economic driver. What that means is that startups like the future Facebook or the future Google are yet to be started, because those come in the deployment phase; right now these things are still getting built out.
I do think the foundation lab companies and GPUs very much fall into the infrastructure bucket.
- Garry Tan
Yeah. I mean, it's interesting to watch how this stuff is evolving a little bit.
- 14:37 – 17:29
Space as the solve for data centers and energy
- Garry Tan
So do you remember, in Summer '24 there was a company called StarCloud that was one of the first to come out and say, "We're going to make data centers in space." And what was the reaction when people saw that?
- Diana Hu
People laughed at them.
- Garry Tan
Yeah.
- Diana Hu
On the internet, yes.
- Garry Tan
Right.
- Diana Hu
They said, "That's the stupidest idea ever."
- Garry Tan
And I guess 18 months later, suddenly Google's doing it-
- Diana Hu
(laughs)
- Garry Tan
... Elon's doing it.
- Harj Taggar
Sunil Mitchell's in every interview now, apparently. (laughs)
- Garry Tan
Is that right?
- Harj Taggar
Yeah. It seems to be his top talking point.
- Garry Tan
Yeah.
- Diana Hu
Yeah.
- Garry Tan
And so why is that? I feel like one aspect is that part of what makes the infrastructure build-out so intense right now is that we literally don't have the power generation.
- Diana Hu
Yeah.
- Garry Tan
Boom Supersonic, instead of making supersonic jets right now, is on this quest to create enough power for a bunch of the AI data centers being built. They use jet engines (laughs), and even those — the supply chain for jet engines to generate power for data centers is so backed up that you would have had to order them two or three years ago just to have them two or three years from now.
- Diana Hu
(laughs)
- Garry Tan
These constraints end up influencing, fairly directly, what the giant tech companies need to do to win the game three or five years out. Suddenly there's not enough land. In America, we can't build; the regulations are too high. In California, we have CEQA, which is totally abused by the environmental lobby to stop all innovation — and building housing, by the way. We just don't have enough terrestrially to do the things society needs right now. So the escape valve is, "Actually, let's just do it in space." (laughs)
- Diana Hu
Yeah, come to think of it, we kind of have the trifecta of YC companies solving the data center build-out problem. We-
- Garry Tan
Well, you need fusion energy, so.
- Diana Hu
Yeah. We have the company solving the no-land problem by building data centers in space. We have Boom and Helion, which are solving the "we don't have enough energy" problem. We actually-
- Garry Tan
I just funded a space fusion company that just graduated, called Zephyr Fusion.
- Diana Hu
Oh yes, that's a cool one.
- Garry Tan
And they had a great seed round out of Demo Day. They're in their 40s — national lab engineers who spent their entire careers building tokamaks and fusion energy. They came into the lab one day, looked at the physics, looked at the math and the models, and said, "You know what? If we did this in space, it would actually pencil." So they're on this grand five-to-ten-year quest to actually manifest it, to create it in space, because the equations say it's possible. And if they do it, it's actually the only path to gigawatts of energy up there in space. So it might be an even more perfect trifecta shortly.
- Harj Taggar
Something
- 17:29 – 21:01
Interest in starting model companies
- Harj Taggar
else I feel like happened over the course of this year is the interest in starting model companies. At one end there are the people who can raise the capital to go and actually build a head-on competitor to OpenAI, of which there are very few — maybe you have Ilya with SSI — but more so, within YC, people trying to build smaller models. I've certainly had more of those in the last few batches than before, whether it's models to run on edge devices or, say, a voice model specific to a particular language. I'm curious to see if that trend continues. Going back to the early era of YC, we saw an explosion of startups being created, especially SaaS startups. Part of what fed that was that knowledge about startups became more dispersed. There wasn't a canonical library of information on the internet about how to start a startup or how to build software, and over a decade that became commonplace, which exploded society's knowledge of startups and how to build things. It feels like maybe we're going through that moment where AI research meets actually building things.
- Garry Tan
With training models.
- Harj Taggar
Yeah.
- Garry Tan
I think we are actually going through that right now, yes-
- Harj Taggar
Like that-
- Garry Tan
... where it's going from being a very rare skill set to a more common one, yeah.
- Harj Taggar
'Cause OpenAI a decade ago was rare. You needed a unique combination of skills, right? You needed the researcher brain, the engineering brain, maybe the finance-and-business brain, or-
- Garry Tan
Wait, wait, wait. Did you just describe Ilya, Greg, and Sam?
- Harj Taggar
You got it. (laughs)
- Garry Tan
(laughs)
- Diana Hu
(laughs)
- Harj Taggar
That was a rare team, right? That configuration of team just wasn't around very much. And now, a decade later, there's a plethora of people who have the research background, the engineering background, the startup capital-raising background — or can at least be taught all of that. I'm curious whether that will mean we see more applied AI companies starting, and maybe there'll be even more models to choose from for all the various specific tasks.
- Diana Hu
I think so. The other thing contributing to this and making the snowball even bigger is RL. There are all these new open source models that people fine-tune on top of, with a particular RL environment and task. So it's very possible to create the best domain-specific model — say, a healthcare model — by taking a generic open source model and fine-tuning it with RL, and it beats the regular big model. I've heard of and seen a number of startups whose domain-specific model beats, say, OpenAI on healthcare. One particular YC startup told me they collected the best dataset for healthcare, and they ended up performing better than OpenAI on a lot of the healthcare benchmarks with only eight billion parameters.
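A claim like "our 8B fine-tune beats the frontier model on the benchmarks" only means something relative to a concrete eval set. Here is a minimal sketch of that comparison, with toy lookup-table stand-ins for the two models and a toy benchmark; a real version would run over the startup's proprietary domain dataset.

```python
from typing import Callable

Model = Callable[[str], str]

def accuracy(model: Model, benchmark: list[tuple[str, str]]) -> float:
    """Fraction of benchmark prompts where the model's answer matches
    the expected answer (exact match, for simplicity)."""
    hits = sum(1 for prompt, expected in benchmark if model(prompt) == expected)
    return hits / len(benchmark)

# Toy benchmark and lookup-table "models" standing in for real ones.
benchmark = [("q1", "a"), ("q2", "b"), ("q3", "c"), ("q4", "d")]
small_finetuned: Model = {"q1": "a", "q2": "b", "q3": "c", "q4": "x"}.get
big_generic: Model = {"q1": "a", "q2": "x", "q3": "c", "q4": "x"}.get

# On this (toy) eval set the domain fine-tune wins: 0.75 vs 0.5.
print(accuracy(small_finetuned, benchmark), accuracy(big_generic, benchmark))
```

The same harness is what grounds the model-swapping behavior discussed earlier: whichever model scores highest on your eval set gets the slot, regardless of its parameter count.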
- Garry Tan
I guess what's funny is, you do need post-training infrastructure. We've also had YC companies that had something that beat OpenAI — GPT-3.5 — by fine-tuning with RL. But then GPT-4.5 and then 5.1 came out and basically blew their fine-tuning out of the water. (laughs)
- Diana Hu
Have to keep going, yeah.
- Garry Tan
Yeah, you've got to keep going — you actually have to keep getting to the edge. Anything else from this past year that jumps out to you?
- 21:01 – 22:23
Vibe coding became a big category
- Diana Hu
It's funny — we started the year with one of our episodes that got a lot of views, around vibe coding. We were talking about it more as observing a behavior happening among our founders, and I was surprised to see it become a giant category. There are lots of companies winning: we have Replit, there's Emergence, there's a bunch of them.
- Garry Tan
Varun Mohan had gone over to Google, and he released Antigravity. Did you guys see the video? I'm actually sort of curious whether they used NanoBanana or one of these video-gen things, 'cause it's a little too perfect. Google has the budget for a high-production-value video, but it's Varun at the keyboard with Sergey right behind him. I was like, "This is very cinematic." Anyway, I think Sundar was not only talking about space-
- Harj Taggar
Data centers in space.
- Garry Tan
... data centers. He was also talking about vibe coding. And I know that was a little bit of trolling, but knowing what we know — yes, vibe coding is not completely usable and trustable for 100% of your coding, period. It is not true that you can ship 100% solid production code today, as of the end of 2025.
- 22:23 – 23:38
AI economy has stabilized
- Garry Tan
Yeah, I was thinking about things that surprised me in 2025, and perhaps the thing that most surprised me is the extent to which the AI economy stabilized. When we did this episode at the end of 2024, it felt like we were still in the middle of a period of incredibly rapid change, where the ground was shifting under our feet and nobody knew-
- Harj Taggar
Mm-hmm.
- Garry Tan
... when the other shoe might drop, or what exactly was going to happen with startups in AI and the economy. Now I feel like we've settled into a fairly stable AI economy, where we have the model layer companies, the application layer companies, and the infrastructure layer companies. It seems like everyone is going to make a lot of money, and there's a relative playbook for how to build an AI-native company on top of the models. Things really matured in that way — much better.
- Harj Taggar
Which feels downstream of the models themselves having incrementally improved this year without major steps forward that shook everything up — and that has a knock-on effect. Many episodes ago, we talked about how it felt easier than ever to pivot and find a startup idea, because if you could just survive, just wait a few months, there was likely going to be some big announcement that would make a whole new set of ideas possible and create more opportunities to build things. It certainly feels like that has slowed down, and so finding ideas is returning to normal levels of difficulty, in my experience in office hours.
- Garry Tan
I agree.
- 23:38 – 25:49
The AI 2027 piece
- Garry Tan
I'll tell you what's not a surprise. (laughs) Do you remember that report, AI 2027, the doomer piece that said-
- Diana Hu
Mm-hmm.
- Garry Tan
... "society is going to start falling apart in 2027"? At some point they quietly revised it to say it wasn't 2027, but they kept the title. Maybe it's not a surprise — I was always a bit of a skeptic of the fast-takeoff argument, because even with the scaling law, it's log-linear, so it's slower: it requires 10x more compute each step, and it's still topping out, right? That's one form of good news. Another form — it's weird to call this good news — is that human beings don't like change. (laughs) In our previous episode, we blew up that MIT report that said that 98%, or 90%, of enterprise AI projects fail. Well, it turns out that 90% of enterprises don't know how to do IT, let alone AI. (laughs) It's weird to say that's a good thing, but in the context of fast takeoff, it's a real brake on this new, really insane technology permeating society. I love to accelerate, but it's weird to say, "Actually, in this case, maybe that's a good thing," right? It is a shockingly powerful technology, but between log-linear scaling and human beings really not liking change, organizationally speaking, society will absorb this technology. Everyone will have enough time to process it. Culture will catch up. Governments will be able to respond to it — not in a frantic, SB-1047, "let's stop all the compute past 10 to the 26th" sort of way, with knee-jerk responses to technology. We're excited that the ARC AGI Prize is going to come in and do the Winter '26 batch as a nonprofit. The funny thing about that is, yeah, maybe there's a team right now climbing the ARC AGI leaderboard, and they're going to accelerate this thing again.
- 25:49 – 29:02
Founders still need to hire teams
- Harj Taggar
Something that surprised me, as it relates to that, with the startups: I remember around this time last year, we were talking about how companies were getting to a million dollars in ARR and raising Series As without hiring — in some cases not hiring anyone, just the founders, or maybe hiring one person — which felt very unusual. A year on, that hasn't translated into them then hitting $10 million ARR and scaling without adding any more people.
- Michael Seibel
No, they turned around and started h-
- Harj Taggar
(laughs)
- Michael Seibel
... and started hiring, like, actual teams. (laughs)
- Harj Taggar
Yeah, post-Series A (laughs), it largely feels like the playbook is the same. The companies might be smaller for the same amount of revenue, but it feels like that's entirely because they hit the revenue so fast and they're still bottlenecked on how long it takes to hire people, not because they have demand for fewer people.
- Garry Tan
I still think there is some effect, but it's not open and shut. It's not like you don't have to hire executives anymore. There might be a case of two foie gras startups, one being Harvey and the other OpenEvidence, right? With Harvey, the founders are incredible — they were very early — and then there's this idea that for VCs, you could just go down Sand Hill Road and the fix is in: you block out the maybe 30 people who could write checks of $10 to $100 million, and if you get all of their money, there's no one left who can come in and do the next Series A, and basically you're safe. Capital as a bludgeon is capital as a moat, in that case. So Harvey is interesting, because Ligora is coming fast for them — and obviously we have some skin in the game on Ligora — but we think they have as good a shot as any-
- Michael Seibel
I guess that's one trend we saw in 2025: there was a first wave of AI-native companies, like Harvey.
- Garry Tan
Who might have wasted a lot of money on fine-tuning, actually.
- Michael Seibel
(laughs) Totally. Companies that broke out really big in 2023 and kind of did a victory lap — "Oh, we've won the space" — and now we're seeing a second wave of companies like Ligora and Giga, and it turns out it isn't so simple.
- Garry Tan
Yeah, the weird thing about burning some non-trivial double-digit percentage of your capital stack on fine-tuning that buys you no advantage is that the investors are basically the only winners (laughs), 'cause they just own more of your company, you know?
- Harj Taggar
Yeah. So at least as it relates to hiring and team size, there are two camps: one says AI is going to make everything more efficient, so you'll need fewer people; the other says AI is going to reduce the cost and time to produce things, so the expectations of your users and customers will go up, and you'll need to keep hiring more people to satisfy those growing expectations. This year has been more in that second camp, and I think that's what's driving companies to still hire as many people as they did pre-AI — the bar for what their customers expect keeps rising. And they're all racing: Ligora's racing with Harvey, Giga's racing with Sierra. They're all still competing for the same set of customers, and they're still ultimately bottlenecked on people. I don't think anyone's bottlenecked on ideas; they're bottlenecked on people who can execute really well. I think that's still exciting — it feels like an exciting phase.
- Garry Tan
I agree
- 29:02 – 30:22
Outro
- Garry Tan
with you that the era of the one person running a trillion-dollar company is not here.
- Harj Taggar
Not yet.
- Garry Tan
Yeah. But I think it's going to trend that way eventually. That'll be a wild time. Maybe that's a prediction for-
- Narrator
For next year?
- Garry Tan
... 2026. Yeah.
- Harj Taggar
Do you think it's coming in 2026?
- Garry Tan
I mean, I don't think it'll happen in 2026 either, honestly. But I think you will have many stories of companies run by under 100 people that are making hundreds of millions of dollars.
- Harj Taggar
Yeah.
- Garry Tan
So, I mean, Gamma was interesting to see. One of the biggest things they said in their launch — and I think it's a very good trend — is that they got to $100 million in ARR with only 50 employees.
- Harj Taggar
Yeah.
- Garry Tan
Which is such an inversion, right? Normally you have the big banner image and it's, "Oh yeah, we raised all this money, and look at all the people who work for us." (laughs) It's a good trend to have the reverse flex, which is, "Look at all this revenue, and look how few people work for us." Well, that's all we have time for this time. We just wanted to wish you a really happy holidays and a happy New Year, from all of us to you and yours. See you next time.
Episode duration: 30:22
Transcript of episode cqrJzG03ENE