Y Combinator 2024: The Year the GPT Wrapper Myth Proved Wrong
EVERY SPOKEN WORD
35 min read · 7,307 words
- 0:00 – 1:00
Coming up
- GTGarry Tan
The wildest thing right now is you can start a company that can make tens of millions of dollars, literally in 24 months, and, uh, you can do it for potentially, you know, $2 million, $5 million.
- HTHarj Taggar
A year ago, I remember many of the startups in the batch would get sort of enterprise proof of concepts, or pilots in particular, and there was a lot of cynicism around whether any of those pilots would translate into real revenue. Fast-forward a year, I think we have all firsthand experience that these pilots have turned into, like, real revenue.
- GTGarry Tan
It's still early days, honestly. Like, it... you know, we sort of breathe a sigh of relief right now in 2024, but it's anyone's game, honestly. Like, these things are moving so quickly. Welcome back to another episode of The Light Cone. I'm Garry. This is Jared, Harj, and Diana. And collectively, we funded companies worth hundreds of billions of dollars right at the beginning. So,
- 1:00 – 13:55
What made 2024 great for startups
- GTGarry Tan
2024, what a year. How are you feeling about this, Harj?
- HTHarj Taggar
Pretty great. I think this is the year that everything broke in favor of startups. What I've been thinking about a lot recently is when ChatGPT launched, two years ago now, the immediate consensus view was all of the value would go to OpenAI. And very specifically, do you all remember when they announced the GPT or the ChatGPT store?
- GTGarry Tan
Yeah.
- HTHarj Taggar
Like, the... I remember the consensus was everything that was built on top of ChatGPT was a GPT wrapper, and the App Store was just going to be released and crush every single person trying to build an AI application, and OpenAI would be a ginormous company, but there'd be no opportunity for startups. It sounds kind of ridiculous to say that now because, um...
- GTGarry Tan
(laughs) Who even remembers the ChatGPT store?
- HTHarj Taggar
Yeah, exactly.
- GTGarry Tan
(laughs)
- HTHarj Taggar
The ChatGPT store itself was a nothing burger, but, like, more importantly, what are the big AI applications today? Like, I'd say, outside of ChatGPT itself, the breakout consumer application is Perplexity. The breakout enterprise application is probably Glean, maybe. Um, in legal tech, you have Casetext, you have Harvey. Prosumer, you have PhotoRoom. The point being, there are many, many applications that have been built not by OpenAI. It's been a great time to build startups.
- GTGarry Tan
Yeah. The wildest thing right now is you can start a company that can make tens of millions of dollars, uh, literally in 24 months from zero, and, uh, you can do it for potentially, you know, $2 million, $5 million. That's sort of the story of one of these companies, Opus Clip, which never had to raise a real series A, and that's something that we sort of see across the YC community as well.
- HTHarj Taggar
Yeah. I think that's, um, a particularly important point, that you can do it as a startup without raising tons of capital. Because post the GPT store launch, I then remember, uh, Anthropic and Claude emerged, and the consensus view for a while was that all of the value was going to go to one of these foundation model companies, and that the only way you could compete in AI was to raise huge amounts of money, um, either because you've got venture capital, or you're Amazon or Facebook or Google, um, with tons of cash already. But if you weren't one of the big foundation models, there would be no value, and the applications built on top of these things would either be built by the foundation model companies themselves or just not be that valuable. Again, something that turned out to be completely not true, right? And in particular, what drove that is open source. Like, the weird series of events where Meta... (laughs) Like, the weights being leaked and Meta just, like, rolling with it.
- DHDiana Hu
On torrent.
- HTHarj Taggar
Yeah, right. (laughs)
- DHDiana Hu
That forced Meta's hand to launch, uh, LLaMA, which is funny. And people thought, "Oh, it's just this cool open source model," but it was 18 months behind OpenAI, and people started doing a lot of derivative work out of it, like Vicuna and all these other animals related to llamas that came out-
- HTHarj Taggar
(laughs)
- DHDiana Hu
... and Ollama is one of the companies at YC as well-
- GTGarry Tan
Yep.
- DHDiana Hu
... that enabled people to do, uh, local Docker-style development, with models running on device. It was pretty cool, but people didn't think that, um, they were going to be able to catch up. And the thing that changed from 2023 to 2024 is that during the summer, there was a turning point. It was the first time that the top foundation model across all the rankings and benchmarks was LLaMA, and that was a shock to the community.
- GTGarry Tan
Yeah, so it turns out choice, uh, matters, (laughs) and choice means-
- DHDiana Hu
(laughs)
- GTGarry Tan
... that it's not as much about the model. I think the model still matters quite a lot, um, but once you have choice in model, it means you can't have this sort of idea of monopoly pricing. And you have that model, your competitor also has that model, but all the other things seem to end up mattering a lot more, which is product, your ability to sell, your ability to actually adjust to user feedback, your ability to get to zero churn. All of those suddenly become far more important than capturing a light cone of all future value (laughs) through the model.
- HTHarj Taggar
Right. A very specific way I've felt this is I remember a year ago, working with startups in the batch that were essentially building model routers, just, like, an API to call a specific model. And a lot of the motivation for that at the time was, uh, reducing cost. It was like, oh, you don't want to just burn up all of your ChatGPT calls. You want to spread them out across various different models. And the argument against that was just, oh, the cost of all this stuff is going down to zero anyway. Like, there's no value to be had in being a model router, and no one wants to build their applications with a model router. They're all just gonna call whatever's the best model. I think fast-forward a year, that's totally not true. From what I can tell, the model router was actually a really great entry point into building sort of a new stack for building LLM-powered apps, and most of the applications we're seeing, I think they just don't want to be beholden to a specific model. Does that map with what you've seen?
- DHDiana Hu
Yeah, actually... One of the things we've seen now in the fall batch that just presented at demo day, which was one of the trends that shifted since the summer '24 and winter '24 batches, was precisely what you're saying. Companies started to use multiple models within their applications: the best one for speed at some point, because sometimes you need to parse a lot of the input very quickly and it's fine if it's a bit more lossy, and then you need the bigger model to handle the more complex task. So, a lot of companies in fall '24 actually have this multiple-model architecture to use the best model for each task, which is similar to the concept of the model router, but the idea evolved. Instead of being more of a routing, it was more of an orchestration. I think a concrete example we gave a couple episodes ago was, uh, Camphor.
- GTGarry Tan
Yep.
- DHDiana Hu
It was a company you work with. They use the fastest model for parsing their PDFs, and for the more complex ones they use o1, and that's how it's done. And another company that's doing, uh, fraud detection has this concept of, like, a junior risk analyst, where they just use a fast and easy GPT-4o mini, and then they use the bigger one, like o1. Or the other example is, um, I think Cursor talks about it in their episode with, uh, Lex Fridman. They also have this complex architecture with multiple models, and the reason it works well is they use one very specific model for predicting where you're going to type next, and one for understanding the whole code base. So, very different tasks. So, that's definitely happening now.
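The multi-model setup described here can be thought of as a tiny orchestration layer: bulk, latency-sensitive work goes to a small, cheap model, and complex reasoning is reserved for a large one. A minimal sketch in Python, where the model names and the `call_model` stub are hypothetical stand-ins, not any company's actual stack:

```python
# Sketch of task-based model orchestration: pick a model tier per task
# rather than routing every request to the single "best" model.
from dataclasses import dataclass


@dataclass
class Task:
    kind: str    # e.g. "parse" (bulk, speed-sensitive) or "reason" (complex)
    prompt: str


# Hypothetical tier table mapping task kind -> model.
MODEL_FOR_TASK = {
    "parse": "small-fast-model",    # lossy but cheap; fine for bulk parsing
    "reason": "large-slow-model",   # expensive; reserved for hard steps
}


def call_model(model: str, prompt: str) -> str:
    # Stub standing in for a real LLM API call.
    return f"[{model}] {prompt}"


def orchestrate(task: Task) -> str:
    # Unknown task kinds fall back to the large model to stay safe.
    model = MODEL_FOR_TASK.get(task.kind, "large-slow-model")
    return call_model(model, task.prompt)
```

In a real app, `call_model` would wrap an actual API client, and the tier table is where the cost/quality trade-off lives.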
- GTGarry Tan
Yeah, the other thing that, uh, popped up for fall batch, there's a company I'm working with, um, called Variant, and, uh, what they're trying to do is take basically state-of-the-art open source LLM models that can do code gen, and then teach them aesthetics.
- DHDiana Hu
Huh.
- GTGarry Tan
So, uh, starting with icon generation. They built this huge sort of post-training workflow, so that as the open source models get smarter and better at code gen broadly, they can just take the next version, apply their post-training architecture and data set, and basically teach a given model aesthetics. So, what a certain thing is supposed to look like, and not in a diffusion sort of way, but actually at the SVG level, and we think SVG will actually translate into all kinds of aesthetics. So, it's an interesting approach, and post-training is a whole coherent way to sort of skip the whole idea that all of the value is accruing into the model, especially because of open source, to your point.
- HTHarj Taggar
The other thing I've been having flashbacks to is, a year ago, I remember many of the startups in the batch would get sort of enterprise proofs of concept, or pilots in particular, and there was a lot of cynicism around whether any of those pilots would translate into real revenue. Um, lots of parallels to crypto, and how anytime there's some new interesting technology, blockchain more specifically than crypto, enterprises always want to run pilots and POCs, because it's someone's job to check off, yeah, we did the hot new technology thing.
- GTGarry Tan
The chief innovation officer must have his due.
- HTHarj Taggar
(laughs) We've spoken about this in one of our episodes, I think. And fast-forward a year, I think we have all firsthand experience that these pilots have turned into real revenue, and if anything, the startups in the YC batch now are selling into real enterprises faster than they have before, and are ramping up revenue and reaching milestones like $1 million ARR faster than I've certainly ever seen.
- 13:55 – 15:37
Tech and gov’t intersecting more
- JFJared Friedman
We sort of dodged a bullet there with, uh, SB 1047, and, uh, it looks like some of the Biden EO, uh, is not that likely to survive the Trump White House. TBD what that means in the longer term, but certainly one of the things that we were very worried about was that a certain amount of math-
- DHDiana Hu
(laughs)
- JFJared Friedman
... beyond a certain level (laughs) would suddenly become illegal or require registration at your local office.
- HTHarj Taggar
(laughs)
- JFJared Friedman
(laughs)
- HTHarj Taggar
It's certainly been a weird time to be in tech, because I've never experienced software and technology intersecting with politics so much, and in particular, I'm not used to genuinely caring about national politics affecting startups in a YC batch, or just, you know, companies that are less than a year old. But for a moment it really was worrying. It wasn't clear whether the startups would actually be able to build innovative AI applications versus suffering from regulatory capture by OpenAI and a few big players. We're obviously very glad it broke (laughs) in favor of startups.
- GTGarry Tan
(laughs) Seems like we're still in the early game, right? I mean, it's very easy to see that, um, the platforms themselves really could come to resemble, you know, the Win32 monopoly, right? Windows, uh, has access to the APIs. They, in fact, know all the stats about what's working on their platforms, and guess what? They can build it into their platform. You know, we sort of breathe a sigh of relief right now in 2024, but, um, you know, it's anyone's game, honestly. Like, these things are moving so quickly. I wouldn't totally breathe your last sigh of relief yet.
- HTHarj Taggar
(laughs)
- GTGarry Tan
You know? It's-
- HTHarj Taggar
Yeah, that's fair.
- GTGarry Tan
We, we got to keep working on this.
- HTHarj Taggar
Okay, so it's clearly been
- 15:37 – 20:48
Who else in tech had W’s in 2024?
- HTHarj Taggar
a great year for startups. Um, what else has been happening? Who else do we think it's been a great year for? I mean, there's certainly been some big funding rounds, right? Like, OpenAI unsurprisingly has raised huge amounts of capital.
- DHDiana Hu
Scale.
- HTHarj Taggar
Yeah, even within YC though we've seen, like, Scale AI-
- JFJared Friedman
Yeah.
- HTHarj Taggar
... has really broken out this year.
- JFJared Friedman
$6 billion for OpenAI, $1 billion for, uh, Scale, $1 billion for SSI, the new Ilya Sutskever startup.
- HTHarj Taggar
Scale I think is just worth talking about because it's such a classic startup story. I mean, you were there in the early days, right? You interviewed them for YC.
- JFJared Friedman
Yeah.
- HTHarj Taggar
Um, tell us what the idea was that they interviewed with, and how they ended up landing on what, you know, is probably one of the best startup ideas of the last 10 years.
- JFJared Friedman
The fun thing about the Scale AI story is it is the sort of epitome of the classic YC startup story. You know, there are other kinds of startups that get started, like SSI, for example. That's not a typical YC startup story, where some very well-established people raise $1 billion with, like, a PowerPoint pitch. But Scale AI is the classic story of how young programmers can just gradually build a $10 billion company over time by being smarter and harder working than anybody else. And so yeah, when Alex, uh, interviewed at YC, he wasn't working on anything related to AI. It was a completely different idea, and the idea for Scale AI kind of got pulled out of him by the market. And it actually still took several pivots, 'cause the original idea at YC didn't have anything to do with AI, and then for a long time he was basically doing data labeling for the self-driving car companies.
- HTHarj Taggar
Uh, they applied, as I remember, with, like, a healthcare-related idea before-
- JFJared Friedman
Yeah, it was a website for booking doctor's appointments.
- HTHarj Taggar
Okay. (laughs)
- DHDiana Hu
(laughs)
- HTHarj Taggar
Yeah, cool.
- JFJared Friedman
(laughs)
- HTHarj Taggar
And then they pivoted during their batch.
- JFJared Friedman
Yeah.
- HTHarj Taggar
Um, do you remember how they came up with the data labeling idea? 'Cause this must have been, what, 2016?
- JFJared Friedman
Yeah. The way they came up with the data labeling idea was that Alex had worked at Quora, and Quora had to do some data labeling for, like, moderation and stuff. At the time, the big data labeling service was Amazon Mechanical Turk, and they were deemed unbeatable, because they were run by Amazon, and Amazon could throw infinite money at it, and it was already at quite large scale. But Alex had a unique insight, which is he had actually used Mechanical Turk at Quora, and he knew that it kind of sucked to actually use. And so he had this sort of unique insight on the world, and he just tried to build a better Mechanical Turk, basically the version he would have wanted when he was at Quora.
- HTHarj Taggar
And as I remember it, their early traction came almost entirely from one customer, Cruise, right?
- JFJared Friedman
Cruise, yes, which needed to do tons of data labeling on all the images that the cars were taking as they were driving around San Francisco. You've got to, like, draw a circle around the traffic light, and things like that. And the cool thing about Scale is that they've actually caught two waves. They, you know, accidentally caught the first wave of all the self-driving car companies, because when ML took off at that time in computer vision, there was just an unprecedented demand for labeled data for training sets that just hadn't existed before, and so they were able to ride that wave. And then as that wave was cresting, LLMs got big, and all of these companies needed to do RLHF at very large scale. And Scale was just perfectly positioned to move into that business as well.
- HTHarj Taggar
Yeah. I think the Scale story is just so interesting, 'cause even pre-LLM, it was clearly a multi-billion dollar business anyway.
- JFJared Friedman
Yeah.
- HTHarj Taggar
And then it caught the LLM wave, which has now propelled it into probably, it's going to be, like, a $100 billion plus company. And I'm seeing that at the ground level too, where many companies that finished the batch, or even went into the batch, without an idea have pivoted into an AI idea that's taking off. Like, I'm just seeing much more success from founders who waited it out and found an idea that they just couldn't find before. I have a company from a year ago that pivoted their whole batch; they couldn't find a great idea. It actually took them six months after the batch until they realized one of their parents ran a dentist office, so one of them just decided to go hang out at the office to see if there was anything he could automate. Uh, and they just ended up building an AI back office for dentist offices.
- JFJared Friedman
Love it.
- HTHarj Taggar
And now its week-over-week growth is fantastic. It's doing really, really well, and I'm seeing lots of cases like that spring up.
- DHDiana Hu
Definitely seeing that as well. I think there's something about the advantage of having all these very hardcore young technical founders who are willing to just bet the farm and go all in on just a little bit of a glimmer of, "Oh, this is where the future's going to be. Let me just try it," and then it actually ends up working. Like your story with the dentist, I have a lot of teams that pivoted as well into different spaces where they found those glimmers, like, "Oh, computer use came out," and I have a couple companies that are betting and going in that direction, and it's working well. I mean, it's still early. This is just the fall batch, but that's cool too.
- HTHarj Taggar
Okay, so what are some of the trends that we've seen, or some of the specific trends and waves that startups have been riding coming out of the batches? Voice AI is something we've
- 20:48 – 23:17
Voice AI has a lot of potential applications
- HTHarj Taggar
talked about. It's clearly, um, maybe the most promising vertical for AI right now in terms of just raw traction.
- GTGarry Tan
Do you think, uh, voice is winner-take-all, or will it be something that has sort of 100 different verticals that are very tailored to those specific verticals? That's literally one of the questions I get from some of our, uh, voice AI startups themselves. They're like, "Should I be going horizontal, or should I just continue to grow within my vertical?"
- HTHarj Taggar
It feels to me like voice itself, um, just like AI, touches everything, and there are so many different applications for it. There are probably infinite applications to build where voice is the interesting element. I mean, things that just spring to mind, like language learning applications. I'm sure there's not going to be just one really cool voice-AI-powered language learning application.
- JFJared Friedman
Yeah.
- HTHarj Taggar
There are probably going to be multiple of them. Remote work, like, um, teleconferencing, is probably a whole other area where there are interesting things to do with voice AI.
- DHDiana Hu
And even within customer support, we highlighted a number of companies last time, um, like PowerHelp and Kappa.ai.
- JFJared Friedman
Yeah.
- DHDiana Hu
There-
- JFJared Friedman
It turns out that customer support is not really one vertical. There are, like, many different flavors of customer support, and they're very different on the inside once you get into the details.
- DHDiana Hu
Because I think there are very specific types of workflows you need to do per industry, and that's the point of why vertical AI agents are going to really flourish. I mean, same thing for voice. It's just very different workflows if you're building the, I don't know, voice agent to do customer support for an airline: very different from doing it for a bank, very different from doing it for a B2B SaaS company, et cetera.
- GTGarry Tan
Yeah, I guess that question of is there going to be, um, pure horizontal integration is sort of like saying, "Will there only be one website?" (laughs)
- HTHarj Taggar
Yeah. (laughs) I'd say there's just going to be both. There'll be horizontal infrastructure companies that do really well, and vertical applications. Because to say otherwise would be like saying, oh, Stripe powers payments on the internet, and it's also just going to have all the most valuable applications that accept payments on the internet. It's just not how it works. Like, there's enough value in just being the horizontal infrastructure layer. So I'm sure there'll be great voice AI companies that just make it really easy for you to build your own voice AI application, while there'll also be hundreds of really valuable vertical apps.
- DHDiana Hu
What are, what are, what are other trends that we've seen besides, uh, voice?
- HTHarj Taggar
We were talking about robotics earlier.
- 23:17 – 25:57
Robotics is on the rise
- HTHarj Taggar
We are certainly working with more founders building robots this year than I think any year-
- JFJared Friedman
Ever in history, yes.
- HTHarj Taggar
(laughs) ... ever. Uh, what's driving that?
- GTGarry Tan
I have an ex-Apple team called Weave Robotics, and they're going to try to ship a real robot in 2025. It costs about $65,000, $70,000, (laughs) but that's actually what it costs to have the actuators and the safety needed to actually have it work in your home. I think it's actually driven by this idea that, um, the LLM itself can be sort of the consciousness of the robot. Like, "Am I doing the thing that my owner needs me to do?" Um, you know, "How do I actually interact with them and the other people in the household?" Uh, but it's funny, because then the vision-language-action model that might actually do a certain thing, like fold laundry, is almost tool use inside of the broader LLM consciousness. So, I feel like that's one of the things I'm excited to see. You know, will it really work? And I think we're going to find out this year.
- DHDiana Hu
I guess the way I think about it, robotics is basically half AI and half hardware, and half of that equation is starting to work.
- GTGarry Tan
Well, the hardware's still hard. (laughs)
- DHDiana Hu
Yes.
- GTGarry Tan
The hardware is still very expensive. Yeah, and there's some evidence that being able to actually do laundry, for instance, might be one of the first things that gets shipped.
- HTHarj Taggar
I think the dream case for startups is going to be that you can build just the AI or software piece of it, run it on commodity hardware, and do really great things. The opposite case would be that you need to be good at both the hardware and the software, that they're coupled together and you need to produce both. And you would expect Tesla to be (laughs) the obvious-
- GTGarry Tan
(laughs)
- HTHarj Taggar
... like, winner in this space. And it remains to be seen. I'm pretty optimistic. I feel we have multiple companies that are trying to be creative about how to run the models on commodity hardware for specific use cases.
- GTGarry Tan
It still feels early. It feels like robotics hasn't quite hit its, like, ChatGPT moment yet.
- HTHarj Taggar
No.
- DHDiana Hu
Maybe the moment is that self-driving cars have been working in San Francisco. I don't think it's talked about enough.
- GTGarry Tan
People who don't live in San Francisco often don't realize the extent to which these are fully deployed here, and regular people are riding them every single day.
- HTHarj Taggar
Yup. I saw Tony from DoorDash recently, and he said he exclusively uses Waymo, like, everywhere.
- GTGarry Tan
Hmm.
- HTHarj Taggar
I live in Palo Alto and have no option for it, but I-
- GTGarry Tan
(laughs)
- HTHarj Taggar
... would love to. It'd be amazing.
- GTGarry Tan
I mean, the wild thing is there are only a few thousand of these deployed right now in the entire world, and how lucky is it-
- DHDiana Hu
They're all in San Francisco.
- GTGarry Tan
Yeah. What about big flops for 2024?
- 25:57 – 27:54
What were the big flops of 2024?
- HTHarj Taggar
(laughs)
- GTGarry Tan
I, you know, I seem to remember that we, uh, started one of our Light Cone episodes-
- HTHarj Taggar
(laughs)
- GTGarry Tan
... all wearing, uh, Apple Vision Pros and-
- HTHarj Taggar
(laughs)
- GTGarry Tan
... Quests. And, uh, we have not talked about AR since. (laughs) Diana, what happened?
- DHDiana Hu
(laughs) Uh, it hasn't happened. A lot of the hardware needs to be a lot more lightweight. Like, we need to get to this form factor, but there are actual physics constraints to fitting all that hardware in such a small form factor. Getting enough compute and the optics to fit is super challenging, and I think there's still more actual engineering and physics that needs to be discovered. And that's it. I think the algorithms are there, but there are just lots of really hard hardware and optics problems.
- GTGarry Tan
And it's a tough chicken-and-egg problem, because there's not enough hardware in people's hands for it to be worth it for app developers to build apps, and so there are not enough apps for people to want to buy the hardware. (laughs) It's just-
- HTHarj Taggar
And I feel like the people-
- GTGarry Tan
... very-
- HTHarj Taggar
... who did buy it, the killer application so far seems to be using it as, um, a really large monitor. (laughs) Um, but it doesn't-
- GTGarry Tan
And it does work very well for that.
- HTHarj Taggar
(laughs)
- DHDiana Hu
For watching movies.
- GTGarry Tan
You've actually retained this as a user, Garry, right?
- HTHarj Taggar
There's some uses.
- GTGarry Tan
Yeah.
- DHDiana Hu
Yeah. Yeah.
- GTGarry Tan
(laughs)
- DHDiana Hu
Maybe the one device that I've actually been playing with that feels good is the Meta Ray-Ban.
- HTHarj Taggar
Oh, yeah.
- DHDiana Hu
It doesn't have any of the actual displays, but I really like it for the audio-
- GTGarry Tan
Yeah.
- DHDiana Hu
... and voice. And one workflow I've been trying out is using the Meta Ray-Ban and connecting it to, uh, any of the voice modes for either ChatGPT or Claude, and kind of having a conversation with it-
- GTGarry Tan
Hmm.
- DHDiana Hu
... about a topic.
- HTHarj Taggar
Oh, I haven't tried that.
- GTGarry Tan
It's just with you all the time.
- HTHarj Taggar
That's an interesting idea. Yeah, yeah. That's a great idea.
- DHDiana Hu
That is, like, a fun thing that I've been doing, just chatting with myself. Maybe I look a little bit like a crazy person-
- 27:54 – 29:00
AI coding really broke out in 2024
- DHDiana Hu
Oh, yeah.
- GTGarry Tan
2024 was the year that AI coding really broke out.
- HTHarj Taggar
Yep.
- GTGarry Tan
I mean, the majority of YC founders now use Cursor or other AI IDEs. They just, like, exploded over the summer. Devin proved that you could, like, fully automate large programming tasks. Yeah, all of that was this year. That's pretty wild.
- HTHarj Taggar
Replit agents continue to improve. Like, I'm hearing more anecdotal stories of people building Replit apps on, like, their way home from work-
- GTGarry Tan
Yeah.
- HTHarj Taggar
... and being really impressed.
- GTGarry Tan
Like, Replit took this technology and popularized it among, like, non-technical people for the first time. That's really crazy.
- DHDiana Hu
And an even less technical version is, uh, Anthropic's Artifacts, where you can actually prototype very simple apps and chat with Claude to build really simple front pages. And then you can prototype stuff as a PM and show it to your engineering team, and it's like a full-fledged working version.
- GTGarry Tan
Yeah, it's wild, because it just means that one person can do so much more. And do you think it's going to change the nature of how startups are
- 29:00 – 34:43
Is startup hiring going to change?
- GTGarry Tan
actually hiring? Are you seeing this yet? Like, some of the founders I've met who recently raised their seed rounds coming out of YC, um, they're not really approaching it how maybe the classic advice would teach them. In the past, you might say, "Let me try to hire more people. There are certain tasks where normally I have to find the person who did it at my competitor, who did all of customer success, and I need to find the person who's under the person who runs that function, and I've got to hire that person and promote them. And they're going to come with all this knowledge and these people networks." Some people are saying sort of the opposite, which is, "I'm going to get my software engineers to write more processes that use LLMs up front. And, you know, I probably will end up needing to hire that person, but maybe after the Series B or C, and not right now."
- DHDiana Hu
Yeah, I think I've seen that as well with companies after the batch, where they're looking for engineers who have more upside and are really fully native with the AI coding stack. And one of the clever interview tricks I've seen is people do pair programming and watch them use the tools. You can really tell if someone has really tinkered with them. There's actually an engineer at Cat who is not only good at coding, but also at prompting, and at telling when the AI output is not correct. I think the skill of reading and evaluating the output of all these AI coding agents is actually a lot more critical.
- GTGarry Tan
Yeah, there's been an interesting controversy this past year about AI coding agents and programming interviews, because AI coding agents basically broke (laughs)-
- DHDiana Hu
Mm-hmm.
- GTGarry Tan
... the standard programming interviews that companies have been doing for years. Actually, Harj, I'm curious what you think about this, since you ran a-
- DHDiana Hu
(laughs)
- GTGarry Tan
... programming interview company. (laughs)
- HTHarj Taggar
Yeah. (laughs) Uh-huh. I mean, I guess the interesting debate is whether you should penalize or prevent people who are interviewing at your company from using Cursor or one of these tools, um, to ace your programming interview, or whether you should just lean into it, adapt, and test to see how productive they are. Um, I generally think the way these things tend to go is more in that direction: you'll just be measured on your absolute output, and the bar will go up. I think, like, Stripe, for example, were early on this about a decade or so ago, when they recognized that so much of what they needed their programmers to do was build web applications and web software, not solve hard CS problems. And so the industry shifted away from the Google-style interview of lots of computer science problems and whiteboarding to just giving someone a laptop and making them build, like, a to-do app in four hours. So I think we'll see the same thing happen, where the industry will just adjust: you'll be interviewed using these tools and just be expected to do a lot more in, like, a two-hour interview than you are today. To your point, Garry, around the startups, like maybe how many people they need to hire or how they scale, it seems too early to see dramatic effects on that yet. But one thing that I'm interested in is, uh... I saw an interview with Jeff Bezos recently, and he said that, well, one, he's back at Amazon working on AI, and two, that apparently Amazon itself has, like, a hundred, or maybe it was a thousand, it was a surprisingly large number of internal LLM-powered applications, presumably to just run Amazon. The last time Amazon took something it ran for internal infrastructure (laughs) and released it to the world was AWS, which completely changed how startups are built.
So I'm curious to see if they have interesting applications for running Amazon internally that they'll just release out, and suddenly there'll be new stacks to build and scale your companies on, and we'll see the, uh, 10-person or even one-person unicorn that we talked about in recent episodes.
- DHDiana Hu
One of the applications they did talk about is a giant migration from an old version of a programming language. Whenever you need to upgrade versions of languages, databases, et cetera, it's a lot of work, and they used LLMs for it.
- HTHarj Taggar
Yeah.
- DHDiana Hu
It was, like, changing hundreds of thousands of lines of code, and it would have been an engineering project of six months or more. It was done in weeks.
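For listeners curious what that kind of automated migration looks like mechanically, here is a minimal toy sketch. This is not Amazon's actual tooling; the `upgrade_source` function below stands in for the real LLM call with a trivial deterministic rewrite (modernizing Python 2 `print` statements), just to show the walk-rewrite-verify loop shape.

```python
from pathlib import Path

def upgrade_source(code: str) -> str:
    """Stand-in for an LLM call that rewrites a file to a newer
    language version. Here it only converts Python 2 `print x`
    statements to `print(x)`; a real pipeline would prompt a
    model with the file contents and a migration instruction."""
    out = []
    for line in code.splitlines():
        stripped = line.strip()
        if stripped.startswith("print ") and not stripped.startswith("print("):
            indent = line[: len(line) - len(line.lstrip())]
            out.append(f"{indent}print({stripped[len('print '):]})")
        else:
            out.append(line)
    return "\n".join(out)

def migrate_repo(root: Path) -> int:
    """Apply the upgrade to every .py file under root.
    Returns the number of files changed."""
    changed = 0
    for path in root.rglob("*.py"):
        old = path.read_text()
        new = upgrade_source(old)
        if new != old:
            path.write_text(new)  # in practice: run the test suite before committing
            changed += 1
    return changed
```

The interesting engineering in a real system is not the rewrite itself but the verification loop around it: running tests after each change and feeding failures back to the model.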
- JFJared Friedman
I mean, Amazon's just such a perfect use case for like LLM-powered agents doing back office processes. They must have just like-
- HTHarj Taggar
Yeah.
- JFJared Friedman
... just an absolute goldmine of opportunities. (laughs)
- HTHarj Taggar
Yeah.
- DHDiana Hu
And they just launched their, uh, big foundation model, actually, which is starting to top some of the benchmarks as well. So I think they're trying to be another contender in this race.
- HTHarj Taggar
Yep. (laughs)
- GTGarry Tan
That's interesting because, uh, from the bottom up, certainly from some of the people who still work at Amazon, maybe right out of college, many of them do not have access to LLMs or are actually barred from using them in their day-to-day work. So-
- HTHarj Taggar
(laughs)
- GTGarry Tan
... you know, maybe that's one of the downsides of organizations when they get big enough: the future is already here, but it is not evenly distributed, even within the same organization.
- HTHarj Taggar
Interesting.
- GTGarry Tan
But that bodes well for, uh, both open source and, uh, self-hosting LLMs. Like, it's on my to-do list to build my own stack of Mac minis-
- HTHarj Taggar
Yeah.
- GTGarry Tan
... and, uh, run LLaMA on my own little cluster on my desk.
- HTHarj Taggar
I bought all the hardware to build my own-
- DHDiana Hu
Yeah.
- HTHarj Taggar
... machine, but then we had a baby and it hasn't happened yet. (laughs)
- JFJared Friedman
(laughs)
- GTGarry Tan
(laughs)
- HTHarj Taggar
But maybe at some point.
- 34:43 – 36:50
YC in person Demo Day is back!
- GTGarry Tan
You know, YC has been operating back in person in San Francisco for some time, but we got a real live Demo Day-
- DHDiana Hu
Yeah.
- GTGarry Tan
... all the way back. So no more Zoom Demo Days, no more Zoom Alumni Demo Days. You know, we did Alumni Demo Day right here in this, uh, office, right downstairs. That was awesome. And then we took over the Masonic Center with 1,200 investors, uh, all in one room. It was actually really great for the founders, I thought, because it was about, uh, a third as many founders as the summer batch and more than 2x, maybe 3x, the number of investors who had come to our investor reception party.
- DHDiana Hu
So it was a ratio of roughly 10 investors for every one company.
- GTGarry Tan
Yeah. So, uh, I think all of them had a really good time.
- JFJared Friedman
I'd almost forgotten how great the energy of an in-person Demo Day is. Like, it's just not something that you can replicate over Zoom. The YC Demo Days also always acted as the de facto investor reunion in Silicon Valley because it's the one event that all the investors would reliably show up at. And so they were really excited that we had brought it back because- (laughs)
- GTGarry Tan
(laughs)
- JFJared Friedman
... when- when- when we weren't doing it, there was no (laughs) equivalent event.
- DHDiana Hu
Sort of the homecoming for Silicon Valley.
- HTHarj Taggar
Exactly.
- GTGarry Tan
Yeah, so now it's, uh, you know, four times a year and it's the one time that, uh, all the top early stage investors in the world are going to come back to San Francisco for hopefully that week's festivities, culminating in our Demo Day. So it's a real celebration.
- HTHarj Taggar
It feels like in-person in general is back. That's certainly another theme of 2024. Certainly among the late-stage startups we've been meeting with and speaking to this year, one of the highest-priority items has been figuring out how to get everyone back in person, back into the office. I think the era of "it's going to be remote forever" is definitely gone. I certainly think-
- GTGarry Tan
Good riddance.
- HTHarj Taggar
Yeah, exactly, right?
- DHDiana Hu
(laughs)
- HTHarj Taggar
And then finally, yeah, in-person is back and San Francisco is back, a lot of thanks to you, Garry.
- GTGarry Tan
(laughs)
- JFJared Friedman
(laughs)
- HTHarj Taggar
The elections recently seem to have gone well. Like, there's a lot
- 36:50 – 38:10
San Francisco optimism + outro
- HTHarj Taggar
of optimism, I feel, around San Francisco and- and-
- GTGarry Tan
Yeah, we have a new mayor. Uh, we're hoping that he does the right things, and, um, you know, we have a very thin moderate majority on the Board of Supervisors, uh, but we did get rid of some of the worst people-
- HTHarj Taggar
(laughs)
- GTGarry Tan
... who created a doom loop in San Francisco, so I'm optimistic. You know, we didn't get everything we wanted, but, uh, it's tracking in the right direction. And I think, as in startups, as in politics: you always way overestimate what you will get done in one year, but you always way underestimate what's going to happen in 10 years. I think it's going to take 10 years. It's going to take 20 years. But, um, just as startups went from 15 companies a year that could possibly make it to $100 million a year to 1,500 in any given year, knock on wood, I think San Francisco needs to be the beacon for all the smartest people in the world. And that's actually probably the thing that I'm most hopeful for: that we can actually just keep building. So from all of us to all of you watching, happy holidays, and we'll see you in the new year. (instrumental music)
Episode duration: 38:11
Transcript of episode z0wt2pe_LZM