All-In Podcast

E167: Google's Woke AI disaster, Nvidia smashes earnings (again), Groq's LPU breakthrough & more

(0:00) Bestie intros: Banana boat!
(2:34) Nvidia smashes expectations again: understanding its terminal value and bull/bear cases in the context of the history of the internet
(27:26) Groq's big week, training vs. inference, LPUs vs. GPUs, how to succeed in deep tech
(49:37) Google's AI disaster: Is Google too woke to function as search gets disrupted by AI?
(1:17:17) War Corner with Sacks

Follow the besties:
https://twitter.com/chamath
https://twitter.com/Jason
https://twitter.com/DavidSacks
https://twitter.com/friedberg

Follow the pod:
https://twitter.com/theallinpod
https://linktr.ee/allinpodcast

Intro Music Credit:
https://rb.gy/tppkzl
https://twitter.com/yung_spielburg

Intro Video Credit:
https://twitter.com/TheZachEffect

Referenced in the show:
https://www.google.com/finance/quote/NVDA:NASDAQ
https://twitter.com/KobeissiLetter/status/1760680756689748478
https://investor.nvidia.com/news/press-release-details/2024/NVIDIA-Announces-Financial-Results-for-Fourth-Quarter-and-Fiscal-2024
https://www.statista.com/statistics/1120484/nvidia-quarterly-revenue-by-specialized-market
https://www.google.com/finance/quote/SPY:NYSEARCA?comparison=NASDAQ%3AQQQ&window=5D
https://www.marketwatch.com/story/wall-street-keeps-likening-nvidia-to-dot-com-era-cisco-is-the-comparison-justified-eed307c1
https://twitter.com/JayScambler/status/1759372542530261154
https://twitter.com/chamath/status/1760343973632291212
https://x.ai
https://twitter.com/TheTranscript_/status/1760436281438314545
https://artificialanalysis.ai
https://www.contraline.com/product
https://www.cafexapp.com/commercial
https://blog.google/technology/ai/google-gemini-ai
https://blog.google/products/gemini/bard-gemini-advanced-app
https://workspace.google.com/blog/product-announcements/gemini-for-google-workspace
https://twitter.com/benthompson/status/1760452419627233610
https://twitter.com/Patworx/status/1760189582870536408
https://twitter.com/micsolana/status/1760163801893339565
https://ai.google/responsibility/principles
https://en.wikipedia.org/wiki/Reinforcement_learning_from_human_feedback
https://twitter.com/Jason/status/1760780139476992062
https://twitter.com/paulg/status/1760416051181793361
https://twitter.com/chamath/status/1760729719094563019

#allin #tech #news

Jason Calacanis (host) · Chamath Palihapitiya (host) · David Friedberg (host)
Feb 23, 2024 · 1h 20m · Watch on YouTube ↗

EVERY SPOKEN WORD

  1. 0:00–2:34

    Bestie intros: Banana boat!

    1. JC

      All right, everybody. Welcome back to your favorite podcast of all time, the All In Podcast, episode 160 something. With me again, Chamath Palihapitiya. He's a CEO of a company and he invests in startups, and, uh, his firm is called Social Capital. We also have David Friedberg, The Sultan of Science. He's now a CEO as well. And we have David Sacks from Craft Ventures in some undisclosed hotel room somewhere. How we doing, boys?

    2. CP

      Good. Thank you. This is an odd intro.

    3. DS

      Ah, could your intro be any more low energy and dragged out?

    4. JC

      (laughs) I'm sick. What do you want me to do? You want me to drink a-

    5. DS

      Geez, try and fake the effort.

    6. JC

      ... throat lozenge? All right here, give, give me one more shot. Watch this. (clears throat)

    7. DS

      (laughs)

    8. JC

      Watch this. Watch profession- You want professionalism? Here we go.

    9. DS

      Fake the effort, come on.

    10. JC

      Here we go. You want professionalism? I'll show you guys professionalism.

    11. CP

      Is that Binaca? What was that?

    12. JC

      This?

    13. CP

      Is that Binaca?

    14. JC

      Oh, it is a secret.

    15. CP

      (laughs)

    16. JC

      (laughs)

    17. CP

      (laughs)

    18. JC

      Banana boat.

    19. NA

      We're going all in. Let your winners ride. Rain Man David Sacks. We're going all in. And I said we open sourced it to the fans and they've just gone crazy with it. Love you guys. Queen of Quinoa. I'm going all in.

    20. JC

      All right, everybody. Welcome to the All In Podcast, episode 167, 168.

    21. CP

      (laughs)

    22. JC

      With me, of course, the Rain Man himself, David Sacks, the Dictator Chairman, Chamath Palihapitiya, and our Sultan of Science, David Friedberg. How we doing, boys?

    23. CP

      Great.

    24. DS

      Oh, great.

    25. CP

      How are we doing?

    26. JC

      Is that high energy enough for you?

    27. CP

      Yeah.

    28. DS

      Is it 167 or 168?

    29. JC

      I don't know. Who cares?

    30. DS

      Can we at least get you to know the episode number?

  2. 2:34–27:26

    Nvidia smashes expectations again: understanding its terminal value and bull/bear cases in the context of the history of the internet

    1. JC

      All right, everybody. We got a lot to talk about today. Apologies for my voice, getting a little bit of a cold. NVIDIA blew the doors off their earnings for the third straight quarter. Shares were up 15% on Thursday, representing a nearly $250 billion jump in market cap. So, let's just let that sink in for a second. This is the largest single-day gain in market cap ever: $247 billion added. Previously, META did something similar earlier this year. Remember, everybody was down on that stock because they were doing all the crazy stuff with Reality Labs, and then they got focused and laid off 20,000 people. They added $196 billion. In other words, they added, like, two and a half Airbnbs to their valuation. But let's just get to the results. The results are absolutely stunning and, dare I say, unprecedented. Q4 revenue: $22.1 billion. That's up 22% quarter over quarter, up 265% year over year. Net income was $12.3 billion, 9X year over year. And the gross margin of 76% was up two points quarter over quarter and 12.7 points year over year. But look at this revenue ramp. This is extraordinary. In Q1 of fiscal 2024, this juggernaut starts, and it does not stop, and it doesn't look like it's gonna stop: just a run-up from seven billion all the way to 22 billion in revenue for the quarter. Absolutely extraordinary. And, uh, if you wanna know why this is happening, why NVIDIA is putting up these kinds of numbers, this chart explains everything. This is all about data centers. Obviously, if you heard of NVIDIA before the AI boom, it was gaming, professional visualizations, you know, I think people making movies and stuff like that. Autos, uh, used NVIDIA for self-driving, that kind of stuff. But if you look at this chart, you'll see data centers, starting four quarters ago, just ramp up as everybody builds out the infrastructure for new data centers to deal with generative AI.
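The growth rates quoted here can be sanity-checked with quick arithmetic. The prior-period figures below (Q3 FY24 ≈ $18.1B, Q4 FY23 ≈ $6.05B) are taken from Nvidia's earlier earnings reports; treat this as a back-of-envelope check, not a restatement of results.

```python
# Back-of-envelope check of the quoted Nvidia growth rates.
q4_fy24_revenue = 22.1   # $B, the quarter under discussion
q3_fy24_revenue = 18.12  # $B, prior quarter
q4_fy23_revenue = 6.05   # $B, same quarter a year earlier

qoq_growth = (q4_fy24_revenue / q3_fy24_revenue - 1) * 100
yoy_growth = (q4_fy24_revenue / q4_fy23_revenue - 1) * 100

print(f"QoQ: {qoq_growth:.0f}%")  # ~22%, matching the quoted figure
print(f"YoY: {yoy_growth:.0f}%")  # ~265%, matching the quoted figure
```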

    2. DS

      So, just to add one point here, Jason.

    3. JC

      Yeah.

    4. DS

      So, what you can see is that NVIDIA was around for a long time and it was making these chips, these GPUs, as opposed to CPUs, and they were primarily used by games and by virtual reality software, because GPUs are better obviously at graphical processing. They use vector math to create these, like, 3D worlds. And this vector math that they use to create these 3D worlds is also the same vector math that AI uses to reach its outcome. So, with the explosion of LLMs, it turns out that these GPUs are the right chips that you need for these cloud service providers to buil- build out these big data centers to serve now all of these new AI applications. So, NVIDIA was in the perfect place at the perfect time, and that's why this has exploded and the, what you're seeing is the buildout of this new cloud service infrastructure for, for AI.
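Sacks' point about shared math can be made concrete: a 3D graphics transform and a neural-network layer are both, at bottom, matrix multiplies, which is exactly the workload GPUs parallelize well. A minimal sketch in NumPy (the specific matrices are illustrative):

```python
import numpy as np

# Graphics: rotate a 3D point 90 degrees around the z-axis — a matrix-vector multiply.
theta = np.pi / 2
rotation = np.array([
    [np.cos(theta), -np.sin(theta), 0],
    [np.sin(theta),  np.cos(theta), 0],
    [0,              0,             1],
])
point = np.array([1.0, 0.0, 0.0])
rotated = rotation @ point  # the point [1, 0, 0] rotates to ~[0, 1, 0]

# AI: one layer of a neural net — the same matrix-vector multiply,
# just with learned weights and a nonlinearity applied on top.
weights = np.random.randn(4, 3)          # illustrative random weights
activation = np.maximum(0, weights @ point)  # ReLU(W·x)
```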

    5. JC

      Yeah, and, um, i- and also helping the stock is the fact that they bought back 2.7 billion worth of their shares as part of a $25 billion buyback plan. But this company's firing on all cylinders. Revenue's obviously ripping as people put in orders to replace all of the data centers out there, or at least augment them with this technology, with GPUs, A100s, H100s, et cetera. The gross margin's been expanding. They have huge profits. And they're still projecting more growth in Q1, around 24 billion, which would be a 3X increase year-over-year. And this obviously has made the entire market rip. As NVIDIA goes, so does the market right now. And, uh, the S&P 500 and NASDAQ are at record highs at the time of this taping. Chamath, your general thoughts here on something I don't think anybody saw coming, except for you and your investment in Groq maybe, and a couple of others.

    6. CP

      I think what I would tell you is that the bigger principle, and we've talked about this a lot, Jason, is that in capitalism, when you over-earn for enough of a time, what happens is competitors decide to try to compete away your earnings. In the absence of a monopoly, the amount of time that you have tends to be small and it shrinks. So, in the case of a monopoly, for example, take Google, you can over-earn for decades and it takes a very, very long time for somebody to try to displace you. We're s- just starting to see the beginnings of that with things like Perplexity and other services that are chipping away at the Google monopoly. But at some point in time, all of these excess profits are competed away. In the case of NVIDIA, what you're now starting to see is them over-earn in a very massive way. So, the real question is who will step up to try to compete away those profits? The old Bezos quote, right? "Your margin is my opportunity." And I think we're starting to see, and you've mentioned Groq, who had a super viral moment, I think, this week, but you're starting to see the emergence of a more detailed understanding of what this market actually means. And as a result, who will compete away the inference market, who will compete away the training market, and the economics of that are just becoming known to now more and more people.

    7. JC

      Friedberg, your thoughts? We were talking, I think, it was last week or the week before, about the possibility of NVIDIA being a $10 trillion company, the largest company in the world. What are your thoughts on these spectacular results? And then Chamath's point, everybody is watching this going, "Hmm, maybe I can get a slice of that pie and maybe I can create a more competitive offering." Obviously, we saw Sam Altman rumored to be raising seven trillion, which feels like a fake number. It feels like that's maybe the market size or something, but your thoughts here?

    8. DF

      I don't think anything's changed on the NVIDIA front. There's this accelerated compute buildout underway in data centers. Everyone's building infrastructure, and then everyone's trying to build applications and tools and services on top of that infrastructure. The infrastructure buildout is kind of the first phase. The real question ultimately will be, does the initial cost of the infrastructure exceed the ultimate value that's gonna be realized on the application layer? In the early days of the internet, a lot of people were buying Oracle servers. They were like 3,000 bucks a server and they were running these Oracle servers out of an internet-connected data center. And it, you know, took a couple of years before folks realized that for large-scale distributed compute applications, you're better off using cheaper hardware. You know, cheaper server racks, cheaper hard drives, cheaper buses, and assuming a shorter lifespan on those servers, and you could cycle them in and out. And you didn't need the redundancy, you didn't need the certainty, you didn't need the, the runtime guarantees. And so you could use a lower cost, higher failure rate, but much, much net lower cost kind of approach to building out a data center for internet serving. And so the Oracle servers didn't really take the market, and early on everyone thought that they would. So, I think Chamath's point is right. Now, NVIDIA has been at this for a very long time and the real question is how much of an advantage do they have, particularly that there is this need to use fabs to build replacement technology? So, over time, will there be better solutions that use hardware that's not as good, but the software figures out and they build new architecture for running on that hardware in a way that kind of mimics what we saw in the early days of the buildout of the internet? So, um, TBD, right? The same is true in, in, in switches, right? 
So, in networking, a lot of the high-end, high-quality networking companies got beaten up when lower cost solutions came to market later, and so they looked like they were gonna be the biggest business ever. I mean, you could look at Cisco during the early days of the internet buildout, and everyone thought Cisco was, uh, the picks and shovels of the internet, and they were gonna make all the, all the value's gonna accrue to Cisco. So, we're kind of in that same phase right now with NVIDIA. The real question is, is this gonna be a much harder hill to compete on than we've ever seen? Given the development cycle on chips and the requirement to use these fabs to build chips, it may be a harder hill to kind of get up. So, we'll see.

    9. JC

      Sacks, your thoughts? Do you think, um, we're getting to the point where maybe we'll have bought too many of these, uh, built out too much infrastructure, and it'll take time for the application layer, as Friedberg was alluding to, to monetize it?

    10. DS

      Well, I think the question everyone's asking right now is are, are these results sustainable? Can NVIDIA keep growing at these astounding rates? You know, will the buildout continue? And the comparison everyone's making is to Cisco, and there's this chart that's been going around overlaying the NVIDIA stock price on the Cisco stock price. And you can see here, the orange line is NVIDIA and the blue line is Cisco, and it's almost like a, a perfect match. Now, what happened is that at a similar point in the original buildout of the internet, of the dot-com era, you had the market crash at the end of March of, uh, 2000. And Cisco never really recovered from that peak valuation. Um, but I think there's a lot of reasons to believe NVIDIA is different. One is that if you look at NVIDIA's multiples, they're nowhere near where Cisco's were back then. So, the market in 1999 and early 2000 was way more bubbly than it is now. So, NVIDIA's valuation is much more grounded in real revenue, real margins, real profit.Second, you have the issue of competitive moat. Cisco was selling servers and networking equipment. Fundamentally, that equipment was much easier to copy and commoditize than GPUs. These GPU chips are really complicated. I think Jensen made the point that their Hopper 100 product, he said, you know, "Don't even think of it just like a chip. There's actually 35,000 components in this product, and it weighs 70 pounds."

    11. JC

      Yeah.

    12. DS

      This is more like a mainframe computer or something that's dedicated to processing.

    13. JC

      Yeah, it's somewhere between a rack server and the entire rack. (laughs) Yeah.

    14. DS

      Right.

    15. JC

      It's giant and it's heavy and it's complex.

    16. DS

      Right.

    17. JC

      It does say something here, Chamath, I think, about how well-positioned big tech is in terms of seeing an opportunity and quickly mobilizing to capture that o- opportunity. These servers are being bought by, you know, people like Amazon, I'm sure Apple, obviously Facebook, Meta. I don't know if Google is buying them as well. I would assume so. Tesla. So everybody's buying these things, uh, and they have tons of cash sitting around. It is pretty amazing how nimble the industry is, and this opportunity feels like everybody is looking at it like mobile and cloud. "I have to get mobilized quickly to not get disrupted."

    18. CP

      You're bringing up an excellent point, and I w- I would like to tie it together with Friedberg's point. So, at some point, all of this spend has to make money, right? Otherwise, you're, you're gonna look really foolish for having spent 20 and 30 and $40 billion. So Nick, if you just go back to the, to the revenue slide of Nvidia, I can try to give you a framing of this, at least the way that I think about it. So if, if you look at this, like, what you're talking about is, look, who is gonna spend $22.1 billion? Well, you said it, Jason, it's all big tech. Why? Because they have that money on the balance sheet sitting idle. But when you spend $22 billion, their investors are going to demand a rate of return on that.

    19. JC

      Mm-hmm.

    20. CP

      And so if you think about what a reasonable rate of return is, call it 30, 40, 50%, and then you factor in... And that's profit. And then you factor in all of the other things that need to support that, that $22 billion of spend needs to generate probably $45 billion of revenue.

    21. JC

      Mm-hmm.

    22. CP

      And so Jason, the question, to your point, and to Friedberg's point, the $64,000 question is, who in this last quarter is gonna make 45 billion on that 22 billion of spend? And again, what I would tell you, to be really honest about this, is that what you're seeing is more about big companies muscling people around with their balance sheet and being able to go to Nvidia and say, "I will give you committed pre-purchases over the next three or four quarters," and less about, "Here is a product that I'm shipping that actually makes money, which I need enormous, more compute resources for." It's not the latter. Most of the apps, the overwhelming majority of the apps that we're seeing in AI today are toy apps that are run as proofs of concept and demos and run in a sandbox. It is not production code. This is not, "We've rebuilt the entire autopilot system for the Boeing and it's now run with agents and bots and all of this training." That's not what's happening. So, it is a really important question. Today, the demand is clear. It's the big guys with huge gobs of money. And by the way, Nvidia is super smart to take it, because they can now forecast demand for the next two or three quarters. I think we still need to see the next big thing. And if you look at the past, what the past has shown you is that the big guys don't really invent the new things that make a ton of money. It's the new guys who, because they don't have a lot of money and they have to be a little bit more industrious, come up with something really authentic and new.
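The back-of-envelope behind "$22 billion of spend needs about $45 billion of revenue" can be written out explicitly. Purely illustrative: the 50% required return and the assumption that supporting costs add roughly another half of the capex are inferred from the framing above, not reported figures.

```python
# Illustrative version of the "$22B of spend needs ~$45B of revenue" math.
capex = 22.1                    # $B spent on GPUs in the quarter
required_return = 0.50          # investors' demanded profit on that spend (assumed)
supporting_costs = 0.5 * capex  # power, data centers, staff, etc. (assumed)

required_profit = capex * required_return
required_revenue = capex + required_profit + supporting_costs
print(f"${required_revenue:.0f}B")  # ~$44B, in line with the quoted ~$45B
```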

    23. JC

      Yeah, constraint makes for great art.

    24. DS

      Yeah.

    25. CP

      We haven't seen that yet.

    26. JC

      Yeah.

    27. CP

      So, I think the revenue scale will continue for like the next two or three years probably, for Nvidia. But the real question is, what is the terminal value? And it's the same thing that Sax showed in that Cisco slide. People ultimately realized that the value was gonna go to other parts of the stack, the application layer, and as more and more money was accrued at the application layer of the internet, less and less revenue multiple and credit was given to Cisco. And that's nothing against Cisco, because their revenue continued to compound, right? And they did an incredible job, but the valuation got cut.

    28. JC

      So, Friedberg, if we're looking at this chart, the winner of Netflix... No, the winner of the Cisco chart might in fact be somebody like Netflix. They actually got, you know, hundreds of millions of consumers to give them cash.

    29. DF

      Google and Facebook.

    30. JC

      And then you have Google and Facebook as well generating all that traffic, and then YouTube, of course. Who do you see the winner here as in terms of the application layer? Who are the billion customers here who are gonna spend 20 bucks a month, five bucks a month, whatever it is?

  3. 27:26–49:37

    Groq's big week, training vs. inference, LPUs vs. GPUs, how to succeed in deep tech

    1. JC

      All right, Groq also had a huge week. That's Groq with a Q, not to be confused with Elon's Grok with a K. Chamath, you've talked about Groq on this podcast a couple of times. Obviously, you were the, I guess you were the first investor, the seed investor. You pulled these LPUs and this concept out of, uh, a team that was at Google. Maybe you could explain a little bit about Groq's viral moment this week and the history of the company, which I know has been a long road for you with this company.

    2. CP

      I mean, it's been since 2016, so it, again, proving what you guys have said many times and what I've tried to live out, which is just you just gotta keep grinding.

    3. JC

      Mm-hmm.

    4. CP

      90% of the battle is just staying-

    5. JC

      Alive?

    6. CP

      ... in business, yeah, and having oxygen to keep trying things. And then eventually, if you get lucky, which I think we did, things can really break in your favor. So this weekend, you know, I've been tweeting out a lot of technical information about why I think this is such a big deal. But yeah, the, the moment came this weekend, combination of Hacker News and some other places, and essentially, we had no customers two months ago. (laughs) I'll just be honest. And between Sunday and Tuesday, we've just, we're overwhelmed.

    7. JC

      Mm-hmm.

    8. CP

      And I think, like, the last count was we had 3,000 unique customers come and try to consume our resources, from every important Fortune 500 all the way down to developers. And so I think we're very fortunate. I think the team has a lot of hard work to do, so it could mean nothing, but it has the potential to be something very disruptive. So what is it that people are glomming onto? You have to understand that, like, at the very highest level of AI, you have to view it as two distinct problems. One problem is called training, which is where you take a model and you take all of the data that you think will help train it, and you do that. You train the model. It learns over all of this information. But the second part of the AI problem is what's called inference, which is what you and I see every day as a consumer. So we go to a website like ChatGPT or Gemini, we ask a question, and it gives us a really useful answer, and those are two very different kinds of compute challenges. The first one is about brute force and power, right? If you can imagine, like, what you need are tons and tons of machines, tons and tons of, like, very high-quality networking, and an enormous amount of power in a data center so that you can just run those things for months. I think Elon publishes very transparently, for example, how long it takes to train his Grok with a K, right, model, and it's in the months. Inference is something very different, which is all about speed and cost. What you need to be, in order to answer a question for a consumer in a compelling way, is super, super cheap and super, super fast, and we've talked about why that is important. And the Groq with a Q chips turn out to be extremely fast and extremely cheap.
And so look, time will tell how big this company can get but if you tie it together with what Jensen said on the earnings call and you now see developers stress testing us and finding that we are meaningfully, meaningfully faster and cheaper than any NVIDIA solution, there's a potential here to be really disruptive. And we're a meager unicorn, right?
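The training/inference split described above maps onto two very different compute profiles, sketched here with a toy linear model in plain Python: training loops over the whole dataset many times (throughput- and power-bound), while inference is a single cheap pass per user query (latency- and cost-bound). This is a cartoon of the distinction, not how LLMs are actually implemented.

```python
# Toy illustration of the two AI workloads: training vs. inference.

# Training: many repeated passes over all the data, updating weights each time.
# This is the brute-force, power-hungry problem — months on big clusters for LLMs.
data = [(x, 2 * x + 1) for x in range(10)]  # learn y = 2x + 1
w, b, lr = 0.0, 0.0, 0.01
for epoch in range(1000):        # repeated full passes over the dataset
    for x, y in data:
        err = (w * x + b) - y
        w -= lr * err * x        # gradient step on each example
        b -= lr * err

# Inference: one cheap forward pass per user query.
# This is the speed-and-cost problem that inference hardware targets.
def infer(x):
    return w * x + b

answer = infer(10)  # ≈ 21, a single inexpensive evaluation
```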

    9. JC

      Mm.

    10. CP

      Our last valuation was like a billion something versus NVIDIA which is now like a two trillion dollar company. So there's a lot of market cap for Groq to gain by just being able to produce these things at scale, which could be just a, an enormous outcome for us. So time will tell, but a really important moment in the company and very exciting.

    11. DF

      Can I just observe like off-topic how an overnight success can take eight years? (laughs)

    12. JC

      Yeah. No, I was thinking the same line. It's a seven-year overnight success in the making.

    13. CP

      There's this class of businesses that I think are unappreciated in a post-internet era where you have to do a bunch of things right before you can get any one thing to work. And these complicated businesses where you have to stack either different things together that need to click together in a, in a stack or you need to iterate on each step until the whole system works end-to-end can sometimes take a very long time to build. And the term that's often used for these types of businesses is deep tech. And they fall out of favor because in an internet era and in a software era, you can find product market fit and make revenue and then make profit very quickly. And so a lot of entrepreneurs select into that type of business instead of selecting into this type of business where the probability of failure is very high, you have several low probability things that you have to get right in a row, and if you do, it's gonna take eight years and a lot of money, and then all of a sudden the thing takes off like a rocket ship, you've got a huge advantage, you've got a huge moat, it's hard for anyone to catch up and this thing can really, um, spin out on its own. I do think Elon is very unique in his ability to deliver success in these types of businesses. Tesla needed to get a lot of things right in a row, SpaceX needed to get a lot of things right in a row. All of these require a series of complicated steps or a set of complicated technologies that need to click together and work together, but the hardest things often output the highest value. And, you know, if you can actually make the commitment on these types of businesses and get all the pieces to click together, there's an extraordinary opportunity to build moats-

    14. JC

      Mmm.

    15. CP

      ... and to take huge amounts of market value. And I think that there's a, an element of this that's been lost in Silicon Valley over the last couple of decades as the fast money in the internet era has kind of prioritized other investments ahead of this. But I'm really hopeful that these sorts of chip technologies, SpaceX, in biotech we see a lot of this, these sorts of things can kind of become more in favor because the, the advantage as these businesses work seems to realize hundreds of billions and sometimes trillions of dollars of market value and be incredibly transformative for humanity. So I don't know, I just think it's an observation I wanted to make about the greatness of these businesses when they work out.

    16. DS

      Well, I mean, OpenAI was kind of like that for a while.

    17. CP

      Totally. Totally.

    18. DS

      I mean, it was this like wacky nonprofit that was just grinding on an AI research problem for like six years and then it finally worked and got productized into ChatGPT.

    19. CP

      Totally.

    20. DS

      But you're right, SpaceX was kind of like that. I mean, the big money maker at SpaceX is Starlink, which is the satellite network. It's basically broadband from space and it's on its way to handling I think a meaningful percentage of all internet traffic. But think about all the things you had to get right to get that working. First, you had to create a rocket. That's hard enough. Then you had to get to reusability. Then you had to create the whole satellite network. So at least three hard things in a row.

    21. CP

      Well, and then you have to get consumers-

    22. DS

      And Tesla was kinda the same way.

    23. CP

      ... to adopt it. I mean, you know.

    24. DS

      Right.

    25. DF

      Yeah.

    26. DS

      Don't forget the final step. Yeah.

    27. CP

      We had no idea where the market was. Like, early on it started in my office, and so Jonathan and I would be kind of always trying to figure out what is the initial go-to-market. And I remember I emailed Elon in, at that period when they were still trying to figure out whether they were gonna go with LIDAR or not, and we thought, "Wow, maybe we could sell Tesla the chips." You know, but, and then Tesla brought in this team just to talk to us about what the design goals were and basically said no, in a kind way, but they said no. Then we thought, "Okay, maybe it's like for high-frequency traders," right? Because like those folks wanna have all kinds of edges, and if we have these big models maybe we can accelerate their decision-making, they can measure revenue. That didn't work out. Then it was like, you know, we tried to sell to three-letter agencies. That didn't really work out. Our original version was really focused on image classification and convolutional neural nets like ResNet. That didn't work out. We ran headfirst into the fact that NVIDIA has this compiler product called CUDA, and we had to build a high-class compiler so that you could take any model without any modifications. All these things, to your point, are just points where you can just very easily give up, and then there's like, "We ran out of money." So then you raise money in a note, right? Because everybody wants to punt on valuation when nothing's working. (laughs)

    28. DS

      Yeah. Yeah. You tried six

    29. CP

      Oh my gosh.

    30. DS

      ... beachhead markets, you couldn't land the boat. Right.

  4. 49:37–1:17:17

    Google's AI disaster: Is Google too woke to function as search gets disrupted by AI?

    1. JC

      It was a very big week for Google, not in a great way. They had a massive PR mess with their Gemini, which refused to generate pictures, if I'm reading this correctly (laughs), of white people. Here's a, a quick refresher on what Google's doing in AI. Gemini is now Google's brand name for their main AI language model. You can think of that like OpenAI's GPT. Bard was the original name of their chatbot. They had Duet AI, which was Google's sidekick in the Google suite. Earlier this month, Google rebranded everything to Gemini. So Gemini is now the model, it's the chatbot, and it's a sidekick, and they launched a $20 a month subscription called Google One AI Premium. Uh, only four words. (laughs) Way to go. This includes access to the best model, Gemini Ultra, which is on par with GPT-4 according to them, uh, and generally in the marketplace. But earlier this week, users on X started noticing that Gemini (laughs) would not generate images of white people even when prompted. People were prompting it for images of historical figures that were, uh, generally white and getting kind of weird results. I asked Google Gemini to generate images of the Founding Fathers. It seems to think George Washington was Black. "Certainly, here's a portrait of the Founding Fathers of America." As you can see, it is putting-

    2. DF

      (laughs) There's an Asian guy. That's awesome.

    3. JC

      Yeah, it's just, it's making a great mashup.

    4. DF

      Mm-hmm.

    5. JC

      And, uh, yeah, we, there was, like, countless images that got created. "Generate images of the American Revolutionary..." and the response was, "Sure, here are images featuring diverse American Revolutionaries," inserting the word diverse. Sacks, I'm not sure if you watched this controversy on X. I know you spend a little bit of time on that social network. I noticed you're, you're active once in a while. Did you log in this week and, and see any of this brouhaha?

    6. DS

      Sure. It's all over X right now. I mean, look, this Gemini rollout was, was a joke. I mean, it's ridiculous. The AI isn't capable of giving you accurate answers because it's been so programmed with diversity and inclusion. And it inserts these words diverse and inclusive even in answers where you haven't asked for that, you haven't prompted it for that. So they, I think Google has now, like, yanked back the product release. I think they're scrambling now because it's been so embarrassing for them.

    7. DF

      But Sax, like, is, is it... How does this not get QA'd? Like, I don't-

    8. DS

      (laughs)

    9. DF

      ... understand how-

    10. JC

      Yeah, how did the red team not catch this? Yeah.

    11. DF

      Well, how, or anybody, or isn't there a product review with senior executives before this thing goes out that says, "Okay, folks, here it is. Have at it, try it. We're really proud of our work." And, and then they say, "Well, hold on a second, is this actually accurate? Shouldn't it be accurate?"

    12. CP

You guys remember when ChatGPT launched and there was a lot of criticism about Google and Google's failure to launch? And a lot of the observation was that Google was afraid to fail or afraid to make mistakes, and therefore they were too conservative. And as you know, in the last year to year and a half, there's been a strong effort at Google to try and change the culture and move fast and push a product out the door more quickly. And this mess now shows why Google (laughs) has historically been conservative. And I realize we can talk about this particular problem in a minute, but it's ironic to me that the Google-is-too-slow-to-launch criticism has now been answered by a launch showing that moving quickly can cause more damage than (laughs) , than good.

    13. DS

      But Google did not launch quickly.

    14. CP

Well, I will say one other thing. It seems to me ironic because I think that what they've done is they've launched more quickly than they otherwise would have, and they've put more guardrails in place, and those guardrails backfired and ended up being more damaging.

    15. DF

      But what are the guardrails? What's the guardrail here?

    16. CP

      So this is Google's AI principles. The first one is to be socially beneficial. The second one is to avoid creating or reinforcing unfair bias. So much of the effort that goes into tuning and weighting the models at Gemini has been to try and avoid stereotypes from persisting in the output that the model generates.

    17. DS

      Whereas-

    18. JC

      Telling the truth.

    19. DS

      ... telling the tru- Exactly. That's exactly what I was gonna say.

    20. JC

      Yeah. Changing society-

    21. DS

      Where is that value?

    22. JC

      ... is our second principle. (laughs) We'd like to steer a society to be better.

    23. DS

      No, I think socially beneficial is a political objective because it depends on how you perceive what a benefit is. Avoiding bias is political. Be built and tested for safety doesn't have to be political, but I think the meaning of safety has now changed to be political. By the way, safety with respect to AI used to mean that we're gonna prevent some sort of AI superintelligence from evolving and taking over the human race. That's what it used to mean. Safety now means protecting users from seeing the truth.

    24. CP

      I feel unsafe (laughs) .

    25. JC

      You feel unsafe?

    26. DS

      Yeah, because they might, they might feel unsafe, or, you know, somebody else d- uh, defines it as a violation of safety for them to see something truthful. So the first three, their first three objectives or values here are all extremely political.

    27. DF

I think any AI product, for it to be worth its salt, has to start... They can have any... I, I think that... these values are actually reasonable, that's their, that's their decision, they should be allowed to have it. But the first base-order principle of every AI product should be that it is accurate and right.

    28. JC

      Correct? (laughs)

    29. CP

      Yeah.

    30. DF

      Yeah.

  5. 1:17:17 – 1:17:35

    War Corner with Sacks

    1. DS

      George Washington.

    2. JC

      Okay, everybody. We're gonna go by chopper.

    3. CP

      W- war cor- wait, war corner. We need a war corner. (laughs)

    4. JC

      We're gonna go by chopper to the... We have our war correspondent, General David Sacks in the field. Uh, we're dropping him off now.

    5. CP

      (laughs)

    6. JC

      David Sacks in the helicopter.

    7. CP

      (laughs)

    8. DS

      (laughs)

    9. JC

      Go and tell us what's going on in the Ukraine on the front.

Episode duration: 1:20:26

Transcript of episode z6vrKA_L5pk
