No Priors Ep. 96 | With Modal CEO and Founder Erik Bernhardsson
55 min read · 11,478 words
- 0:00 – 0:22
Introduction
- EGElad Gil
(electronic music) Today, I'm chatting with Erik Bernhardsson, founder and CEO of Modal. Modal developed a serverless cloud platform tailored for AI, machine learning, and data applications. And before that, Erik worked at Better.com and Spotify, where he led Spotify's machine learning efforts and built the recommender system. Well, Erik, thanks so much for joining me today on No Priors.
- EBErik Bernhardsson
Yeah.
- 0:22 – 1:22
Erik's early interest in ML infra
- EBErik Bernhardsson
Thanks. It's great to be here.
- EGElad Gil
So, um, if I remember correctly, you worked at Spotify and helped build out their ML team and recommender system, and then were also at Better.com. What inspired you to start Modal, and what problem were you hoping to solve?
- EBErik Bernhardsson
Yeah. Uh, so I started at Spotify a long time ago, 2008, and I spent seven years there. And yeah, I, I built a music recommendation system. And back then, there was, like, nothing really, uh, in terms of data infrastructure. Hadoop was, like, the most modern thing. And so I, I spent a lot of time building a lot of infrastructure. Uh, in particular, I built a workflow scheduler called Luigi that basically no one uses today. I built a vector database called Annoy that, you know, for a brief period, people used, but no one really uses today. Uh, so I spent a lot of time building a lot of that stuff. Uh, and then later at Better, I was the CTO and thinking a lot about, like, developer productivity and stuff. And then during the pandemic, I, I took some time off and started hacking on stuff, and I realized I always wanted to build, uh, basically a better infrastructure for, for these types of things, like data, AI, and machine learning. So, so pretty quickly realized, like, this is what I wanted to do, and that, that was sort of the genesis of Modal.
- 1:22 – 4:17
Founding Modal Labs
- EGElad Gil
Mm-hmm. That's cool. How did that approach evolve, or what are the main areas that the company focuses on today?
- EBErik Bernhardsson
So I started looking into, first of all, just, like, what are the challenges with data AI and machine learning infrastructure? And, and I started thinking about from, like, a developer productivity point of view. What's a tool I want to have? And, and I realized, like, a big, a big sort of challenge is, like, working with the cloud is arguably kind of annoying (laughs) and, like, as much as, like, I love the cloud for the power that it gives me, and I've used the cloud since, you know, way back, 2009 or so, uh, it's actually pretty frustrating to work with. And so I, I... in my head, I had this idea of, like, what if you make cloud development feel almost, like, as good as local development, right? Like, it has its, like, fast feedback loops. And so I started thinking about, like, how do we build that? And realized pretty quickly, like, well, actually, we can't really use Docker and Kubernetes, so we're gonna have to throw that out and, and, and probably gonna have to build our own file system, which we did pretty early, and build our own scheduler and build our own container runtime. And so that was, like, basically the two first years of Modal, is just, like, laying all that, like, foundational infrastructure layer in place.
- EGElad Gil
Yeah. And then, um, in terms of the things that you offer today for your customers, what are the main services or products or...?
- EBErik Bernhardsson
Yeah. So, so we're an infrastructure as a service, so... which means, like, on one side, we, we run a very big compute pool, like, thousands of GPUs and CPUs, and, and we make it very easy to get. You know, if you need 100 GPUs, we can typically get you that within seconds. Uh, so, so sort of one big multi-tenant pool, which means, like, capacity planning, uh, is, is something we kind of take, you know... It's, it's something we solve for customers. They don't really need to, to think about reservations. We always provide a lot of on-demand GPUs. Uh, on the other side, there's a Python SDK that makes it very easy to, to build applications. And so the idea is, like, you write code, basically, like, functions in Python, uh, and then we take those functions, turn them into serverless functions in the cloud. We handle all the containerization and all the infrastructure stuff, so you don't have to think about all this sort of Kubernetes and Docker stuff. And the, the, the real killer app, as it turns out... Like, we started this company pre-gen AI, but as it turns out, the, the, the main thing that really started driving all the traction was, was when Stable Diffusion came out. And a bunch of people came to us and were like, "Hey, actually, this looks kind of cool. Like, you have GPU access. Uh, it's very easy to... You know, you don't have to think about, you know, spinning up machines and provisioning them." And so that was, like, our first sort of killer app, was, like, just doing gen AI in a serverless way, uh, with a focus on diffusion models. Like, now, we actually have a lot more different modalities. A lot of usage is still, like, text to image, but we also see a lot of audio and music. So, so one example of a, a customer I think is super cool, building really amazing stuff, is Suno, which does, uh, AI-generated music. So they run all their inference on Modal, uh, very large scale.
Uh, there's, there's a lot of customers like that sort of dealing with, like, you know, building cool gen AI, uh, models, uh, in particular, I would say, in the modalities of, like, audio, video, image, and, and music, stuff like that.
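[Editor's note: the programming model Erik describes — write plain Python functions, let the platform containerize them and run them serverless — can be sketched with a toy decorator. This is an illustrative sketch only, not Modal's actual SDK; the `serverless` decorator and `.remote` attribute here are hypothetical names.]

```python
import functools

def serverless(gpu=None):
    """Toy decorator: pretend to package a plain Python function
    for remote, containerized execution. (Hypothetical sketch,
    not Modal's real SDK.)"""
    def wrap(fn):
        @functools.wraps(fn)
        def remote(*args, **kwargs):
            # A real platform would serialize the arguments, schedule a
            # container (with the requested GPU, if any), run fn there,
            # and ship the result back; this toy just runs it locally.
            return fn(*args, **kwargs)
        remote.gpu = gpu  # record the requested hardware
        fn.remote = remote
        return fn
    return wrap

@serverless(gpu="A100")
def embed(text):
    # Stand-in for real model inference.
    return [float(len(word)) for word in text.split()]

print(embed.remote("hello modal world"))  # → [5.0, 5.0, 5.0]
```

Calling `embed(...)` still runs the function as ordinary local Python; the `.remote` path is where a serverless platform would interpose scheduling and containerization.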
- EGElad Gil
That's cool. And I think Suno's using all, like, transformer backbone now for stuff, right? Versus the diffusion model-based thing, so...
- EBErik Bernhardsson
I think it's... I think it's a combination of both.
- EGElad Gil
Mm-hmm.
- EBErik Bernhardsson
Uh, I'm not sure though (laughs) .
- EGElad Gil
Yeah, yeah. I think they talk about it
- 4:17 – 7:14
State of GPU use today and what’s to come
- EGElad Gil
publicly. That's the only reason I mention it. But you wrote, uh, in October, this post. I think it was called "The Future of AI Needs More Flexible GPU Capacity." And in general, what I've heard in industry is that a lot of, um, ways that people use GPU is reasonably wasteful. And so I'm a little bit curious about your view on, um, flexibility around GPU use, how much is actually used versus wasted, how much optimization is left, you know, even just with existing types of GPUs that people are using today.
- EBErik Bernhardsson
Yeah. I mean, uh, GPUs are expensive, right? And, and I think it's sort of, kind of, like, a paradox. It's like... means that cloud... You know, a lot of the cloud capacity is, like, y- y- they... You know, the only way to get it is to sign long-term commitments, uh, whi- which I think for a lot of startups is really not the right model for how things should be. Like, I think the p- the, the amazing thing about the cloud was always, to me, that, like, you have on-demand access to, like, whatever many CPUs you need. But for GPUs, like, the main way to get access, over the last few years, due to the scarcity, has been to, to sign long-term contracts. And I, I think fundamentally, that's just not how startups should do it, right? Like, you know, and I kind of get it. It's been s- sort of supply-demand issues, but, but, um, just, you know, looking at the CPU market, the, the fact that you have, like, instant access to thousands of CPUs if you need it. Like, my, my vision has always been this should be the same thing for GPUs. Uh, and, and that means, you know, especially as we shift more to inference... I think for training, it's been sort of less of an issue 'cause, like, you can sort of just, like, make use of the training resources you need. But for inst- inference especially, you don't even know how much you need, right, like, in advance. Like, it's very volatile. Uh, and so a big challenge that, that we solve for a lot of customers is we, we're fully usage-based. So when you run things on Modal, we charge you only for the time the container's actually running, uh, and, uh, and that's a, that's a... It's a massive hassle for, for customers traditionally, like, doing the capacity planning and thinking how many GPUs. And then, and then having the issue of, like, either you over-provision and you, you're paying for a lot of idle capacity, or you under-provision and then you have, you know...
wh- when you, you run into capacity shortage, like, you have degr- degradation in service. And so... whereas with Modal, we can handle these, like, very bursty, very unpredictable workloads really well, 'cause we basically, like, take all these user workloads and just run a big pool of thousands of GPUs across many different customers.
- EGElad Gil
Yeah. Uh, one of the things that always struck me about training is, to your point, you kind of spin up a giant cluster, you run a huge supercomputer, right? And then you run it for months in some cases, and then your output is a file.
- EBErik Bernhardsson
(laughs) Yeah.
- EGElad Gil
And that's literally what you've (laughs) generated, you know? It's kind of insane if you think about it.
- EBErik Bernhardsson
Yeah.
- EGElad Gil
Um, and that file in some sense is a representation (laughs) of the entire internet or s- some corpus of human knowledge or whatever. And then to your point, with inference you need a bit more flexibility in terms of spinning things up and down, or alternatively, if you're doing shorter training runs or certain aspects of post-training, you may need more flexible capacity to deal with.
- EBErik Bernhardsson
Totally. And, and that's something we're really interested in right now. Like traditionally, most of Modal has always been inference, like that's been our main use case, but we're really interested also in training. So... and, and in particular, like probably focus more on these like shorter, like very bursty sort of-
- EGElad Gil
Mm-hmm.
- EBErik Bernhardsson
... experimental training runs. Not the like very big training runs, 'cause I think that's a very different market.
- EGElad Gil
Mm-hmm.
- EBErik Bernhardsson
Uh, so that's like a very interesting-
- 7:14 – 9:00
Modal's end-to-end vision
- EGElad Gil
How do you think about...
- EBErik Bernhardsson
... thing we're, we're after. Sorry.
- EGElad Gil
H- how do you think about meeting people's end-to-end needs? So I know that there's a lot of other things that people do. There's... you know, a lot of people are using RAG, uh, to basically, um, augment what they're doing or, uh, you know, there's a variety of different things that people are now doing at time of inference in terms of using compute to, um, you know, take different approaches there. You know, uh, I'm a little bit curious how you think about the end-to-end stack of things that could be provided as infrastructure and where Modal focuses or wants to focus.
- EBErik Bernhardsson
Yeah, totally. I, I mean, our goal has always been to build a platform and, and cover like the end-to-end use case. It just turned out that inference was... we were well-positioned to, to focus on that as our first killer app. But, but my, my end goal has always been to, to make engineers more productive and, and focus on what I think of as like the high-code side of ML. Like, we're... I think we're... like, our target audience tends to be more like sort of traditional, like ML engineers, like people building their own models, but there's many different aspects of that. There's like the data pre-processing, then there's the training, and then there's the inference, and then there's actually probably like even more things, right? Like, you know, uh, having feedback loops where you gather data and like, you know, online ranking models and all these things. And so th- my goal for Modal has always been to cover all of that stuff. And, and so it, it's interesting, you see a lot of customers now, uh, we don't have a training product, but a lot of customers use Modal for batch pre-processing. So they, they use Modal to, you know, uh, maybe they're training a video model, so maybe they have like petabytes of video. So then they use Modal actually, maybe with GPUs even, to like do feature extraction, and then they train it elsewhere, and then they come back to Modal for the inference. Uh, so for, so for us to do training makes a lot of sense. And in general, I, I think there's a lot of... you know, it makes a lot of sense to sort of build a platform where you can handle the entire sort of machine learning life cycle end-to-end, and, and many other things, uh, related to that. Also the data pipelines and nightly batch jobs and all these
- 9:00 – 10:20
Differentiating amongst competition
- EBErik Bernhardsson
things.
- EGElad Gil
Yeah. I mean, what you described is a pretty broad platform-based approach. Um, I think there's a handful of companies who are s- sort of in your general space or market. How do you feel that Modal differentiates from them?
- EBErik Bernhardsson
I, I think, first of all, we, we're cloud native. Like, we're just, like, cloud maximalists. Like, we went all in and decided, like, basically we're gonna build a multi-tenant platform that runs everyone's compute in the sa-... And, and the benefits of that are, like, very tremendous, 'cause, like, we could just do capacity management much better. Uh, a- and that's one of the ways we can offer, like, instantaneous access to hundreds of GPUs if you need to. Like, you can do these, like, very bursty things and we just give you lots of GPUs, right? I, I think the other benefit or the other sort of differentiation is we're very general purpose. Uh, we focus on sort of what I think of as, as I mentioned, like, high code. Like, we run custom code in our containers, in our infrastructure, w- which is a harder problem. Like, containerization and running user code in a safe way is a hard problem. And, and then dealing with container cold start, and, like I mentioned, we had to build our own scheduler, we had to build our own container runtime and our own file system to boot containers very quickly. Uh, a- and I think so... unlike many other vendors, like, they're only focused on, say, inference or maybe only LLMs, uh, our, our approach has always been to build a very general purpose platform, and, and, and sort of, you know... I- in the long run, I, I hope to sort of... that, that, that sort of manifestation will be more clear, 'cause I think there's many other products we can build on top of this now that we have the compute layer sort of,
- 10:20 – 12:35
Cloud vs on-premise
- EBErik Bernhardsson
kind of becoming more and more mature.
- EGElad Gil
When I talk to large enterprises about how they're thinking about adoption of AI, many of them already have their data on Azure or GCP or AWS, they're running their application on it. They've bought credits in the marketplace, they wanna spend resident, they've already gone through security reviews, you know, they've, they've kind of done a lot and they worry about things like, um, latency or pings out to other third-party services versus just running on their own existing cloud provider or their hyperscaler that they work with, or set of hyperscalers, you know, many of them actually, uh, work across multiple. How do you think about that in the context of Modal in terms of your own compute versus hyperscalers versus, you know, the ability to run anywhere?
- EBErik Bernhardsson
Yeah, totally. And, and, and of course there's also a sort of security compliance aspect of this. Like, I, I, I think, you know, it is a, it is a, you know, challenge. Uh, I, I look back at when the cloud came, and I remember back in, like, 2008, 2009, my first reaction was like, "How the hell..." Like, "Why, why would anyone put their compute in someone else's computer (laughs) and, like, run that?" And, and I think, you know, to me that was just, like, insane, like, why would anyone do that? But o- over the next couple years I realized, like, actually it kind of makes a lot of sense. And, and I think now even, like, among, like, enterprise companies, like, there's a sort of recognition that, like, yeah, actually probably our compute is more safe in the big hyperscalers. And in a similar, similar vein, I remember talking to Snowflake back in, say, 2012 or something like that, and, and they had a sort of similar approach where, like, they basically said, like, "We're gonna run databases in the cloud, and it's not gonna be in your envi- you know, or, or maybe in your environment, but, like, we're an infrastructure as a service." And I, I thought that was nuts. And then obviously, like, I think Snow- Snowflake now is a very large, you know, publicly traded company. I think they showed that, like, infrastructure as a service makes a lot of sense. And so I, I, I think there is a little bit of resistance to, to adopting this, like, multi-tenant model. I th- but I think, you know, when you look at, like, security and adoption of cloud, I, I think we have a lot of tailwinds blowing in our direction. I think security is moving away from sort of a network layer into... into an application layer. Uh, I, I think bandwidth costs are coming down. I think there's a lot of tricks you can do to, to minimize bandwidth transfer costs. Uh, you can store data in, like, R2 for instance, which has zero egress fees.
It's something that I think is realistically gonna, you know, mean we're gonna have to push a lot to... But I think there's so many benefits of this multi-tenant model in terms of capacity management that, to me, it is very clearly,
- 12:35 – 13:20
Popular AI models
- EBErik Bernhardsson
like, a big part of the future of AI, is, like, running a big pool of compute and slicing it very dynamically.
- EGElad Gil
Mm-hmm. You, you mentioned earlier that, um, one of the things that really caused early adoption of Modal was, um, Stable Diffusion and sort of these, these open source models around image gen. Are, are there any open source projects or models that you're seeing be, um, v- very popular in recent days or in the last couple of months that have really started taking off?
- EBErik Bernhardsson
That's a good question. I, I think if anything has actually been a little bit of a shift towards more, like, proprietary models. Um, but, but, like, proprietary open source models, I guess, so. Like Flux, I think most recently has been, uh, you know, a, a, a model that's, uh, getting a lot of attention. I'm personally very interested in, like, audio. I think audio is, like, very underexplored. Uh, I think there's a lot of opportunity
- 13:20 – 14:55
Gaps in AI infrastructure
- EBErik Bernhardsson
for open source models i- in that space. Uh, but I, I don't think we've seen anything really cool yet.
- EGElad Gil
Mm-hmm. What else do you think is missing in the world today in terms of AI infrastructure or infrastructure as a service? So...
- EBErik Bernhardsson
I'm, uh, very biased, but I think Modal is missing (laughs) . Like, basically a way to, like... for, for engineers to, to take code and, and run it. And, and look, I, I'm very bullish on, like, you know, code and, like, people wanting to write code and, and building stuff themselves. I, I think outside of sort of LLM space wh- which is, like, a very kind of a d- different world, in my opinion, I think there's always gonna be a lot of applications where people wanna train their own models, they wanna run their own models or, or at least, like, run other models but have, like, very custom workflows. Uh, and it... and I just don't think there's been a great way to do that. It's, like, pretty painful to do that. And, and so I, I think that's pretty exciting. I, I think on the storage side, there's some other really exciting stuff. Like, we, we haven't really touched storage at Modal. Like, we, we focus very much on compute. So I, I'm personally very interested in sort of vector database, like, how's that gonna evol- evolve? I th- I don't think anyone really knows. Um, I'm pretty interested in, like, you know, more efficient storage around training data. I'm also very interested in, like... I, I guess another thing I'm, I'm, I'm very fascinated by right now is, um, uh, training workloads. Uh, in order to, to train large models efficiently, you have to really spend a lot of money and, and time setting up the networking. So one of the things I'm really excited about is what if you don't, you know... what if we can make training less bandwidth hungry? 'Cause I think that would actually change a lot of the infrastructure around training, um, where you, you can now, like, kind of tie together a lot of GPUs in different data centers, uh, and, um, and, and not have to, you know, have this, like, very large data centers with, like, you know, InfiniBand
- 14:55 – 16:48
Insights on vector databases
- EBErik Bernhardsson
and stuff. So th- that's, like, another sort of infrastructure thing I, I'm, I'm looking forward to seeing more development on.
- EGElad Gil
How important... Um, so the- there's, there's sometimes been a little bit of debate around vector DBs and you mentioned that you, uh, that you actually built one when you were at Spotify. And I think Spotify today hit $100 billion in market cap. I think it's one of the first European technology-
- EBErik Bernhardsson
Yeah.
- EGElad Gil
... companies to get there, um, which is pretty cool.
- EBErik Bernhardsson
Yeah.
- EGElad Gil
So a lot of, uh, folks I know, um, may use one of the existing vector DBs or in some cases are just using PostgreSQL with, um, with pgvector, right? How do you think about the need for vector databases as sort of standalone pieces of infrastructure versus just, you know, adopting PostgreSQL versus doing something else?
- EBErik Bernhardsson
Yeah, I feel like everyone's debating that. I, I don't know necessarily. Like, I, I think there's a lot of... there's a case to be made that, you know, you can just stick everything into relational database and, and you're f- you're fine. To me, like, the, the bigger question is, like, in the long run, like, you know, if you think about, like, what's, like, an AI native data storage solution? Like, I don't even know if it's, like, necessarily it has the same form factors and the same interface as, as a database. So, eh, that's actually a bigger question that I'm more excited about is, like, I, I think people look at, like, vector databases and, like, you know, wh- whether it's relational or not, they sort of shoehorn it into, into this, like, you know, sort of old school model of, like, you, you put data, you get data back. But I don't know, I, I think there's, like, eh, a lot of room to sort of rethink that in the age of AI and have very different, like, you know, interaction models with that data. I know that sounds-
- EGElad Gil
Mm-hmm.
- EBErik Bernhardsson
... a little fluffy.
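[Editor's note: for readers weighing the "standalone vector DB vs. PostgreSQL with pgvector" question above, the core operation either option provides is k-nearest-neighbor search over embedding vectors, commonly by cosine distance. A brute-force pure-Python sketch of that operation follows; real systems use approximate indexes rather than scanning every row (Annoy uses random-projection trees; pgvector offers IVFFlat and HNSW indexes). The `docs` vectors are made up for illustration.]

```python
import math

def cosine_distance(a, b):
    """Cosine distance = 1 - cosine similarity; smaller means closer."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

def nearest(query, vectors, k=1):
    """Brute-force k-nearest-neighbor search by cosine distance."""
    ranked = sorted(vectors.items(), key=lambda kv: cosine_distance(query, kv[1]))
    return [name for name, _ in ranked[:k]]

# Toy 3-dimensional "embeddings" (real ones have hundreds of dimensions).
docs = {
    "song_a": [1.0, 0.0, 0.5],
    "song_b": [0.0, 1.0, 0.9],
    "song_c": [0.9, 0.1, 0.0],
}
print(nearest([1.0, 0.0, 0.0], docs, k=2))  # → ['song_c', 'song_a']
```

In pgvector the same query is a one-liner (`ORDER BY embedding <=> $1 LIMIT k`); the debate is less about this operation and more about where it should live.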
- EGElad Gil
Yeah, it's super interesting. Could you say more on that?
- EBErik Bernhardsson
I mean, like, o- one thing I, I think a lot about is, like, maybe the database itself be, like, the embedding engine, right? Like, instead of, like, you put a vector in and you, you, you know, you search by that vector, I, I think there's a lot of, you know... it... the more native, like, AI native storage solution would be you put text in, you put, you know, video in (laughs) , you put image in, and then you can search by that. Like, eh, to me that would be, like, a more sort of native, AI native sort of storage solution. So that's, like, one line of thought that I've had is, like, maybe we just... we're just, like, so early to this that, like,
- 16:48 – 17:47
Training models vs off-the-shelf models
- EBErik Bernhardsson
I, I think it's gonna take five, 10 years for it to really... for it, for it to shape out.
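[Editor's note: Erik's "database as embedding engine" idea — you put raw text or media in and query by raw text, with vectors never exposed to the caller — can be sketched as a toy. The `embed` function here is a deliberately crude stand-in (character-bigram counts); a real system would invoke a learned embedding model. All class and function names are hypothetical.]

```python
import math
from collections import Counter

def embed(text):
    """Hypothetical stand-in for a learned embedding model:
    a sparse character-bigram count vector."""
    return Counter(text[i:i + 2] for i in range(len(text) - 1))

def similarity(a, b):
    # Cosine similarity over sparse count vectors (higher = more alike).
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class EmbeddingStore:
    """Toy 'AI-native' store: embedding happens inside the store,
    so callers insert and search with raw text only."""
    def __init__(self):
        self._rows = {}

    def put(self, key, text):
        self._rows[key] = (text, embed(text))

    def search(self, query, k=1):
        qv = embed(query)
        ranked = sorted(self._rows.items(),
                        key=lambda kv: similarity(qv, kv[1][1]),
                        reverse=True)
        return [key for key, _ in ranked[:k]]

store = EmbeddingStore()
store.put("doc1", "serverless gpu inference")
store.put("doc2", "music recommendation systems")
print(store.search("gpu serverless"))  # → ['doc1']
```

The contrast with a conventional vector database is the interface: no vector ever crosses the API boundary, which is one concrete reading of "the database itself is the embedding engine."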
- EGElad Gil
Yeah, that's really cool. I, I guess one other thing that you mentioned was more people seem to be training their own models, at least in a lot of the areas that, that Modal works with. Um, w- do you think there's any heuristic that people should follow in terms of when to train their own model versus use something off the shelf?
- EBErik Bernhardsson
I think eventually, like, for any company where model quality really matters, unless you kinda train your own model in the end, like, I feel like it's gonna be hard to sort of defend the fact that, like, you know, you have a better solution. 'Cause, like, otherwise, like, what's your moat? Like, if you don't have your own model, like, you need to find some... a moat somewhere else in the stack. And, and that might be possible to find. Uh, it might be somewhere else for a lot of companies. But I think at least if you have your own model and that model clearly is better than anyone else's, uh, then that sort of inherently is a moat in itself. I think it's more clear in the... in... outside of the LLM space when, when people are building audio, video, image models. Uh, I think if, if that is your core
- 17:47 – 22:14
AI’s impact on coding and physics
- EBErik Bernhardsson
focus, like, v- it's very clear to me, like, you kinda have to train your own models in that case.
- EGElad Gil
Mm-hmm. Yeah. Um, I... if I remember correctly, you were... you're an IOI, uh, gold medalist.
- EBErik Bernhardsson
Yeah, that's right. (laughs)
- EGElad Gil
And obviously you think a lot about code and coding. And how do you think that changes with AI over time? Or do you have any contrarian predictions on, on what happens there?
- EBErik Bernhardsson
I don't know if this is contrarian, but, like, I actually think that, like, you know, this is just, like, one out of many improvements in developer productivity. And, you know, you look back at, like, you know... whatever, like, compilers (laughs) was, was originally, like, you know, a tool that made developers more productive, and then, like, higher-level programming languages and databases and cloud and all these things. And so, like, I actually don't know if, like, AI is, like, you know, different than any of those changes in hindsight. And, and so, and, and, and by the way, like, every time that's happened, you know, it, it turns out, like, there's so much latent demand for software that actually, like, the number of software engineers goes up. So, like, I, I feel like you look back at, like, you know, the last, like, 40 years of software development, like, every decade, engineers get, like, 10 times more productive due to better frameworks or better, you know, tooling or whatever. And, and it turns out, actually, that just unlocks more latent demand for software engineers. So, I, I'm very bullish on software engineers. I, I think it would take a lot to, sort of, destroy that demand. Uh, I think people look at a lot of, like, AI as, like, a kind of fix something, but, but i- in my opinion it's like, no, it's good, it's just gonna unlock more latent demand for more things. So, I, I'm very bullish on software engineering.
- EGElad Gil
Mm-hmm. And then, I guess the other field that you touched a long time ago was, um, I think you won a Swedish physics competition in high school.
- EBErik Bernhardsson
(laughs) Yeah, yeah.
- EGElad Gil
And I'm curious if you followed any of the physics-based AI models or some of the simulation re- Like, that's an area that strikes me as very interesting-
- EBErik Bernhardsson
Uh, I, I-
- EGElad Gil
... and the way you think about the models for it are different and... Yeah.
- EBErik Bernhardsson
I, I, uh, I did win, uh, the, the Swedish high school physics competition. I was a total mathlete nerd when I was, uh, you know, in my teenage years.
- EGElad Gil
Mm-hmm. Okay. Yeah, I think it's a really fascinating area right now. Like, it's one of those areas that seems r- um, like there's some real reinvention needed and not as many people working on it, so it's one of the areas I'm kind of excited about, just in terms of there's, there's lots and lots of different applications that you can start to come up with relative to it.
- EBErik Bernhardsson
Yeah. I think it's, I mean, like, physics, uh, in my opinion, it's like, es- you know, look back at, like, the golden era of physics, like the '20s and '30s and '40s, I, I kind of feel like it's like hasn't really evolved much, the field. So, I don't know, may- I, I would love for you to be right, that there's, like, a resurgence of, you know, new physics-based models.
- EGElad Gil
Yeah, I don't know if it would necessarily help in the short run with basic research. I think it just helps with simulation. It kind of feels like physics as a field, um, really kind of doubled down on, sort of, the Ed Witten path of physics and maybe got a little bit lost there or something. I'm not sure. It's kind of beginning-
- EBErik Bernhardsson
Are you talking about more, like, material... Like, doing more, like, compute-based methods for physics?
- EGElad Gil
It's kind of like Ansys or other companies where, you know, you simulate an airplane wing, you simulate load-bearing in a-
- EBErik Bernhardsson
Oh, I see.
- EGElad Gil
... in a-
- EBErik Bernhardsson
So, like, HPC.
- EGElad Gil
Correct, yeah.
- EBErik Bernhardsson
I mean, that's always existed, right? Like, especially in, like, uh, you know, uh, oil and, and gas engineering-
- EGElad Gil
Yeah, exactly, yeah.
- EBErik Bernhardsson
... stuff like that, and-
- EGElad Gil
But it's a lot of kind of small, bespoke, kind of, fine-tuned or hand-tuned models for specific things versus-
- EBErik Bernhardsson
Yeah.
- EGElad Gil
... you know.
- EBErik Bernhardsson
I mean, meteorology is, like, something I actually think, like, deep learning should, like, change, right?
- EGElad Gil
Mm-hmm.
- EBErik Bernhardsson
Like, it sort of makes a lot of sense. Like, you're, you know, uh, uh, deep learning should be very good at, like, predicting, you know, turbulence and things like that. Like, th- 'cause those, turbulence-
- EGElad Gil
Yeah, totally.
- 22:14 – 23:36
AI's impact on music
- EBErik Bernhardsson
uh, which is kind of cool.
- EGElad Gil
It's really cool. Is there, um, any area that you're most excited about from a human impact perspective for some of these models?
- EBErik Bernhardsson
You know, with my background at Spotify, like, I, I, I think Suno is, like, to me, a very exciting thing. Uh, I, I think it's still, like, very early, sort of, AI-generated music. You can still f- hear that it's, like, not right. It's a sort of uncanny valley a little bit. Uh, but, but, like, Suno is, like... every generation of their model is, like, getting better and better. And first of all, like, music in itself tends to be, like, sort of always, like, one of the first areas where you see real impact of new technologies, whether, you know, Spotify or, like, iTunes or piracy or, like, all these things, or gramophones going back, right? Uh, so, I, I always think of music as, like, an exciting area for that, in that sense. Like, it always shows, like, the, the opportunity of new technologies. Um, and, and I also think, like, Suno's, like, fundamentally something you couldn't have done before gen AI. So, that, to me, is, like, really exciting. It's, like, sort of really pushing the frontier, enabling a completely new product. There's, like, there's no way Suno could have existed five years ago.
- EGElad Gil
Mm-hmm. That's cool. Well, uh, I think we covered a lot today. Thanks so much for joining me.
- EBErik Bernhardsson
Yeah, thanks a lot. It was great.
- SGSarah Guo
(instrumental music) Find us on Twitter @nopriorspod. Subscribe to our YouTube channel if you wanna see our faces. Follow the show on Apple Podcasts, Spotify, or wherever you listen. That way, you get a new episode every week. And sign up for emails or find transcripts for every episode at no-priors.com.
Episode duration: 23:36
Transcript of episode rOoarRoowi8