No Priors

No Priors Ep. 29 | With Inceptive CEO Jakob Uszkoreit

"Biological Software" is the future of medicine. Jakob Uszkoreit, CEO and Co-founder of Inceptive, joins Sarah Guo and Elad Gil this week on No Priors, to discuss how deep learning is expanding the horizons of RNA and mRNA therapeutics. Jakob co-authored the revolutionary paper Attention is All You Need while at Google, and led early Google Translate and Google Assistant teams. Now at Inceptive, he's applying these same architectures and ideas to biological design, optimizing vaccine production, and magnitude-more efficient drug discovery. We also discuss Jakob's perspective on promising research directions, and his point of view that model architectures will actually get simpler from here, and be driven by hardware. 00:00 - Creating Biological Software 06:54 - The Hardware Drivers of Large-Scale Transformers 14:32 - Challenges in Optimizing Compute Allocation 23:25 - Deep Learning in Biology and RNA 32:49 - The Future of Drug Discovery 41:41 - Collaboration and Innovation at Inceptive

Elad Gil (host) · Jakob Uszkoreit (guest) · Sarah Guo (host)
Aug 23, 2023 · 35 min · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Transformer pioneer builds 'biological software' to reprogram life with RNA

Jakob Uszkoreit, co-author of the Transformer paper and CEO of Inceptive, discusses the origins of the attention-based architecture and why its success is tightly coupled to modern accelerator hardware and community optimism. He argues that future AI progress must tackle elastic compute: models that dynamically adjust computation to problem difficulty and input complexity. Uszkoreit then outlines Inceptive's vision of treating RNA as biological bytecode and medicines as compilable programs, using large-scale deep learning and custom assays instead of full mechanistic biological understanding. He suggests this black-box, end-to-end approach could dramatically expand the reach, scalability, and sophistication of medicines, especially mRNA-based therapeutics and vaccines.

IDEAS WORTH REMEMBERING

5 ideas

Architectures must be tightly matched to hardware to unlock breakthroughs.

The Transformer’s success came not only from the attention idea but from implementations that perfectly fit GPU accelerators, enabling massive parallelism and practical scaling compared to more sequential architectures.
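
To make the parallelism point concrete, here is a minimal NumPy sketch (an illustration of the general argument, not code from the episode) contrasting a self-attention layer, which reduces to a few large matrix multiplications, with a recurrent layer whose steps must run one after another:

```python
# Why attention maps well to matrix-multiply accelerators, in miniature.
import numpy as np

T, d = 128, 64                        # sequence length, model width
rng = np.random.default_rng(0)
x = rng.standard_normal((T, d))       # token representations
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

# Self-attention: all positions computed in a few big matmuls that
# GPUs/TPUs execute with high utilization.
Q, K, V = x @ Wq, x @ Wk, x @ Wv
scores = Q @ K.T / np.sqrt(d)                           # (T, T)
w = np.exp(scores - scores.max(axis=-1, keepdims=True))
w /= w.sum(axis=-1, keepdims=True)                      # row-wise softmax
attn_out = w @ V                      # (T, d), no serial dependency

# Recurrent layer: each step needs the previous hidden state, forcing
# T small, strictly sequential matmuls.
Wh, Wx = rng.standard_normal((d, d)), rng.standard_normal((d, d))
h = np.zeros(d)
for t in range(T):                    # inherently serial
    h = np.tanh(h @ Wh + x[t] @ Wx)
```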

Current models waste compute by not adapting effort to problem difficulty.

Today’s LLMs use computation roughly proportional to prompt and output length, not task hardness, leading to over-spending on trivial queries and under-spending on succinct but computationally hard problems.
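
A rough way to see the point: a decoder's forward-pass cost is commonly approximated as about 2 FLOPs per parameter per token, so spend tracks token count, not difficulty. A back-of-envelope sketch (the 70B parameter count and example prompts are hypothetical):

```python
# Forward-pass compute scales with tokens, not with how hard the task is.
N_PARAMS = 70e9                       # hypothetical 70B-parameter model

def forward_flops(n_tokens: int, n_params: float = N_PARAMS) -> float:
    """Common rough estimate: ~2 FLOPs per parameter per token."""
    return 2 * n_params * n_tokens

easy = forward_flops(12)              # e.g. "What is two plus two?"
hard = forward_flops(12)              # e.g. a succinct but hard puzzle
print(f"easy: {easy:.2e} FLOPs  hard: {hard:.2e} FLOPs")  # identical
```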

Training on generated data can be valuable by amortizing prior compute.

Although synthetic data doesn’t add Shannon information, it can reuse past computational work; retraining on generated outputs effectively concentrates more compute on similar problems over time.
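
A toy version of that amortization loop (every helper below is a hypothetical stand-in, not anyone's real pipeline): spend heavy search compute once per problem, keep only verified outputs, and retrain so a later single forward pass reproduces what the search found:

```python
# Toy compute-amortization loop with stand-in components.
import random

def sample(model, problem):           # stand-in for expensive sampling
    return f"{problem} -> {random.randint(0, 9)}"

def verify(problem, candidate):       # stand-in for a checker or assay
    return candidate.endswith("7")

def fine_tune(model, data):           # stand-in for a training step
    model["memorized"].update(dict(data))

model = {"memorized": {}}
distilled = []
for p in [f"task-{i}" for i in range(5)]:
    candidates = [sample(model, p) for _ in range(64)]  # test-time search
    good = [c for c in candidates if verify(p, c)]
    if good:
        distilled.append((p, good[0]))                  # verified outputs only

# No new Shannon information enters here, but the 64-sample search cost
# is now baked into the model (in this toy, a lookup table).
fine_tune(model, distilled)
print(model["memorized"])
```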

Elasticity in multimodal models is an underexploited efficiency frontier.

Models currently scale compute with input size (e.g., video length or resolution) rather than what is actually needed for the downstream task; more flexible architectures could adjust computation to information density and task complexity.
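
One simple shape such flexibility could take is depth-adaptive early exit: a per-layer halting score decides how many layers an input actually needs. A toy sketch (thresholds, shapes, and the confidence head are illustrative assumptions, not a production design):

```python
# Elastic compute in miniature: exit early once a confidence proxy is high.
import numpy as np

rng = np.random.default_rng(0)
d, n_layers = 32, 12
layers = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(n_layers)]
conf_head = rng.standard_normal(d) / np.sqrt(d)

def forward_elastic(x: np.ndarray, threshold: float = 0.9):
    """Run layers until the halting score clears the threshold."""
    h = x
    for depth, W in enumerate(layers, start=1):
        h = np.tanh(h @ W)
        conf = 1.0 / (1.0 + np.exp(-(h @ conf_head)))   # sigmoid score
        if conf > threshold:          # easy input: stop refining early
            return h, depth
    return h, n_layers                # hard input: use full depth

_, used = forward_elastic(rng.standard_normal(d))
print(f"layers used: {used} of {n_layers}")
```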

Deep learning enables powerful biology without full mechanistic understanding.

Uszkoreit argues that, as with language and many historical drugs, we can design effective biological interventions using data-driven, black-box models rather than waiting for complete, predictive theories of all underlying mechanisms.

WORDS WORTH SAVING

5 quotes

At the end of the day, the one thing we know really works in deep learning is making it faster and more efficient on given hardware.

Jakob Uszkoreit

The big question is, does it matter that we may never test architectures that don’t fit today’s accelerators?

Jakob Uszkoreit

Right now there’s no knob for a model to say, ‘This problem is hard, I should use more compute,’ versus ‘This is two plus two.’

Jakob Uszkoreit

We think of RNA as the equivalent of bytecode and what we’re doing is compiling biological programs into RNA molecules.

Jakob Uszkoreit

Maybe the hope to fully understand biology is actually holding us back; the ground truth is simply whether a treatment does more good than harm.

Jakob Uszkoreit

TOPICS COVERED

Origins and design principles of the Transformer and attention mechanism
Hardware–software co-evolution and the limits of current accelerators for deep learning
Need for elastic, input-adaptive compute in modern AI models
Test-time search, depth-adaptive transformers, and amortizing compute
Deep learning as an alternative to mechanistic understanding in biology
Inceptive's vision of RNA as biological software and bytecode
Data generation, assays, and the 'wet–dry' integration at Inceptive

High-quality AI-generated summary created from a speaker-labeled transcript.
