a16z

The Chip That Could Unlock AGI.

Naveen Rao is cofounder and CEO of Unconventional AI, an AI chip startup building analog computing systems designed specifically for intelligence. Previously, Naveen led AI at Databricks and founded two successful companies: Mosaic (cloud computing) and Nervana (AI accelerators, acquired by Intel). In this episode, a16z's Matt Bornstein sits down with Naveen at NeurIPS to discuss why 80 years of digital computing may be the wrong substrate for AI, how the brain runs on 20 watts while data centers consume 4% of the US energy grid, the physics of causality and what it might mean for AGI, and why now is the moment to take this unconventional bet.

Timecodes:
00:00 - Trailer
00:56 - Exploring hardware for running AI workloads
02:02 - Why Naveen built lots of software in a "hardware company"
03:22 - Why start a new chip company?
05:13 - How computing systems went digital
09:26 - Why intelligence is a good fit for analog computing systems
12:30 - What tradeoffs Naveen faced in pursuing his own path
15:23 - The data modalities Unconventional chips will be best for
16:54 - Does this get us closer to AGI?
21:00 - Where Naveen gets his excitement and motivation
22:37 - What makes Naveen confident that Unconventional will work
24:43 - Unconventional's hiring priorities
26:27 - Career advice for young people
28:19 - What Naveen has done best in his companies

Resources:
Follow Naveen on X: https://twitter.com/NaveenGRao
Follow Matt on X: https://twitter.com/BornsteinMatt

Stay Updated:
Follow a16z on X: https://twitter.com/a16z
Follow a16z on LinkedIn: https://www.linkedin.com/company/a16z
Follow the a16z Podcast on Spotify: https://open.spotify.com/show/5bC65RDvs3oxnLyqqvkUYX
Follow the a16z Podcast on Apple Podcasts: https://podcasts.apple.com/us/podcast/a16z-podcast/id842818711

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details, please see http://a16z.com/disclosures.

Matt Bornstein, host
Dec 7, 2025 · 30m · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Analog-dynamics chips aim to cut AI energy and approach AGI

  1. Unconventional AI is framed less as a “chip company” and more as a first-principles effort to understand how learning and intelligence could be implemented directly in physical systems.
  2. Rao argues the dominant digital paradigm (precise, deterministic arithmetic) is mismatched to AI’s stochastic, distributed nature, and that analog/dynamical substrates could compute certain AI workloads more efficiently by leveraging physics.
  3. The main forcing function is energy: AI data centers are becoming grid-scale loads, making efficiency—not just faster chips—the limiting constraint on AI’s growth and ubiquity.
  4. Unconventional expects early fit for models with explicit dynamics (diffusion/flow/energy-based models) while still acknowledging transformers’ practical success and the need to bridge between parameterizations.
  5. Rao suggests time-evolving dynamical systems may better capture causality—one ingredient he believes is missing from today’s AI—potentially moving systems closer to “AGI,” while emphasizing the claim is still speculative.

IDEAS WORTH REMEMBERING

5 ideas

Energy, not compute, is becoming the binding constraint for AI scaling.

Rao points to data centers consuming a meaningful share of the grid and forecasts major new generation needs (hundreds of gigawatts) to meet AI demand, arguing efficiency breakthroughs are required alongside new power infrastructure.

Analog computing’s advantage is using physics directly, not simulating physics with numbers.

Where digital systems represent quantities via finite-bit numerics, analog approaches can embody the quantity in a physical medium, potentially yielding large efficiency gains for the right classes of problems.
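The "embody the quantity in a physical medium" point can be made concrete with the textbook analog example of a resistive crossbar (my illustration, not a description of Unconventional's design): Ohm's law performs each multiply (I = G · V) and Kirchhoff's current law performs the add (currents summing on a shared wire), so a dot product happens as physics rather than as digit-by-digit arithmetic. A minimal numerical model of one crossbar column:

```python
# Hedged sketch of the standard analog multiply-accumulate idea.
# Inputs are encoded as voltages, weights as device conductances;
# the physics of the circuit "computes" the dot product for free.

def analog_dot_product(voltages, conductances):
    """Model the total current on one output wire of a crossbar column."""
    # Each device contributes I_i = G_i * V_i (Ohm's law);
    # the shared wire sums the currents (Kirchhoff's current law).
    return sum(g * v for g, v in zip(conductances, voltages))

v = [0.5, 1.0, 0.25]  # input activations as voltages (arbitrary units)
g = [2.0, 0.5, 4.0]   # weights as conductances (arbitrary units)
print(analog_dot_product(v, g))  # 2.5
```

The efficiency argument is that the digital version spends energy moving and combining bits for every partial product, while the analog version gets the sum as a side effect of charge conservation; the tradeoff is the device variability and noise discussed in the next idea.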

Digital won historically because it scaled reliably; analog struggled with variability.

Rao explains early analog machines were efficient but hard to scale due to manufacturing variability, while digital abstraction (high/low states) enabled reliable scaling—an echo of today’s “scaling up GPUs” challenges.

Intelligence may be better served by stochastic, dynamical substrates than deterministic arithmetic.

Because neural networks and brains operate as noisy, distributed dynamical systems, Rao argues it’s plausible to find an electrical-circuit “isomorphism” that implements learning/inference more naturally than matrix-multiply-centric architectures.

Models with explicit dynamics (diffusion/flow/energy-based) are a prime target for new substrates.

He highlights diffusion/flow/energy-based models as naturally expressed via differential equations, making them candidates for mapping onto time-evolving physical systems for efficient computation.
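The differential-equation framing can be illustrated with a toy energy-based model (my example, not anything shown in the episode): Langevin dynamics samples from an energy landscape by literally evolving a noisy dynamical system, dx = -∇E(x) dt + √2 dW. A process of this form is the kind of computation that could in principle map onto a noisy physical substrate rather than being integrated step by step digitally:

```python
# Hedged illustration: sampling an energy-based model by running a
# stochastic dynamical system (Langevin dynamics) forward in time.
import math
import random

def grad_energy(x, mu=3.0):
    # Quadratic energy E(x) = (x - mu)^2 / 2, so grad E(x) = x - mu.
    return x - mu

def langevin_sample(steps=5000, dt=0.01, seed=0):
    """Integrate dx = -grad E(x) dt + sqrt(2) dW with Euler-Maruyama."""
    rng = random.Random(seed)
    x = 0.0
    for _ in range(steps):
        noise = rng.gauss(0.0, 1.0)
        x += -grad_energy(x) * dt + math.sqrt(2 * dt) * noise
    return x

# Long-run samples concentrate around the energy minimum at mu = 3.
samples = [langevin_sample(seed=s) for s in range(200)]
print(sum(samples) / len(samples))  # close to 3
```

The point of the sketch is that inference here is just time evolution plus noise, which is why models parameterized as dynamics (diffusion, flow, energy-based) are a more natural fit for a physical substrate than architectures built around large deterministic matrix multiplies.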

WORDS WORTH SAVING

5 quotes

Most of what we're doing, at the beginning, is theory, really looking at first principles of how learning works in a physical system.

Naveen Rao

We've been building largely the same kind of computer for 80 years. We went digital back in the 1940s.

Naveen Rao

Intelligence is the physics. They're one and the same. There's no OS and some sort of API and this and that.

Naveen Rao

I'm the opposite of an AI doomer. I think AI is the next evolution of humanity. I think it takes us to a new level, allows us to collaborate, understand each other, and understand the world in much deeper ways.

Naveen Rao

If we are successful here, the world will not forget this for a very long time, right? This will be written in history books.

Naveen Rao

Digital vs. analog computing and why digital won historically
Physics-as-computation and "isomorphisms" to neural networks
Energy constraints and grid-scale AI infrastructure limits
Dynamical systems, time, and causality as foundations for intelligence
Model fit: transformers vs. diffusion/flow/energy-based models
Manufacturability and scaling to millions of devices
Team-building across theory, architecture, and mixed-signal design

High quality AI-generated summary created from speaker-labeled transcript.
