Huberman Lab

How the Brain Works, Curing Blindness & How to Navigate a Career Path | Dr. E.J. Chichilnisky

In this episode, my guest is Dr. E.J. Chichilnisky, a professor of neurosurgery and ophthalmology at Stanford University. He studies how we see and uses that information to build artificial eyes that restore vision to the blind. We discuss how understanding the retina (the light-sensing brain tissue that lines the back of our eyes) is critical to knowing how our brain works more generally. We discuss brain augmentation with biologically informed prostheses, robotics, and AI and what this means for medicine and humanity. We also discuss E.J.'s unique journey into neuroscience and how changing fields multiple times, combined with some wandering, taught him how to guide his decision-making in all realms of life. This episode should interest anyone who wants to learn how the brain works from a world-class neuroscientist, anyone curious about the future of brain therapeutics, and anyone seeking inspiration and tools for navigating their own professional and life journey.

Thank you to our sponsors
AG1: https://drinkag1.com/huberman
Eight Sleep: https://eightsleep.com/huberman
ROKA: https://roka.com/huberman
BetterHelp: https://betterhelp.com/huberman
InsideTracker: https://insidetracker.com/huberman
Momentous: https://livemomentous.com/huberman

Social & Website
Instagram: https://www.instagram.com/hubermanlab
Threads: https://www.threads.net/@hubermanlab
Twitter: https://twitter.com/hubermanlab
Facebook: https://www.facebook.com/hubermanlab
TikTok: https://www.tiktok.com/@hubermanlab
LinkedIn: https://www.linkedin.com/in/andrew-huberman
Website: https://www.hubermanlab.com
Newsletter: https://www.hubermanlab.com/newsletter

Dr. E.J. Chichilnisky
Academic profile: https://stanford.io/3TdtdIg
Publications: https://stanford.io/4adV0iM
Lab website: https://stan.md/49UpMNL
Chichilnisky Lab Make a Gift: https://stan.md/4cmqSns
Lab media: https://stan.md/4cgmIgH
Stanford Artificial Retina Project: https://stan.md/3IGydAl
Stanford Artificial Retina Project Make a Gift: https://stan.md/3ThSt0h
LinkedIn: https://www.linkedin.com/in/e-j-chichilnisky-97857429
X: https://twitter.com/StanfordRetina

Articles & Other Resources
Donor Network West: https://www.donornetworkwest.org
Neuralink: https://neuralink.com
National Eye Institute: https://www.nei.nih.gov

Huberman Lab Episodes Mentioned
Dr. Erich Jarvis: The Neuroscience of Speech, Language & Music: https://www.hubermanlab.com/episode/dr-erich-jarvis-the-neuroscience-of-speech-language-and-music

People Mentioned
Krishna Shenoy: professor of engineering, Stanford: https://stanford.io/49Z9Rhw
Jaimie Henderson: professor of neurosurgery, Stanford: https://stanford.io/48Yl2Wb
Eddie Chang: professor of neurosurgery, UCSF: https://bit.ly/3SLsjmd
Eric Knudsen: professor of neurobiology, Stanford: https://stanford.io/48XgZcW
Robert G. Heath: psychiatrist, early brain stimulation research: https://bit.ly/3TAIaFP
Brian Wandell: professor of psychology, Stanford: https://stan.md/3TEgVtW
Markus Meister: professor of biology, Caltech: https://bit.ly/3x5iE2y

Timestamps
00:00:00 Dr. E.J. Chichilnisky
00:02:31 Sponsors: Eight Sleep, ROKA & BetterHelp
00:06:06 Vision & Brain; Retina
00:11:23 Retina & Visual Processing
00:18:37 Vision in Humans & Other Animals, Color
00:23:01 Studying the Human Retina
00:29:48 Sponsor: AG1
00:31:16 Cell Types
00:36:00 Determining Cell Function in Retina
00:43:39 Retinal Cell Types & Stimuli
00:49:27 Retinal Prostheses, Implants
01:00:25 Artificial Retina, Augmenting Vision
01:06:05 Sponsor: InsideTracker
01:07:12 Neuroengineering, Neuroaugmentation & Specificity
01:17:01 Building a Smart Device, AI
01:20:02 Neural Prosthesis, Paralysis; Specificity
01:25:21 Neurodegeneration; Adult Neuroplasticity; Implant Specificity
01:34:00 Career Journey, Music & Dance, Neuroscience
01:42:55 Self-Understanding, Coffee; Self-Love, Meditation & Yoga
01:47:50 Body Signals & Decisions; Beauty
01:57:49 Zero-Cost Support, Spotify & Apple Reviews, Sponsors, YouTube Feedback, Momentous, Social Media, Neural Network Newsletter

#HubermanLab #Neuroscience #EyeHealth
Title Card Photo Credit: Mike Blabac - https://www.blabacphoto.com
Disclaimer: https://www.hubermanlab.com/disclaimer

Andrew Huberman (host) and Dr. E.J. Chichilnisky (guest)
Mar 17, 2024 · 2h 0m · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Decoding Vision: Human Retinas, Neural Implants, and Purposeful Careers

  1. Andrew Huberman and neuroscientist E.J. Chichilnisky explore how the retina converts light into electrical codes that the brain uses to create visual experience, and why the retina is the best-understood and most accessible piece of the brain. They discuss decades of work using high‑density electrode arrays on donated human retinas to map distinct retinal ganglion cell types and their roles in encoding features like edges, motion, and color. Building on that science, Chichilnisky outlines a next‑generation ‘smart’ retinal implant that could not only restore meaningful vision to blind patients but also eventually augment normal human vision and guide broader brain–machine interfaces. Along the way, he describes his nonlinear path through multiple PhD programs, years spent dancing and playing music, and the inner compass—“ease”—that guided him toward impactful work at the intersection of neuroscience, engineering, and human purpose.

IDEAS WORTH REMEMBERING

5 ideas

The retina is the best current entry point for precision neuroengineering.

The retina is literally a piece of the brain pushed into the eye, with well‑characterized layers and roughly 20 distinct retinal ganglion cell types. Unlike most brain regions, it is accessible, can be kept alive ex vivo, and is functionally well mapped. That combination makes it uniquely suited as the first structure where we can realistically aim for cell‑type–specific, high‑fidelity neural prostheses that ‘speak the brain’s language’ rather than brute‑force stimulation.

Vision is encoded as multiple parallel ‘movies’ from different ganglion cell types, not a single image stream.

Photoreceptors detect local light (like pixels), but three main retinal layers transform this into about 20 parallel feature streams. One population emphasizes fine spatial detail, another motion, another color contrasts, and so on, each tiling the entire visual scene. The brain receives this 20‑channel orchestra of electrical patterns and integrates them into a unified percept. Any realistic visual prosthesis has to respect this multiplexed architecture instead of treating the eye as a flat pixel grid.
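As a loose computational analogy only (not the brain's actual code), the fan-out of one pixel frame into parallel feature "movies" can be sketched with center-surround-style filters at different spatial scales. The channel names, filter sizes, and two-channel count below are illustrative assumptions, standing in for the roughly 20 real ganglion cell populations:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur using plain NumPy 1-D convolution."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    # Blur rows, then columns (a separable 2-D Gaussian).
    img = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    img = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, img)
    return img

def parallel_streams(frame):
    """Split one 'pixel' frame into parallel feature channels, loosely
    analogous to distinct retinal ganglion cell populations.
    Each channel tiles (covers) the entire frame, as in the retina."""
    fine = frame - gaussian_blur(frame, 1.0)    # fine detail: small center-surround
    coarse = frame - gaussian_blur(frame, 4.0)  # coarse structure: large center-surround
    return {"fine_detail": fine, "coarse_detail": coarse}
```

A faithful model would also need ON/OFF polarity, motion and color channels, and realistic temporal dynamics; the point here is only that every channel is a full-field transform of the same input, sent in parallel.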

Current retinal implants work, but only crudely, because they ignore cell types and the natural code.

Existing epiretinal implants use simple electrode grids that treat the retina like a camera sensor. They can elicit blobs and streaks of light and help profoundly blind people detect doorways or bright windows, but they are far from restoring natural vision. They do not differentiate among retinal ganglion cell types or reproduce their precise spatiotemporal firing patterns, so the resulting neural input to the brain is more like noisy cacophony than a structured symphony.

A ‘smart’ retinal implant must record, learn, and then stimulate with cell‑type precision.

Chichilnisky’s proposed device works in three adaptive steps: (1) record retinal activity through a dense electrode array to detect individual cells and their electrical signatures, (2) stimulate and record to build a calibration map of which electrodes activate which cells and with what probabilities, and (3) use that map plus knowledge of the natural retinal code to drive specific cells in specific patterns that correspond to incoming visual scenes. This requires embedded computation and AI to continuously learn the local circuit, making the implant an adaptive, closed‑loop interface rather than a static stimulator.
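A toy, closed-loop sketch of steps (2) and (3) can make the calibration idea concrete. A simulated retina stands in for real recordings; the electrode counts, the probability model, and all function names are illustrative assumptions, not the actual device design:

```python
import numpy as np

rng = np.random.default_rng(0)
N_ELECTRODES, N_CELLS = 8, 5

# Hidden ground truth: probability that electrode e activates cell c.
# In a real device this is unknown physiology; here it simulates the retina.
TRUE_MAP = rng.uniform(0.0, 1.0, size=(N_ELECTRODES, N_CELLS))

def calibrate(trials=200):
    """Step 2: stimulate each electrode repeatedly, record which cells spike,
    and estimate an electrode -> cell activation-probability map."""
    counts = np.zeros((N_ELECTRODES, N_CELLS))
    for e in range(N_ELECTRODES):
        for _ in range(trials):
            counts[e] += rng.uniform(size=N_CELLS) < TRUE_MAP[e]  # observed spikes
    return counts / trials

def choose_electrode(cal_map, target_cell):
    """Step 3 (greatly simplified): pick the electrode most likely to drive
    the desired cell, a stand-in for full spatiotemporal pattern optimization."""
    return int(np.argmax(cal_map[:, target_cell]))
```

With enough trials the estimated map converges toward the hidden one, which is why the proposal treats calibration as an ongoing, learned process inside the implant rather than a one-time factory setting.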

The same platform for restoring sight could be used to augment human vision.

Once a device can control each retinal cell type independently, it becomes possible to route new kinds of information into vision—more spatial resolution, infrared sensitivity, parallel task‑specific channels (e.g., text vs. motion), or entirely novel feature sets. Chichilnisky notes that even in current crude prosthesis designs, engineers must actively suppress infrared light from cameras; this shows that augmentation is an almost unavoidable neighbor of restoration once electronics mediate perception. Systematic, gradual training leveraging adult brain plasticity could let the cortex adapt to richer inputs.

WORDS WORTH SAVING

5 quotes

It’s not one picture that comes out of the retina and gets sent to the brain. It’s 20 different pictures.

Dr. E.J. Chichilnisky

Nothing that we have learned about the retina since the founding of the National Eye Institute in 1968 is incorporated into the existing retinal implants.

Dr. E.J. Chichilnisky

Don’t expect that the brain is just going to wrap itself around your simple electronic device. Make a smart device.

Dr. E.J. Chichilnisky

If I had the talent to get a few thousand people on their feet dancing by playing music, I’d probably just do that.

Dr. E.J. Chichilnisky

My favorite aphorism is ‘Know thyself.’ And I think it deserves two corollaries: be thyself, and love thyself.

Dr. E.J. Chichilnisky

TOPICS

Retina as a model system for understanding the brain
Retinal cell types and visual feature encoding (the neural code)
Human retina experiments with high-density electrode arrays
Design and limitations of current retinal implants for blindness
Concept of ‘smart’ neuroprostheses and vision augmentation
Implications for broader brain–machine interfaces and AI
Career navigation, intuition, and personal development in science

High quality AI-generated summary created from speaker-labeled transcript.
