Huberman Lab

Dr. Jennifer Groh on Huberman Lab: How Gaze Reshapes Hearing

Groh argues that thoughts are multi-sensory simulations running in sensory cortex, and that the superior colliculus links gaze direction to sound maps, altering what you hear.

Dr. Jennifer Groh (guest) · Andrew Huberman (host)
Nov 10, 2025 · 2h 16m · Watch on YouTube ↗

CHAPTERS

  1. 0:00 – 2:30

    Defining Thought as Multi-Sensory Simulation

    Groh introduces a theory of thought as the brain running internal simulations in sensory and motor cortices. Using the example of thinking about a cat, she illustrates how concepts may be instantiated as visual, auditory, and other sensory replays. This framing explains everyday phenomena like why verbal distraction can impair visually demanding tasks such as driving in heavy traffic.

  2. 2:30 – 8:30

    Introduction to Groh’s Work and Sensory Integration

    Huberman formally opens the episode, introduces Groh’s background, and frames the importance of multi-sensory integration for perception, attention, and learning. He highlights Groh’s clarity in explaining what thoughts are at the neural level and previews applications for focus and smarter thinking.

  3. 8:30 – 17:00

    Where Vision and Hearing First Meet: Superior Colliculus and Moving Maps

Groh recounts how a college seminar introduced her to the superior colliculus, a midbrain structure where visual and auditory information interact. She explains dynamic receptive fields that shift with eye position, and how this challenges simple ideas of static brain maps. The conversation connects eye-movement-induced map shifts to the stability of our experience despite rapid eye motion.

  4. 17:00 – 31:00

    Everyday Sensory Integration: Phones, Screens, and Ventriloquism

    Using examples like reading on a train, watching movies, and ventriloquism, they show how the brain flexibly binds or separates sights and sounds. The brain expects audio-visual events to co-occur in plausible ways and will often trust vision over raw auditory cues, even when sound physically comes from elsewhere.

  5. 31:00 – 46:00

    How the Brain Locates Sounds in Space

    Groh explains the physics and neurobiology of sound localization, focusing on interaural timing and level differences and the role of ear anatomy. She notes that babies must relearn localization as their heads grow, that one-eared hearing complicates but does not abolish localization, and that precise neural machinery is required to resolve sub-millisecond delays.
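    The interaural timing cue discussed here can be sketched numerically. A minimal Python illustration using the standard Woodworth spherical-head approximation (the model, the head-radius value, and the function names are illustrative assumptions, not anything stated in the episode):

    ```python
    import math

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

    def interaural_time_difference(azimuth_deg, head_radius=0.0875):
        """Woodworth spherical-head approximation of ITD, in seconds.

        azimuth_deg: sound direction; 0 = straight ahead, 90 = fully lateral.
        head_radius: ~8.75 cm for a typical adult head.
        """
        theta = math.radians(azimuth_deg)
        # Extra path to the far ear: around-the-head arc plus the chord term.
        return head_radius * (theta + math.sin(theta)) / SPEED_OF_SOUND

    # A fully lateral sound reaches the far ear roughly 0.66 ms later:
    print(interaural_time_difference(90))   # ≈ 0.000656 s
    # A smaller (infant-sized) head yields a smaller maximum ITD,
    # which is one way to see why localization must be relearned as heads grow:
    print(interaural_time_difference(90, head_radius=0.06))
    ```

    The sub-millisecond scale of these values is the point: the auditory brainstem must resolve timing differences far finer than a single neuron's spike duration.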

  6. 46:00 – 1:02:00

    Self-Voice, Hearing Safety, and Bone Conduction

    They explore why our recorded voice sounds strange to us, the brain’s active dampening of self-generated sound, and the role of bone conduction. The conversation expands into practical hearing health, including volume thresholds, hearing loss, and the differences between ‘inside-the-head’ headphone listening and external speakers.

  7. 1:02:00 – 1:17:00

    Echoes, Distance Cues, and Constructed Auditory Reality

    Groh discusses how the brain infers distance and environment from loudness, thunder/earthquake timing, and room reflections. Rather than hearing multiple copies of sounds from walls, ceilings, and tables, the brain combines them into a single percept, using delays and frequency changes as subtle distance and room-size cues.
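    The thunder-timing cue mentioned here is simple arithmetic: light arrives essentially instantly, so the flash-to-thunder lag is almost entirely the sound's travel time. A hedged sketch (the 343 m/s figure is standard physics for air, not a number from the episode):

    ```python
    SPEED_OF_SOUND = 343.0  # m/s in air; light is ~3e8 m/s, effectively instant here

    def storm_distance_m(flash_to_thunder_s):
        """Estimate distance to a lightning strike from the flash-to-thunder delay."""
        return SPEED_OF_SOUND * flash_to_thunder_s

    # A 3-second gap puts the strike about a kilometer away:
    print(storm_distance_m(3.0))  # 1029.0 m
    ```

    Room reflections work on the same principle at a much smaller scale: an echo delayed by a few milliseconds implies only about a meter of extra path, which is why the brain fuses reflections into one percept rather than hearing them as separate sounds.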

  8. 1:17:00 – 1:42:00

    Frequency, Emotion, and the Evolutionary Role of Music

    The discussion shifts to music’s universality and unclear evolutionary function. Groh highlights rhythm as the core invariant across cultures and introduces the idea that coordinated loud rhythmic behavior could have evolved to amplify group presence. They link this to modern military bands, war dances, and political or sporting rituals like the Maori haka.

  9. 1:42:00 – 1:50:00

    Music, Memory, and the Brain’s Use of Song Structure

    They examine how melodies and rhythmic structure dramatically aid memory, from the ABC song to professional songwriters recalling lyrics. The brain appears to use the first words of a verse as a retrieval cue that triggers the rest, showing how musical structure scaffolds language and sequential recall.

  10. 1:50:00 – 2:07:00

    Beyond the Superior Colliculus: Eye Position in the Auditory Pathway and the Ear

    Groh describes her lab’s search for where eye position is integrated into auditory processing. Finding eye-movement effects in multiple auditory brain regions, they hypothesized that modulation might reach all the way to the ear itself. Using microphones in the ear canal, they discovered eardrum movements synchronized to eye saccades, suggesting an ultra-early stage of multimodal alignment.

  11. 2:07:00 – 2:23:00

    Architectures of Sound: Grand Central, Cathedrals, and Room Design

    Huberman brings up acoustic phenomena in Grand Central Terminal and churches to illustrate how architecture channels sound. Whispering galleries and high-ceiling cathedrals create dramatic effects through geometry and materials, changing reflections, delays, and reverberation patterns. Groh explains how these physical features alter perception, intelligibility, and the kind of music that works well in such spaces.

  12. 2:23:00 – 2:48:00

    Thought, Attractor States, and the Mechanics of Focus

    They zoom out from sensory systems to discuss cognitive dynamics: how thoughts cluster, why free association is constrained, and how focus emerges and deepens into ‘flow’. Huberman introduces an attractor-state metaphor (ball bearing in progressively deeper trenches) to describe how context and time stabilize mental states, and both explore why most people misjudge their own attention capacities.

  13. 2:48:00 – 3:07:00

    Practical Attention Management: Interval vs. Endurance Cognition

    Groh describes her own working style as cognitively interval-based: write a sentence, then briefly check something, then return, using those micro-breaks as background incubation time. They compare mental work to sports—some are sprinters, others endurance types—and note how acetylcholine and norepinephrine shape the attentional spotlight. They stress designing individualized systems instead of chasing an unrealistic ideal of nonstop concentration.

  14. 3:07:00 – 3:24:00

    Chickens, Vergence, and Visual Control of Brain State

    A seemingly whimsical detour into Groh’s backyard chickens turns into a concrete demonstration of vision’s power over attention. They discuss farm tricks for ‘hypnotizing’ chickens by drawing a line from their beak, which likely exploits vergence eye movements to lock a narrow focus cone. Huberman connects this to human practices in classrooms and to the broader idea that eye position and focus geometry (near vs. panoramic) strongly modulate global brain state.

  15. 3:24:00

    Phones, Partial Disconnection, and Designing a Modern Cognitive Environment

    The episode closes with a frank look at smartphones and constant connectivity. Groh and Huberman reject simplistic ‘just get rid of phones’ advice and instead discuss realistic strategies: controlled accessibility, using phones for bounded tasks, creating off-grid workspaces, and outsourcing some monitoring of the world to others or to simple tools. They emphasize defining what you want from the phone at any given moment and avoiding endless, frictionless engagement loops.
