Jay Shetty Podcast: The SECRET Loop That Keeps You Glued to Your Phone (Most People Never Notice It)
At a glance
WHAT IT’S REALLY ABOUT
How algorithms and our instincts trap us in addictive scrolling loops
- Algorithms primarily optimize for attention and watch time, learning from every pause, click, rewatch, and share to serve increasingly sticky content.
- The “glued to your phone” effect is a feedback loop: our emotionally charged engagement trains systems that then narrow our exposure and amplify outrage and division.
- Many harms attributed to algorithms are also driven by human tendencies—negativity bias, comparison, and identity-signaling—meaning even stripped-down, recommendation-free platforms can reproduce echo chambers.
- Doom-scrolling can raise cortisol and anxiety and create learned helplessness, intensifying the sense that we lack control and must keep checking.
- Solutions require both structural product changes (chronological feeds, friction before sharing, transparency/audits) and individual habits that actively reshape the feed and strengthen emotional mastery and critical thinking.
IDEAS WORTH REMEMBERING
5 ideas
The algorithm isn’t omniscient—it’s a mirror powered by your inputs.
It repeatedly asks, “What will keep you here the longest?” and learns from your micro-behaviors (hovering, rewatches, shares). Your actions don’t just reflect preferences; they train the next version of your feed.
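The feedback loop described here can be sketched in a few lines of code. This is a toy illustration only: the signal names, weights, and update rule are hypothetical, not any real platform's ranking system. It shows how micro-behaviors (hovering, liking, sharing) accumulate into an interest profile that then decides what gets shown next.

```python
# Toy sketch of an engagement-trained feed (all names/weights hypothetical).
# Stronger engagement signals nudge the inferred interest profile harder.
SIGNAL_WEIGHTS = {"view": 1.0, "hover": 2.0, "like": 3.0, "comment": 5.0, "share": 8.0}

def update_profile(profile, topic, signal, lr=0.1):
    """Each interaction trains the next version of the feed:
    nudge interest in `topic` by the signal's weight."""
    profile[topic] = profile.get(topic, 0.0) + lr * SIGNAL_WEIGHTS[signal]
    return profile

def rank_feed(profile, candidates):
    """Answer "what will keep you here the longest?" by ranking
    candidates by accumulated interest in their topic."""
    return sorted(candidates,
                  key=lambda item: profile.get(item["topic"], 0.0),
                  reverse=True)

# The loop: prior engagement reshapes what is served next.
profile = {}
for signal in ["hover", "like", "share"]:
    update_profile(profile, "outrage", signal)
update_profile(profile, "cooking", "view")

feed = rank_feed(profile, [{"topic": "cooking"}, {"topic": "outrage"}])
print(feed[0]["topic"])  # the repeatedly engaged topic rises to the top
```

Note the asymmetry: a single share moves the profile eight times as far as a passive view, which is why emotionally charged engagement dominates what the loop learns.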
Addictive design hides choice rather than removing it.
Features like infinite scroll and autoplay reduce deliberation and extend sessions (cited study: disabling autoplay shortened average sessions). The result is passive consumption that feels like “I didn’t choose this,” even though the system is responding to prior engagement.
Outrage spreads because people reward it, not only because platforms push it.
Research cited (Yale) suggests moral outrage gets social rewards (likes/retweets), training users to produce more outrage. This creates a human-driven incentive loop where “what performs” crowds out nuance and honesty.
Misinformation wins on engagement, and algorithms can’t distinguish truth from clicks.
The transcript cites that false news is more likely to be retweeted and travels faster than true news; recommendation systems then amplify what’s already emotionally potent. The weakest link is often our impulse to share before verifying or reading.
Even removing algorithms may not fix polarization—social sorting is a core driver.
A University of Amsterdam experiment described a stripped-down network (no ads/recommendations) where AI bots still formed echo chambers and rewarded extreme partisan content. That suggests “platform mechanics” and “human tendencies” can both generate division.
WORDS WORTH SAVING
5 quotes
The algorithm doesn't just know us, it depends on us, and if we learn how it feeds, we can decide whether to starve it or steer it.
— Jay Shetty
In plain words, the algorithm isn't a mastermind. It's a machine that asks one question over and over again. "What will keep you here the longest?"
— Jay Shetty
The algorithm's goal is not to make us polarized. It's not to make us happy. It's to make us addicted and glued to our screens.
— Jay Shetty
If the algorithm is made of us, then changing it doesn't start with code. It starts with character.
— Jay Shetty
When you like something, you're telling the algorithm, "Show me more of this." When you hover over something, you're saying to the algorithm, "I pay attention when you show me this." When you comment on something, you're saying, "This is really important to me." And when you share it off the platform, you're saying, "Fill my feed with this." You're co-creating your algorithm. You're actually coding it.
— Jay Shetty
High quality AI-generated summary created from speaker-labeled transcript.