The Problem With Trying To Be Rational - Steven Pinker

Modern Wisdom · Jan 20, 2022 · 42m

Steven Pinker (guest), Chris Williamson (host)

- Cognitive biases, rationality, and the limits of self-correction
- Intelligence vs. rationality and the 'my-side' bias
- Bayesian reasoning, priors, and forecasting/prediction markets
- Expected utility, risk–reward tradeoffs, and bounded rationality
- Intuition, overthinking, and making high-stakes life decisions
- Conspiracy theories, belief, and distrust of institutions
- Common knowledge and the social dimension of rationality

Steven Pinker Explains Why Perfect Rationality Is Impossible Yet Vital

Steven Pinker and Chris Williamson explore why humans struggle to be rational, even when we know about cognitive biases and formal reasoning tools. They discuss the limits of intelligence as a safeguard against bias, the role of communities, norms, and prediction markets in improving judgment, and how Bayesian reasoning and expected utility can guide everyday decisions. Pinker emphasizes bounded rationality: reasoning itself has costs, so we must trade off optimal decisions against time, information, and cognitive effort. They also examine conspiratorial thinking, declining trust in institutions, and practical strategies for balancing rational analysis with intuition in real-life choices.

Key Takeaways

Name and study common biases to deploy them as tools, not trivia.

Knowing concepts like sunk cost fallacy, availability bias, and base-rate neglect—plus their corrective ‘normative models’—gives you a portable toolkit you can apply to unfamiliar situations instead of relying only on gut feel or familiar contexts.

Guard especially against my‑side bias, even if you’re highly intelligent.

Smart people are not immune to steering evidence toward conclusions that favor their political tribe, ideology, or professional camp; consciously seek out disconfirming evidence and opposing media, and expose your ideas to critical communities that tolerate free speech.

Use Bayesian thinking by explicitly considering priors, evidence strength, and base rates.

Before updating your belief on a test result, news story, or claim, ask: How plausible was this to begin with (prior)? ...
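The update rule behind this takeaway can be sketched in a few lines. The numbers below are purely illustrative (a 1% base rate, a 90% true-positive rate, a 9% false-positive rate), not figures from the episode:

```python
# Bayes' rule applied to a positive test result, with illustrative numbers.
prior = 0.01              # base rate: how plausible the claim was to begin with
p_pos_given_true = 0.90   # evidence strength: true-positive rate of the test
p_pos_given_false = 0.09  # false-positive rate of the test

# P(true | positive) = P(positive | true) * P(true) / P(positive)
p_pos = p_pos_given_true * prior + p_pos_given_false * (1 - prior)
posterior = p_pos_given_true * prior / p_pos
print(f"{posterior:.0%}")  # roughly 9%: a positive result is still probably a false alarm
```

Base-rate neglect is exactly the mistake of reading the 90% test accuracy as the answer; with a 1% prior, the posterior probability is only about 9%.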

Evaluate risky choices with expected utility, not just vivid outcomes.

Multiply the probability of each outcome by how good or bad it would be, then compare options; this clarifies decisions like buying extended warranties (often bad value), wearing helmets and seatbelts, or weighing the time saved by speeding against its small but catastrophic risk.
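The warranty example reduces to one multiplication. The figures here are hypothetical, chosen only to show the expected-value comparison:

```python
# Expected-value check for an extended warranty, with hypothetical figures.
p_failure = 0.05       # assumed chance the product fails during the coverage period
repair_cost = 400.0    # assumed out-of-pocket repair cost if it does
warranty_price = 50.0  # assumed price of the warranty

expected_repair_cost = p_failure * repair_cost   # 0.05 * 400 = 20.0
worth_buying = warranty_price < expected_repair_cost
print(expected_repair_cost, worth_buying)  # 20.0 False
```

At these numbers the warranty costs more than twice the expected loss it covers, which is why such warranties are often bad value; the same arithmetic, with a catastrophic outcome on one side, flips the answer for helmets and seatbelts.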

Balance rational analysis with bounded rationality—overthinking has real costs.

Information, time, and cognitive effort are limited; you can’t endlessly gather data or compute the perfect answer. ...

For major life decisions, look at real outcomes of others, not just your imagination.

People are poor at predicting how they'll feel after big changes; a better strategy is to find people who made similar choices (e.g. ...

Recognize that many extreme beliefs function as identity signals, not factual claims.

Conspiracy theories often persist because they’re unfalsifiable, morally charged, and serve as a way to say ‘boo, X’ rather than as evidence‑based descriptions of reality; understanding this helps you see when debate is about identity and emotion rather than truth.

Notable Quotes

Reasoning itself has costs... You can't spend the rest of your life gathering data, because then your life is gone.

Steven Pinker

All of us are subject to a biased bias, namely all of us think that everyone else is biased, but not us.

Steven Pinker

Good forecasters tend not to be your name‑brand pundits, who tend to have a pretty crummy track record, because they're always pushing their political ideology.

Steven Pinker

When it comes to things that don't impinge on your day‑to‑day life... people believe things 'cause it expresses the right values.

Steven Pinker

You can't spend the rest of your life gathering data... sometimes he who hesitates is lost.

Steven Pinker

Questions Answered in This Episode

How can individuals build practical daily habits that nudge them toward more Bayesian, base‑rate‑sensitive thinking without doing formal math?

What concrete features should we look for in information sources to reliably distinguish genuinely objective outlets from those that merely claim objectivity?

How can institutions rebuild trust while being honest about uncertainty and the inevitability of error, especially in fast‑moving crises like pandemics?

In what ways can we design online communities and platforms to systematically counteract my‑side bias and conspiratorial echo chambers?

How might Pinker’s upcoming work on common knowledge help explain current social phenomena such as cancel culture, viral outrage, or mass coordination online?

Transcript Preview

Steven Pinker

Reasoning itself has costs, and the benefit of choosing the optimal decision always has to be traded off. You can't spend the rest of your life gathering data, because then, you know, your life is gone. You've got to, at some point, act on the information you have, knowing that you're taking a risk, but still weighing in the cost of inaction.

Chris Williamson

Steven Pinker, welcome to the show.

Steven Pinker

Thank you. Nice to be here.

Chris Williamson

There was a time not long ago when I thought that reading another Eliezer Yudkowsky blog post, or another Shane Parrish mental model definition about some cognitive bias that I didn't realize that I had, there was a period where I was adamant that that, that was going to be the solution to all of my problems in life, and then, uh, I found out that it wasn't. Why is it that I need a glossary mental models toolkit in order to be able to function? Has making sense of the world always been this difficult?

Steven Pinker

It, it always has. We're, uh, I think we are equipped to reason about cause and effect, and about logical implications, and about probability when the problems are ones that we have dealt with all our lives, when they involve subjects that we deeply care about, that impinge on us. But when it comes to general purpose tools that we can apply across the board, including to novel situations, like, "Oh, it didn't occur to me this is another example of the sunk cost fallacy or of the availability bias," namely reasoning, uh, from anecdote. Having those tools at your fingertips as generic, all-purpose cognitive tricks, that you really do need to be reminded of. You need to know the names of the fallacies and how to avoid them, and the names of the normative models, that is, the rules and systems of how you ought to reason, to deal with novel situations and abstract ones.

Chris Williamson

Daniel Kahneman got asked, I think by Sam Harris when they did a live event, about, "After all of this time, Daniel, learning about the human brain and biases, has it made you any more rational?" And his response was basically, "No." What's your thoughts on that? Have you managed to make yourself any more rational?

Steven Pinker

Um, somewhat. Uh, I mean, I'm, uh, w- we know from the literature on biases that I'm probably not the person to ask-

Chris Williamson

(laughs)

Steven Pinker

... because all of us are subject to a biased bias, namely all of us think that everyone else is biased, but not us. Uh, so I, I might be the l- the person least equipped to spot my own biases, but I, I tend to think so. And there, there is research that suggests that people who are less susceptible to the classic cognitive, um, fallacies and biases have better outcomes in life, in general. They're less likely to get into accidents, to lose their jobs, um, to, uh, break up their relationships. Uh, uh, uh, as always, these p- pertain to averages. Certainly less likely to be scammed by, um, psychic or, um, uh, medical charlatans. So, uh, uh, so applying the average to myself, I would think so on average. I would hope so.
