Daniel Schmachtenberger - Building Better Sensemaking | Modern Wisdom Podcast 348
EVERY SPOKEN WORD
150 min read · 30,488 words
- 0:00 – 0:25
Intro
- Daniel Schmachtenberger
I actually want to understand the world I live in as best I can because I actually hold that life is meaningful, and I hold that my life could be meaningful, which means that my choices can be meaningful. And so I want them to be informed as well as they can be. (wind blowing)
- Chris Williamson
What I think people really liked
- 0:25 – 14:53
What is a Sensemaking Agent?
- Chris Williamson
about our first conversation was that we brought some of your work down to an individual level. So a friend referred to it as creating a narrative of resonance. And given that, I thought it would be nice to start by looking at something that you talk about a lot, which is sensemaking, but from the level of the actor. So how do you define a sensemaking agent?
- Daniel Schmachtenberger
Well, I don't know what, um, background the other people listening to your show will have on the general topic. When you're mentioning individual agents, I think you mean individual humans. Obviously, an organization can act like an agent, um, act like a, a, a unit of agency. Uh, but I think you mean, uh, individuals seeking to make sense of the world they live in better. We oftentimes talk about sensemaking at a societal level, meaning, uh, and currently it comes up a lot, like, why do we have such a hard time coming to clear understanding about what the nature of climate change is, or what the nature of COVID viral origins or vaccines, or whatever, you know, systemic racism or how we should deal with nuclear disarmament, or why does it seem like there are such radically divergent views? Meaning the way that we're sensing the world is leading to very different senses of the world, which of course leads to very different senses of what should happen, which makes it very hard to coordinate, which makes it very easy to have conflict. And so when we're talking about sensemaking, we're usually talking about it in the context of shared sensemaking as a prerequisite for shared choice-making, w- i.e. governance. Sensemaking is not the only prerequisite. When we talk about governance, and by governance I don't mean government, which is a, maybe a specific, um, establishment that has, uh, rule of law and monopoly of violence. But governance meaning some process by which a bunch of people who want different things and see the world differently come to coordinate force in some effective, positive, productive ways. Um, so we're talking about how do we get people to have some kind of coherence or coordination between the choices they make such that it isn't making choices that we would think of as crime or something that really messes up each other's choice-making capacity, and where we need to coordinate on choices. 
Like, we're not gonna all make our own roads and things like that, that we're able to coordinate effectively regarding shared resources, shared infrastructure, shared choices. It happens to be that lots of humans who don't know each other and have experienced the world differently and feel different things and want different things, coordinating on what the right choice is is a tricky thing, right? 'Cause they have a different sense of what is and what they want and what should be. So this is why for most of human history, the number of people that would coordinate was small. Tribes were small. They classically stayed beyond, you know, smaller than the Dunbar number, give or take 150-ish, where there were no strangers. Everybody that you were coordinating with, you'd known your whole life. They had known you your whole life. Everybody had the same shared basis of experience, and everybody could be in a single conversation around a campfire. And so the ability to coordinate, we could coordinate sensemaking because we were sensing the same stuff. We were living in the same place, right? We weren't even reading different books. We weren't even watching different TV shows. Like we were... nobody had been... w- we weren't speaking different languages. We were exposed to the same stuff. And so you could also fact-check anybody just by o- looking there, right? By just m- having such a shared basis. And it was pretty easy to unify values because the culture that had conditioned them was the same culture. Um, so there might be little differences of weighting. And so then the ability to, you know, unify choice-making if we have shared values and we have shared sense of the world, so both what we think is and what we want to be, uh, it's not that hard. 
Um, once we started to get to larger scales where now I've got to maybe make compromises for strangers, I've got, you know, we're, we're gonna have some coordination with people who I don't have any shared sense of real fealty with or whatever, that becomes a different topic and where they really do see the world differently. And so this is where mostly the order came through some kind of imposition or oppression or top-down force, which is why it was largely empires. O- And then that still meant a number of people smaller than Dunbar that would equal the king and a, a council making the decisions and imposing it by rule of law enforce- on everybody, right? So shared sensemaking didn't matter 'cause people didn't really have meaningful choices. They were gonna do the thing that they were gonna do within that context. So the idea of something like democracy or a republic or an open society where some humongous number of people who don't know each other, who don't have the same experiences are all going to not just do totally different stuff that creates chaos, but also not need somebody to kind of rule. They're gonna find order that isn't imposed, it's emergent order. That's actually a wild idea, right? Like it's a really fucking wild idea that that would even be possible. And, you know, the, the modern democracies came following the Enlightenment, the cultural kind of European Enlightenment with the idea that we could have this thing called the philosophy of science where we could all measure the same thing and get the same result independent of biases. Didn't matter, you know, what we thought beforehand. If we measure the speed of sound in the right way or whatever it is, we're gonna get the same result. So there's this unifying nature to objectivity that allows us to sensemake together.... which is why Karl Popper, who advanced the philosophy of science, was a guy who termed open society, right? 
That we can do open societies based on the ability to do shared sense-making using- using a more methodological rather than, "I had divine revelation and it's true and you don't know," kind of approach. (sniffs) And- and then- but like we said, the- the idea of governance is that there's some kind of emergent choice-making or order a- at the level of the choices we're making. The choices are both the result of our sense-making. What do we think is actually happening? What do we think the causes of what's happening are? And if we do X, what do we think will happen, right? That's kind of forecasting sense-making. But it's also, what do we want to happen, which is our values, which is not sense-making, right? Sense-making is sensing what is. The values is what ought, what do we think ought to be, what do we really care about, so we can call that values generation or meaning-making. (sniffs) So sense-making and meaning-making are the prerequisites for choice-making. The thing that we call governance in an open society is that there's some coordinated process for choice-making that doesn't have to be imposed by a king, doesn't just turn into, "There's no way we can get on the same page, so it has to be chaos," because we can sense the world together and we can sense each other's values and find a higher order set of values that includes everyone, so this was another part of the Enlightenment, was the idea that we could do a dialectic on values. You could say, "I- I really believe in doing X," whatever X proposition is, and we're like, "Why do you want to do it?" "Well, 'cause it's in service of, uh, decreasing infant mortality and the value that you have is infants." And we're like, "Yeah, but if you do that thing, it'll be bad for this other thing because it whatever. It'll damage the water supplies, but what you care about is the water supply." Well, let's not focus on the proposition for a moment. What you value is children. What you value is the water supply. 
Let's hold all those as legitimate values, right? What you value is individual freedom. What you value is the responsibility of the individuals to the collective that- that they are benefited by, so the ability to hear each other's values and synthesize them and say, "A good solution will meet everybody's values as best as possible." So often, we get stuck with, you know, a proposition is created to meet some value before even looking at what all the values are, and so it benefits the environment, but it hurts the economy, or it benefits the economy and hurts the environment, or whatever it is. So those who feel particularly connected to the thing being hurt are like, "This is terrible. We have to do everything to fight it," and those who feel connected to the thing that's being benefitted are like, "This is critical. Someone finally gets us." Now, those two sides is- have to become enemies if the only chance they have is to vote on a preexisting proposition that is a shitty proposition 'cause it was based on a theory of trade-offs between those that was never even consciously explicated. They never even said, "Oh, this is gonna harm this thing. Both these are values. Is- Can we take these values and find a better way forward, a better proposition that maybe could meet them both better?" (sniffs) Maybe rather than that bridge that is going to harm the environment the way that it is, um, but helps transportation, which will help the economy, a barge could do it without harming the environment, or we could just build better local economies on both sides, or whatever it is, right? So the dialectic process is where I want to hear the values that you care about, and so you believe everyone needs to be vaccinated, or no one should be forced to be vaccinated, or everyone should have to wear a mask, or nobody should- What is the value you care about independent of the strategy? The strategy is the way to fulfill a value. 
There is something legitimate in the value even if... Then the sense-making about, is that thing about vaccines, or about masks, or about whatever, is that true, is separate from, is that value legitimate, right? And so if- so we don't have participatory governance in the US. We don't really in the world in any very meaningful way. We have the- the legacy story of it, but we don't have a population hardly anywhere that are really seeking to understand the world we live in where there- where the government is gonna make choices on stuff, and for the government to be informed by the people that we understand enough to be able to weigh in well, and we seek to understand our own values and other people's values (laughs) and be able to have the dialectical conversations to see if we're missing some sense-making, somebody knows some stuff we don't and we really want to hear it rather than have our, um, in-group continue to feel right by saying how dumb the people in the out-group are. So there isn't anything like participatory governance, which is why open societies are basically failing and doing shittily while the authoritarian societies that aren't even claiming to do that and are just doing top-down government better are just doing better at long-term planning and infrastructure. So, you know, uh, people will hear me talking about sense-making. Usually, it's in this context of, how do we develop better capacity as a society as a whole for everyone to be doing a better job making sense of the world than just believing whatever happens to come through the Facebook feed that is algorithmically optimized to appeal to their current biases and kind of l- limbicly hijack or maximally bother them and drive in-group dynamics where people's fear of being out-group by believing the wrong thing in there w- w- i- messes with subtle, deep tribal biases? 
How can we do a better job with sense-making at the level of training individuals, at the level of how we change education and train people, and at the level of the quality of media we put out, at the level of how we design the information architecture so that rather than a Facebook or YouTube having an algorithm to maximize time on site, that it does through appealing to your biases which make you spend more time, it- which makes people on the right more right, the left more left, anti-conspiracy theorists more hateful of conspiracy theorists, and conspiracy theorists farther down that direction? And so there's this just hyperfragmentation as a result of the dis- uh, the financial model of this information technology, right? So how do we make better information technology? How do we make, uh, better media
- Narrator
Yeah.
- Daniel Schmachtenberger
... and for it to stay? How do we do better education? All those things so that we can actually have better sense-making about what is real-... better dialogue and communication around what is meaningful, what are the values? So that, that, those types of conversations can lead to what would a good proposition even be that meets, that factors all of what's real as constraints, factors what matters as constraints and works to find the best proposition forward. So if we take as the background, that's the societal context of where we're usually coming from in talking about the needs for sensemaking, that was a long preface (laughs) to then say you want to bring it to the level of the individual and say, "All right, so I'm not trying to fix Facebook's algorithms for sensemaking right now. I'm not trying to necessarily fix participatory governance or democracy or the fourth estate or public education." I'm trying to say, "How do I as a person do a better job of making sense of the things that I should make sense of that affect the choices I need to make in the world? What things should I just actually not bother myself with because I really don't have a choice and it's not the best use of my life energy? What things should I or do I care to and how do I do a good job and how do I know if I'm doing a good job?" Because almost everyone that we're sure is wrong is sure they're right. And so we are one of the everyone that they're pretty sure is wrong, right? And so we should all be pretty dubious of our own certainty 'cause statistically, we're almost certainly wrong about most of the things that we're sure we're right about. And that's dangerous, right? Like it's dangerous that I'm clear about lots of things I think other people are wrong about. And I'm clear about a lot of things I think people in the past were wrong about. And I'm clear about things that I was wrong about in my past, but I probably can't say anything that I'd say I believe today that's probably wrong. 
That's, that's tricky because it's probably mostly wrong. And so how do we... And now when ego gets tied up in that, right, and then when belonging gets tied up in that, if I don't, if I don't say the right narrative about masks or about vaccines or about social justice, I'll be totally out-grouped, right? Because now I'm an anti-vaxxer or a sheeple or a whatever it is. Um, so there's a lot of reasons to have everybody double down on their worst traits of, uh, unwarranted certainty and sanctimony. So if we want to ask the question th- ah, how important is it to our own life to develop our sensemaking, how do we know how well we're doing with it, how do we do it, we can get into that more.
- Chris Williamson
Absolutely.
- 14:53 – 18:43
Decision-making in Politics
- Chris Williamson
Uh, one of the things that I've got in my head there is how much more considered and slow decision-making at the governance level would have to be in order to factor in all of these different values and choices. You know, expediency is something that people value. If the bridge needs to be built and people think that it's going to make their life better, but in order to factor in everybody's different values, it's going to take two years of debating and planning and all of the rest of it, and, and that situation of consideration and being moderate and, and more, uh, nuanced with your thinking, that happens at the individual level as well, right? It's far easier to just react, take something that we think is the closest approximation of correct and just move forward.
- Daniel Schmachtenberger
No. This is a gibberish argument. So let's say we don't have clear sensemaking on a topic, eh, but we have to act. Why do we have to act? Are there r- are there real consequences or just made up bullshit like an election cycle or whatever that we could change the way we're doing governance? If there's real consequences, how consequential is it to get it wrong? Uh, well, it might be more consequent- consequential in taking more time, right? We- It depends. There are times where you have to make consequential choices under uncertainty where not choosing fast enough is also a choice. That's a real thing. But it's that way less often than we pretend that it is. And so then, what is the consequence if it, of getting it wrong and doing something much, might be much more harmful? But also, what's the, what is the time effect of moving forward with something because we just have to move forward that a huge part of the world thinks is wrong and bad and are going to actively keep fighting? Like, how efficient does that end up being? They're all gonna pay lobbyists. They're all gonna help pay for academics to sponsor counter-narratives. They're going to, uh, pay for politician, uh, uh, candidacy processes and whatever it is. So how fast does it really end up being to try to advance something that half the world thinks is a terrible idea? And so what you can see is we try that. We're like, "No. The, the science is settled." That's a famous bullshit line on a million things, right? Um, and you'll see it on both sides of all kinds of things. Um, "The science is settled," is just a nice way to say, "My unwarranted certainty is true." Um, but the, the science is settled. Climate change, the thing it is. We gotta move forward. We, and we don't have time to educate you dumb fucks anymore, um, about it, and we're just g- you know. So we're gonna carbon trade. This is the way to do it. Or cap, cap and trade, carbon tax, whatever. 
Okay, well, how well does that work when all of the groups are gonna keep lobbying against it and getting Republican candidates who will then try to undo the laws in four years that, that, those... You spend four years trying to do the shit knowing that the next four years will undo all of it, and actually you never even plan on doing something that won't have returns within those four years 'cause it won't possibly get you elected and all the things that need done need to have 10, 20, 30, 50 year timelines and no one will ever even look at it, and most of the time you're actually just working on getting political support and campaigning to get reelected. So, the expediency of "we just have to move forward" is usually a bullshit argument for someone in a power position moving forward in a way that will advance their power position, with plausible deniability that it's something else. If you want to move a civilization forward, either you're moving forward in a way that everybody's getting on board with, or you're deciding to use force to oppress the fact that everybody's not on board, or you're deciding to keep fighting the fact that they also have force. Like, you just have to be realistic about that.
- Chris Williamson
It's interesting
- 18:43 – 22:44
Becoming Better at Sensemaking
- Chris Williamson
the option of delaying a choice is also a choice. There's always a third option of being more considered. Yeah, I like that. Okay, so back down to the individual. How, how can someone become an adept sensemaking agent? We're at the mercy of certain things that individually, immediately, we can't control. Therefore, making the most of the capacities that we do have is a good idea.
- Daniel Schmachtenberger
I would not say the answer for this is the same for everybody based on what they feel called to, their dispositions, their, their vocation, and their kind of sense of what their mission to do is. Um, if someone is a nurse caring for patients, if someone is a mother raising children, how much does them understanding what's really happening with the digital yuan and whether it's going to become the reserve currency of the world, or whether or not the US microgrids are susceptible to EMP attacks. Like, how much does their sensemaking matter? How much agency do they have to do fucking anything about that? Pretty little. How much does it likely stress them out? Probably a lot. Does that make them a better nurse or mom? Probably worse. Could that same energy be applied to doing better sensemaking about tuning in to their children and their patients better and where they actually have some agency? So, it, when we recognize that sensemaking is to inform choice-making, right? W- do I have choice around this thing? Like what is the basis of why I'm wanting to do sensemaking? Is it simply because I need to know which side of the narrative war I'm on because I think I have to be on one of the sides? What if I just don't? What if I say, "I don't know. I, I don't, I don't, I don't know what I think about systemic racism." No, you have to know it, but I don't. Well, you know, and people can try to do a forcing function then. "Well, then you're complicit," or whatever it is. Okay, well I can put a huge amount of time and energy and then still not actually have any real agency to move this thing forward given m- where I am in my life. But I can put that energy into studying better nursing or whatever it is, right? So, I don't think that the idea that like everyone should be deeply informed about all of the existential risks and understand the entire effect of the tech stack and globalization and planetary boundaries and geopolitics is like a thing that everyone should have. 
I don't think that's true. Um, so the first question is like, what matters for me to make sense about based on what choices I actually have to make in my life? That's an important question, um, 'cause it's easy to get sucked into the thing for somewhat unconscious reasons. Now we can talk about how to do good sensemaking on geopolitical and environmental and complex scientific topics there. But the first part is make s- like make sure that the reason that you feel called to do that makes sense. Um, and I'm of course not saying that if you don't have a company or an organization or you're not a politician in, of some way that can directly affect that thing, you shouldn't know anything about it. There is something about general informedness as a citizen that can have value. But you do want to pay attention to like, I have finite units of life energy and where do I want to put my attention that is also connected to my creativity. So, I want people's sensemaking to be informing their creativity, right? To be informing their agency and their choice-making and the quality of life for themselves and the people they touch and for the world a- at large as they can touch it. Um, now h- how to actually do good epistemology we can get into next, but does that, does that part make sense?
- Chris Williamson
Yeah, absolutely.
- 22:44 – 44:12
The Underlying Principles of Sensemaking
- Chris Williamson
I'm interested to hear what the underlying principles are. Presumably there must be a structure upon which are some commonalities that all sensemaking agents, whether they be the nurse, the government official, the creator, the mother. Are there some commonalities between all of them?
- Daniel Schmachtenberger
Are there commonalities in how to do good sensemaking in any domain regarding, regardless of the domain? Sure. They're, they're gonna be different. There are certain places where it's like, how to get, how to really get this particular kind of back flip is not something I get from reading Wikipedia. Like I only get it from trying to do b- back flips. Like it's an embodied sensemaking. There's like a, "Oh, it clicked, and I got it," and there's no amount of reading Wikipedia that's gonna, or watching YouTube that's ever gonna give it to me. Um, so there isn't like one type of sensemaking. Like there's no amount of reading music theory that will actually get my fingers to grok how to play Chopin. So, there are different kinds of creative capacities that require different kinds of like... 'Cause you're sensing how something works, right? Sensemaking is not purely cognitive. It's taking your senses and having a pattern emerge. Sensemaking of like, "Ah, I got it." We've all had that experience playing the piano or trying to do the back flip where it's like, "I got it." That's sensemaking of a type. That's a bunch of sensory perceptions that came into a pattern where now I have it in a way that can inform my creativity. Right?

But I'll stick in the cognitive domain for now, um, since that's largely what, what I think most people are talking about. Some super helpful basic tips. Um, if I'm trying to make sense of a topic that is conflicted, where the, where the public opinion on it, or even the scientific opinion or whatever on it is highly conflicted, I should understand the conflicting views before coming to one on my own. That's a, that's a very helpful thing to do. So, um, you know, you were mentioning, how do I do better sense-making of nursing?
Well, let's say I'm a nurse and it's COVID time and there's, like, major conflict of does ivermectin work or not, and should we be doing this with people, and, um, you know, who do we think actually has too much contraindication for these vaccines or whatever it is. Like, those are places where a nurse would actually maybe want to do some sense-making. Now they might not feel that they have any time. They might not feel the agency that if they came to think something different than hospital policy, they could do anything other than get fired. But they still might care anyways, because they're like, "Fuck, I, I signed up to this thing because of a calling and an oath and I need to know." Right? So one place there I like to start is I like to see, okay, are there two primary narratives or are there a few narratives, right? Let's begin and say there's two narratives. Uh, ivermectin really works and it's awesome, it doesn't work at all and it's dangerous, right? Typically there's more than that. Typically there's like five or six. Maybe it works early case but not later, or works for these kinds of situations, or there's some indication it works but we don't really know, or, um, whatever. (clears throat) But let's take kind of primary narrative camps, because in today's w- world, most people are trying to sense-make between preexisting narrative camps. Um, and it's kind of important to understand... (sighs) ... that there is a very strong incentive for everyone to fall in narrative camps. There are basically these strange attractors, and so there are underlying forces that drive what you can think of as polarization, to rather than just like, well, whatever is true, it's going to be... It's much easier to believe something is true that someone with expertise says with a lot of certainty and other people agree with. Especially if there's a lot of literature and I'm unskilled or I don't have the time, how am I gonna read all of it myself? 
And then (clears throat) you get a narrative, and then you get people who say that narrative has something false with it and they do a counter-narrative that is usually an anti-narrative, that are also kind of smart, and typically based on in- uh, uh, either a different emphasis in values, "Hey, this is about personal freedoms, this is about public health." Um, "This is about my right to decide on my own body, this is about not being a grandma killer," whatever it is, right? Um, sometimes it's just a difference of values that affects their sense-making, because they're sense-making the thing that seems most aligned with their own values. They're not actually paying attention to the sense-making, they're looking at the narrative ab- of truth that fits the value that they seem to care more about, right? I don't think anyone should be comfortable with the idea of more imposition on people's personal freedoms than necessary, and I think everyone should be dubious of anyone who feels that they are in a position to say what is necessary and impose it by force. Like, everyone should be dubious. Oh, you have a monopoly of violence (laughs) that can impose necessary limits on everyone's freedom, and who is the authority? Like, what is the authority process that is not influenced by power or fucked up motives or ego or mistakes at all that deserves to have that fucking power over everybody? Like, everybody should be legitimately concerned about that. That there is such a thing as adequately legitimate authority to wield monopoly of power. Simultaneously, everybody should be legitimately concerned about unnecessarily being a grandma killer, right? 
Like, about taking a risk as a young person that would be not that consequential for you probabilistically, that'd be way more consequential for other people, and everyone should have some sense of like, yeah, we actually have a duty to each other, we have a social responsibility, a social fealty, that insofar as we're affecting each other, we're not just automata. Um, and if we can affect each other invisibly but still tangibly, there's real consequence to that. Like, even as libertarian as I want to be, non-aggression, I, I don't have the right to come up and hit you in the face, right? Well, do I have the right to dump toxic waste in the river on my property if you live right downstream from me and that's the river that feeds your well? No, I'm aggressing on you, right? So if I'm sneezing and coughing in your space, and I might have, (laughs) have an infection, like there's a real situation there regarding what is the limit of personal sovereignty and what is the limit of civic duty? And everybody's comfortable... And it's interesting, 'cause a lot of people who really like libertarian sovereignty feel comfortable with the idea of civic duty to go die in war. Um, including where there's a draft, if it has to be, right? Not just even where it's voluntary. So we're like, okay, there is a relationship between, um, between the way that the individual affects the larger wholes that they're a part of and is affected by them, and so how do we maximize everyone's liberty and maximize the well-being of the whole in a way that no one's liberty is unduly harming anybody else's, right? (sighs) Obviously we'd like to do that with emergent order rather than imposed, so rather than a law doing lockdown, more conscientious citizens who understand more and care more would be better. Right? If you had citizens who cared more and did the research better and really came to understand it better, then they wouldn't need police, they would be self-... 
that then you don't have to worry about who is the authority that has a monopoly of violence. It's, there is a population that is well-educated, conscientious, communicating with each other respectfully, and self-policing in that way, right? Self-monitoring. (sighs) So, um, so the point is that oftentimes, that there are these values that have to live in a dialectical relationship, but we will forget that and focus on one of them. Then we'll focus on the sensemaking narrative that supports that one, and then we'll weigh the scientists who believe in it as being credible. And when people quote them, we're like, "This is a credible scientist." But then when someone quotes the credible scientist on the other side, we'll say, "You're doing a logical fallacy of appeal to authority." And it's like, really? You just did that? Like, it's an appeal to authority when they pick their scientists, but this is a credible person when you do it. Um, and that kind of subtle bias is just all over the place, right? And it's fundamentally a kind of bad faith sensemaking that people don't even realize they're doing most of the time. It's what we call motivated reasoning. Motivated reasoning is tricky. Um, and there's so many reasons for it. Sometimes I wanna be certain just because I'm fucking scared to have, to say I have no idea what's going on about super consequential stuff. There's a pandemic. Are there variants gonna get worse, or are, is the vaccine gonna make them worse? Is everybody gonna die? Am I ever gonna be able to go outside again? I ... this guy seems really certain, and the story is not too scary. (laughs) Or whatever it is. Like, sometimes there's deep subconscious stuff, like my desire for safety, and certainty seems to be a path to it.
So the same place in people that get scared of the dark is just when they can't see what's going on, they project nasty stuff into it, or they get scared of deep water, 'cause when they can't see what's going on, they project nasty stuff into it, uh, gets scared of the unknown in general. Projects nasty stuff and then wants to pretend there are no unknowns, so they want excessive certainty about everything. So they get scared of death, and so then they wanna project certainty about what happens in the afterlife, and make up religions. Um, when you recognize how much of reality is unknown and actually unknowable, there is no way through, actually, there's no way through well, with grace, that doesn't involve deep friendship with the unknown, where you don't project nastiness into the dark spaces. You just say, "I don't know, like, just a lot of things in life have been really interesting so far, and I'm curious what happens," right? And I, I, rather than pretend that I know, and possibly steer really wrong, I'd like to just keep my eyes open and keep paying attention. So I was giving one example of how motivated reasoning and values and fears and all like that can affect people's sensemaking. There are other things that can affect people's sensemaking we can get into, but the first simple principle, uh, you're saying across any domain, uh, you know, lab leak hypothesis versus natural zoonotic origin, or anthropogenic climate change being terrible really soon versus not, or whatever it is, find people who seem very well-researched and earnest, who hold strong versions of the various narratives, and see if you can study their narrative and their reason for it well enough that you can steel-man it. You can be like, "I, I actually really get and can give like an essentialized version of this." Then when you see the difference between them, see if you can come to understand why. Like, are they drawing on different data? 
Are they both cherry-picking their data and it's probably something that neither of them are saying? They're, there's a lot of data and they're each cherry-picking, they're each framing. Um, is, is one of them following much more motivated reasoning and less good empiricism than the other one? But generally, that dialectical process where, one, it'll point out to you where you're faster to start to believe in something rather than something else because of your own biases. And if you notice that, and you realize that your bias, well, it's a bias, which means it misleads you. Bias is, like, if I let go of the steering wheel, my car starts going left, I have to go actually get it adjusted. If I let it go and it goes, goes right, like, that's dangerous. I wanna let go (laughs) and it stays straight. Bias, cognitively, is the same thing. It means I'm gonna be veering off of reality naturally based on what appeals as more true or less true to me because of various things, right? Traumas, conditionings, partial value sets, in-group identities. The fact that the world of my childhood is not a fair representation of the whole world, but I was early imprinted that it was, so I'll take those imprints onto everything. So people should be fairly scared of their own biases, right? Like, they should want to seek out and find their biases and correct them. And so anyone who's, who gets upset when someone says, "I think you're wrong about something," is actually fucking up their own life. If you, if you protect your biases because they're protecting some s- sacred thing, like your fear of uncertainty or whatever, then you won't grow in this way, and your life will stay upper bound at whatever the limit of truth that those (laughs) allow you to understand, and whatever vulnerable things are underneath it, and whatever partial values are underneath it. But if, if instead you're like, "Actually, I don't think I can navigate while on a busted map. It just doesn't make sense. 
Any place where my map's off, I want to know." By definition, I can't see my own blind spots. That's what a blind spot is. Everybody has biases. I, I can see everybody else's, I'm just pretty sure I don't have any. The best gift somebody can give me is where they actually tell me. If they're like, "I think you're off about this." Now-... they can be an asshole and just judge me or whatever, and I still want to listen. Maybe they're wrong, but I want to listen to see if there's possibly a gift in it. But if they're my friend, they'll be like, "Look, I know you're really trying. I love you. I agree with a lot of things. I think there's something you're missing here." I'm like, "Tell me." 'Cause the worst thing I can imagine is that I harm the things I care about, or serve what I care about less well, because of something I can't even see, and somebody saw it and didn't help me. Um, so having ... This is another principle of sensemaking. Have friends that disagree with you on really deep things, like have different biases. If you're a liberal, have conservative friends. If you're strongly, you know, LGBTQ, et cetera, have traditionalist friends. Um, don't be so sure that your moral set is the only and superior moral set, that your sensemaking set is the only one. Have friends who have different orientations, and see if you can actually see the world through their eyes in a non-pejorative way. Like, "Oh, yes, I can see if I was as uneducated and traumatized-"
- NANarrator
(laughs)
- DSDaniel Schmachtenberger
"... and like, whatever is them." Um, but see if you can actually be like, "Wow, yeah, I, I can f- feel the clarity and rightness of seeing it this way." And, and, and so have friends that see things differently and ask them their take on things, and listen. And ask their take on your take on things, and on you. Um, so this is one of the other things that I think is ... This is one of the other things very destructive with social media is the filter bubble phenomenon is, uh, since Facebook is gonna give me what's ... It, it's not trying, like there's not a person having an agenda. There's an algorithm that is optimizing time on site, and it just happens to be when I see stuff that disorients me and I don't ... I'm, I'm getting less certainty when I want more certainty, I bail. But when I want more certainty and I'm getting more in-group validation, and I'm only getting outrage at the out-group that makes me feel even more like I need to double down or whatever it is, I spend more time so it just happens to be that the appeal ... And I don't even know I bail. I just keep scrolling in the fast infinite scroll 'cause it didn't capture my attention, because there's so much shit in the infinite scroll that I'm only gonna stop to look if the person is hot enough, or if it looks like something that my brain is pre-triggered to say that's important. Pre-triggered to say that's important means it appeals to an existing bias. And otherwise, I just scroll, right, and just kind of don't even notice that I passed it. Um, but what that means is that I'm going to have both content and people in the nature of that world that will be confirming my biases rather than correcting them. And so then, of course, you will get increasing polarization on everything as a nature of even just the info technology infrastructure itself, right? If people haven't seen The Social Dilemma, they should watch it. It covers this really well. 
Um, so, uh, just to even, if you keep Facebook at all or whatever social media, curate it. It ... For the most part, the more time you spend on it, the less good your sensemaking will be, 'cause it is optimizing for something that is not your sensemaking. But if you're gonna use it, curate it. So I went and intentionally found, uh, groups and public intellectuals that represented opposite sides of every topic, and I liked and followed all of them to just confuse the fuck out of the algorithm. 'Cause it's like, who likes the Sunrise Movement and the Cato Institute at the same time, and then what if, you know... Um, and then to be careful, 'cause I know it pays attention that if I'm gonna like stuff, I want to kind of have a balanced distribution of my likes, which can confuse other people socially. Um, but it's because I want to see a representative feed, right? Um, and specifically, I want to pay attention to when I notice that I have a leaning on a topic, I wanna find the best thinkers that disagree, and I wanna read their stuff more, um, to see, am I missing stuff, right? Like, is there anything in here I'm missing? So curating your algorithm that way is helpful. Getting the fuck off Facebook and just doing better internet research than, than that, or just following the recommendations on YouTube, which are so sticky, they're so goddamn hard to avoid, especially because it's gonna send up some hypernormal stuff where pretty soon you're just watching MMA or bloopers or something. Um, and then you're like, where did two hours go? A- and you're like, "Oh, I was doing internet research." And, um, so just being real about how messed up those algorithms are, you're like, "Okay, what is it I'm trying to get clarity on? What are the narratives and who are good... First, let's just Google who are the, who are the scientists, academics, thinkers, whatever, that are representing these well. Let me go read some articles, right? Then let me find who's critiquing those well." 
If I can get to the point where I can make each of the arguments as well as they can make it, and then I get a sense of why they disagree. Is it different data? Is it different values? Is it different models? Is it, um, disingenuousness? Like, why do I think that? Then I might be able to start to say, "Do I see a synthesis here? Were they each cherry-picking and there is a higher order kind of insight?" Um, very often it's, you know, it's looking at this cross-section and this cross-section of a cylinder and saying it's a circle and a rectangle and there's partial truth, and they're just t- too low a dimensional insight. They're hyper-focusing on a thing like, you know, the Founding Fathers were slaveholders and so the whole thing is illegitimate, and they built all of these institutions to justify slavery, so e- all of our institutions are built on supporting that, so there's institutional racism everywhere, and, and how can you possibly like s- say anything good about or quote these guys? And it's like, yeah. And the people... who want a better world than all the racism and slavery, want a world that is aligned with the Declaration of Independence more than other articles of governance written. And, uh, civil rights were slow, but emerged out of some of the structures that were hypocritical as fuck that emerged, and there was greatness in the nature of what happened in the country. So there was like evilness and greatness mixed together. And so I can make each of those partial narratives by themselves adamantly and talk past each other. But the reality is it's complex, right? Like som- some of the individual people involved were... I have a friend, Gilbert Morris, who's a professor on these topics, and he's like Benjamin... or he's like a, Jefferson was a great man and not a good man at all, and you have to hold both. Like what does, what does it mean to be able to hold both of those? 
'Cause I can tell each of those stories on their own, and they're both bullshit partial stories. Um, so can I start to find a higher order synthesis than either of the cherry picked or partial stories? That's the thing I wanna start to look for, and then not just jump to artificial certainty that now I got it, I got the whole thing, right? I got something, and there's probably still lots of insights. So how do I stay oriented to continue to gain insights? So those few things of make sure that what you're being exposed to is the various different ideas. Make sure you're seeking to understand them and synthesize them. Curate your info environments to support that, and curate your friend circles to support that. Those are
- 44:12 – 45:05
Comfort in the Unknown
- DSDaniel Schmachtenberger
a few things.
- CWChris Williamson
There's a saying in CrossFit: "Get comfortable being uncomfortable."
- DSDaniel Schmachtenberger
Yeah.
- CWChris Williamson
I guess here it's get comfortable with the unknown would be an equivalent.
- DSDaniel Schmachtenberger
And there is some... It's very easy to have discomfort with the unknown. So they're related, right?
- CWChris Williamson
Yeah.
- DSDaniel Schmachtenberger
And the reason CrossFit says that is because comfort and growth don't happen in the same place. Um, and good sense making and high certainty don't happen in the same place.
- CWChris Williamson
Yeah, it seems to me that it's going to be effortful, you know? To do this, to undertake good sense making is going to require you to go through discomfort, to go through unknown, and to spend a lot more energy than just the limbic sort of reflex action. There's
- 45:05 – 51:15
The Lab-leak Hypothesis
- CWChris Williamson
a, a quote from last year. It was in The Times. I've got this newspaper clipping. Um, Matthew Syed, I think identified it. It's called compensatory control. He said, "When we feel uncertain, when randomness intrudes upon our lives, we respond by reintroducing order in some other way. Superstitions and conspiracy theories speak to this need. It is not easy to accept that important events are shaped by random forces. This is why, for some, it makes more sense to believe that we are threatened by the grand plans of malign scientists than the chance mutation of a silly little microbe." 15 months hence now with the lab leak hypothesis, this feels, uh, even more sort of nu- nuanced and interesting. But yeah, I, um, that compensatory-
- DSDaniel Schmachtenberger
Wait, that's like important. I wanna touch on that for a minute.
- CWChris Williamson
Yeah.
- DSDaniel Schmachtenberger
There was something that that writing did. I, I don't know, I don't remem- (clears throat) uh, hear who you said did it, so I'm saying this with no allegiance or anti-allegiance. It presented a thought about, uh, it presented a position on a polarized topic, right? That the conspiracy theories aren't true and this is just a, um... and there weren't mad scientists plotting and anything else, and that this was just a, a bug. It conflated that with a high moral, almost spiritual insight that everybody would naturally agree with and feel elevated by, which was that, um, we've all had the experience of feeling disorder and then seeking where we start cleaning the house where we had procrastinated when we, our taxes come in, we don't know how to pay it, and we wanna feel productive in some way or whatever it is. Like we've all experienced that thing. And so people are like, "Yeah, that's true." And then they're like, "No, it is true. We should just be able to embrace the uncertainty." So there's like a resonant true thing, there's like an aspirational thing, and then there's like a given, a conclusion on a topic that is not concluded, is there? That's a kind of narrative warfare, where it almost makes it seem like believing that belief is aligned with the high moral, almost poetic. Like there's both a good, there's the ethical, and there's almost a beautiful aesthetic, and then the true, right? So, so the best narrative warfare takes the true, the good, and the beautiful, distorts them all a little bit and braids them to align with a particular position. And that's how I feel when I'm reading like The New York Times or something where I'm like, if I believed anything other than this, I would just be a bad person. Like it's so clear th- what moral high ground is and right side of history, and it's written so beautifully. Like whoever wrote this, a fucking brilliant writer and, um, poetic, it's achingly beautiful or whatever. 
And, and it seems so clear because they're quoting the New England Journal of Medicine and Harvard and whatever it is, and it seems like the best scientists all agree and the peer reviewed journals said it. That doesn't make it true. Like it re- it doesn't ma- it doesn't mean that the morals that are there are like... So the lab leak hypothesis is a really great example. I have not done my research on it to have my own opinion on it adequately, so I am not going to, but I'm gonna look at it just from a narrative point of view. The lab leak hypothesis was up till whenever, a couple months ago, you know, like being a flat earther, right? Like a flat earther, anti-vaxxer, tinfoil hat wearing, reptilian baby blood drinkers run the world. Um, and you know, you, you have to believe all that nonsense. And it's like... But even more, it's like anyone who's saying that it could have leaked out of a lab not only doesn't understand science and is anti-science but they're trying to cause a war with China. They're xenophobic. They're against Asians. They're, like, like, all this moral sanctimony of what a bad person you are, what the bad (laughs) effects will be, and how dumb you are if you think that it's reasonable that it might have escaped, uh, a lab that happens to be in that area that happens to work with those vi- (laughs) viruses. And because the science is settled, because of something that we later came to realize was not settled science, um, and so you're like, "How the f-" And, and so then it starts to come out that actually wasn't settled science. And we're like, "How the fuck did the zeitgeist get that powerful that quickly that you were a dumb and horrible person for believing a thing 'cause the science was settled?" And the science was never settled, and everyone who believed the science was settled and everyone else is dumb and, and bad should be reflecting like, "What the fuck? I got captured. Like, I got captured. 
I was certain about something, and I didn't even read the article in Nature that proved that it was certainly a zoonotic hypothesis, and I'm not qualified, and I wouldn't have known how to do the rebuttal that came out later. But the guy in, in, in The New Yorker or whoever it was that wrote about seemed really certain, and, (laughs) and it, and it appealed to my sensibilities as..." And the institutions that agreed with it were the institutions that seem high-minded that I like to agree with. Um, so I, I hope people take seriously right now as an example regardless of where the virus actually came from that the lab leak hypothesis was not dumb. Whether it was true or not, it was not proven false, and it was not dumb in the way that was said in the narrative. And then, like, how did that narrative get that... Like, what was the force that wanted to make it seem that certain and to push against the other narrative so strong? That should be a, a very interesting question for everybody. And, yeah, it's fascinating. In this whole situation, I have seen zeitgeist formation that is more intense and faster than I've ever seen previously that is not based on good sense-making but other stuff.
- CWChris Williamson
Just rapid news cycle iteration and,
- 51:15 – 55:26
U-Turning Politicians
- CWChris Williamson
yeah, uh, uh, one of the, the terms that you use a lot is talking about good faith and bad faith actors. And, um, I guess that this ties in with sense-making individuals or actors, if you want to say that, sense-making agents. Um, what I find, especially over the last 15 months, and the lab leak hypothesis is a good example because it was so flagrant and in your face, was people who had complete certainty plus powerful distribution to be able to convince others of their certainty are able to reverse their position essentially without an apology within the space of 15 months. That it's, here was a thing that I'm certain is true, and now here's another story about a potential other truth without referring to the fact that the first truth that we made you believe was true was untrue. We saw this with, um, Joe Biden last year where he said that shutting down travel from China was xenophobic in February-
- DSDaniel Schmachtenberger
Right.
- CWChris Williamson
... and then by May was saying that Donald Trump had left it too late to close the borders. Like, you don't get to do that. You do not get to fucking do that. You're supposed to be the people leading the country. You're supposed to be the ones that we hold to the highest levels of good faith actor requirements.
- DSDaniel Schmachtenberger
Yeah, that, that's cute. Um, (laughs) like, obviously they do get to do that because people are, uh, easy enough to capture and move along in that way. Like, that, that's (laughs), that's why it happens. Um, y- what's interesting is each time, say, a narrative changes, we made a mistake before, we couldn't have known, or science takes time, which sometimes is true, but now we know, right? (laughs) Like, it's always, but now we know, so what... It's a continuous justification for the authority we have. And, um, the Consilience Project, we just published an article there recently, uh, called Where Arguments Come From. Uh, the team that worked on it did a really, really good job, and it basically shows, like, where do arguments in the public sphere come from. Like, in order for a lot of people to have heard it, a lot of amplification of the message had to happen, which meant a lot of people who have the ability to amplify a message had to care about it, or some people that had the ability to amplify it had to do a lot of work, right, to make that happen. And so typically, s- there's a narrative that somebody who has some vested interest wants, right? So they have, like, a, a demand for a narrative because it'll create a demand for a thing in the population, and so they find a source of narrative supply. They find an academic or a think tank or whatever it is that already thinks that thing or thinks something close enough. So they don't have to get somebody to lie. They just up-regulate the narrative that currently wasn't... Like that person's been writing about that thing for 30 years, it never got any traction. Now it's everywhere because there's now an agenda (laughs) that it is useful for it that will up-regulate it where before it was swimming upstream. And so it's important to just really think about the mechanics of what allows a meme to propagate-... what, uh, allows a narrative or zeitgeist to propagate, and it not just allows it, what, what propagates it, right? 
There is energy involved in propagation. The energy has an interest. It's seeking ROI on that energy. And so this is why D.C. is filled full of think tanks that are intellectuals putting out public policy that is already predetermined in advance what the ideology is. They're doing it on every new topic that emerges, right? Um, yeah.
- CWChris Williamson
What do you think a good sensemaking agent is not? Or what are some of the most common pitfalls
- 55:26 – 1:01:30
Common Pitfalls in Sensemaking
- CWChris Williamson
that people have when trying to become one?
- DSDaniel Schmachtenberger
I mean, we've been talking about the pitfall of excessive certainty this whole time. And epistemic certainty is I know what is true. Excessive moral certainty is I know what is good, also called sanctimony, right? Those are both pretty big pitfalls. There is another one on the other side which is, I don't know and I don't care, is nihilism. What I find interesting is that most people will, or many people will flip from certainty to nihilism in like one step, um, where they're pretty certain of something. If they find out that that's not true, they're like, "I can't make sense of anything. I'm giving up. I'm gonna go watch TV," or whatever it is. And because the hard work of having to sit in, "I don't know," and I'm gonna work at it and I'm still not gonna know. I'm gonna work at it more and I'm still not gonna know for a long time. I have a bunch of friends that get really frustrated with me because they send me a thing and say, "What ha- What's your opinion on this thing?" Whatever it is, and it's like, something about a different narrative on early human civilization and hominid origins or whatever it is. And I'm like... They're like, "Did you read the paper?" I said, "Yeah. It was interesting." And they're like, "Well, what do you think?" I'm like, "It would take me hundreds of hours to start to have a sense of it." Like, I, I, I, uh... It's an interesting topic. I don't have the hundreds of hours to put into it. I'm not nihilistic in that I don't care. It just might not make it to the top of my stack. Um, and a lot of the things that are on the top of my stack, I also say, "I don't know, but I'm working on it," right? Um, so when we talk about sensemaking, being a good sensemaking agent, like what are we really... Like, if we try to make it simple and grounded, what we're saying is... 
I actually want to understand the world I live in as best I can, because I actually hold that life is meaningful, and I hold that my life could be meaningful, which means that my choices can be meaningful. And so I want them to be informed as well as they can be. Uh, if my choices are me acting on and in and with the world, I want to understand things about me and the world and about what those actions will do as best I can. 'Cause if my choices really matter, I don't want to believe that it's gonna go a certain way and I'm wrong, unnecessarily, and I don't... Like, I want to understand as much as possible 'cause I ac- 'cause it matters, right? Like, ultimately 'cause it matters, and I care. And then the hard part is saying w- it matters and I don't know, and I might still have choices to make, and I care. Right? It, it, it's easier to jump to, "I know" or "I don't care," because the "I don't know" and "I care" and it's consequential and it's moving is fucking hard, right? Like that's, it's scary, it can be heartbreaking. It can... But it's like there's an epistemic humility and an epistemic commitment at the same time. I don't know but I can progressively come to know better, right? Not all positions are equally good positions. Some of them have more error and some of them are more inclusive of more perspectives. So I can progressively come to know better so that my choices can be better informed so they can be more effective and more meaningful. But in order to do that, I'm gonna have to stay the course of seeking understanding for quite some time. Which means I'm n- gonna have to not prematurely come to think I already figured it out too s- too quickly. Or defer my sensemaking to someone else who thinks they figured it out.
- CWChris Williamson
Easy exits out of the discomfort.
- DSDaniel Schmachtenberger
(laughs)
- CWChris Williamson
It's deciding to give up halfway through the workout. It's sandbagging it so that you don't go to your maximum heart rate. Yeah, it's dropping the weight down so that the discomfort's a little bit less.
- DSDaniel Schmachtenberger
Yeah.
- CWChris Williamson
You know? The get uncomfortable with the unknown, I think, is a really good sort of overarching heuristic that you've got there. One of the things that was-
- DSDaniel Schmachtenberger
Ope, just a s- just one more way to say that. Get, get okay with the unknown. Like, I, there's a, there's an even more poetic and beautiful way to say it that I actually feel, and I think everybody feels if they drop in, is actually connect to your love of reality. If, if you, if you didn't care about... Like, if you didn't have a love for reality, you wouldn't care if it got hurt. You wouldn't care if you lost it. You wouldn't care... Like, the fear of loss is because there's something meaningful you don't wanna lose. The anger at anyone doing the wrong thing is 'cause they're harming something you care about. Like, care and love are the origin of all the other emotions, 'cause otherwise you would just be apathetic and not give any shits. Right? So ultimately, I give shits about things because I, there is a care and love about life, my life, others' lives, reality, that's real. So I- there is a love of reality that is at the basis of the meaningfulness of anything, and reality is mostly unknown to me. I know the tiniest fragment. So my love of reality and my love of the unknown, right? If I have a love of reality and it's mostly unknown, that means not just comfort with the unknown, but... And this is the awe at the mystery. This is the spiritual sense of faith and trust, whatever, right? It's actually extending the love of reality into the fact that most of it's unknown.
- CWChris Williamson
That's nicer. We'll take that one.
- 1:01:30 – 1:12:15
Fixing Consequential World Problems
- DSDaniel Schmachtenberger
(laughs)
- CWChris Williamson
(laughs) One of the things that I was very interested in is to try and work out at what stage of personal act- actualization people should start to serve the world at large. Should we sort our own shit first before trying to go and fix the world, and when do you know when you're ready, for want of a better term?
- DSDaniel Schmachtenberger
Yeah. Definitely total enlightenment before sweeping the kitchen. Um...
- CWChris Williamson
(laughs)
- DSDaniel Schmachtenberger
I, I joke because it shows how ludicrous it is to think that it is not both always, right? Um... If I am seeking to help the world and I have not learned what's going on in the world, I might be doing stuff that's totally not needed or not very useful. I'm trying to solve a problem, but I don't understand the upstream things that are causing the problem, so my actions will mostly be useless, right? So do I want to work on myself in this place just means work on my cognitive models and maps that I understand the issue well enough I can be helpful, right? Sometimes I care about the thing I want to help, but the first thing is, do I understand the problem well enough to be able to help, right? Sometimes that doesn't matter. There's trash on the beach. I'm gonna pick it up. Did I fix the issue of trash on the beach? No, I don't know who's putting the trash there, why, what cultural effects are causing that, so... But I still picked up the trash that day. Cool. That is not a comprehensive solution to pollution. It's a meaningful activity in the moment. But, um, to the degree I want a comprehensive solution to pollution, I have to start to understand the financial incentives to make throwaway plastics. I have to understand what it would take technologically to be able to make plastics that biodegraded. I have to understand the culture of why people do that here and they don't in Japan and how we could change the culture. There might be a number of ways I can come to understand it well enough, right? So I might want to work on my understanding of the- a problem before trying to help it so that I have a sense of how to really be effective. Um, particularly, the more complex the issue is, the more consequential it is and the more consequential my action is gonna be, right? Oh, we're gonna do a solar, uh, uh, remittance program where we... uh, that's not the right word, um, but where we reflect 20% of the sunlight out of the earth through geoengineering. 
Should probably be pretty fucking sure, pretty fucking sure that that's a good idea because it's pretty consequential, right? Or we're, we're gonna try and sequester CO₂ using these genetically modified plants that we've never planted at scale and don't know the biological effects of the modified organisms. Like, we should... More sense-making before choices that are really consequential. So that's one example of I want to work on my own cognitive maps of what is needed, what's going on, what would be effective enough that I have a sense of what to do. But then also there's a point at which there's no more research that will work. I need to field test the thing, try some stuff and be like, "Oh, it didn't work for reasons n- the lab would've never told me. I didn't realize that the locals don't even like that thing, or they don't believe it," or... And so there's a place where the application layer also ends up being part of the epistemology. It's the testing, right? Um, so that's one example. There's also the example of, well, what if I'm working on trying to help the world and it's not... The problem isn't a lack of cognitive development, it's a lack of certain kind of emotional healing and development where, um, that is affecting how I'm showing up. Well, let's say whatever wounding issues in childhood have me have an outsized need for credit-seeking because of not having ever felt loved enough, or enough, or only having felt good enough based on performance and credit attribution, whatever. Will I possibly mess up a project to ensure that I get the credit out of it where I'll undermine other people or sabotage or whatever it is, or emphasize to me getting credit more than the effect of the project where me doing work on not needing that as much because of healing whatever kind of place that is in me would actually make me a much better agent for change in the world? Yeah, that's like a real thing, right? 
That's a real thing where that kind of stuff messes stuff up. Or where I am trying to heal a particular issue in the world that I don't realize is unresolved wounds in me where I see something resonant out there, and when I heal the thing in me, I have a totally different assessment of it, right? Like I have, I have a totally different assessment because I really... I was trying to fix marriages in a particular way because of my trauma around my parents' divorce or whatever it was, and so I was seeing it through a traumatized lens and I didn't even realize it. I had this whole mission and nonprofit and whatever it is. So there are times where our own trauma will get projected on the world. This is why Lao Tzu said, "If you want to protect your feet from rocks, better to put on shoes than try to cover the world in leather," right? That, that idea of like... But that doesn't mean that any pain you feel looking at the world is just your pain. Like I think if someone was as healed and integrated as they could be and they see a factory farm...They would feel the empathetic pain of the pain of those animals. If they see hung- hungry kids, they feel an empathetic pain. If they didn't, there would be something wrong with them. They wouldn't be enlightened. They'd be sociopaths. Um, and this is why you see the Buddha crying, right? This is why you see the, The Passion of the Christ, is the idea of the enlightenment is not just, "Oh, I can see you suffering and doesn't do anything to me." It's, like, sociopathic enlightenment is not that interesting. It's ... But this is where it can be either way, right? Am I clear and I'm really feeling the pain of the other and feeling called to help? Or, is the, some pain somewhere else just triggering my pain, and then rather than face it in myself, I'm going to try to solve it in the world in a way that will always keep my sense-making and my effectiveness off, right? So, these are examples that people will give, people who understand. 
This is like if you do more work on yourself first. Or my own need for excessive certainty (laughs) 'cause of my discomfort with the unknown that will make me do shit where I'm, I'm wrong but certain I'm right too often, right? These are ... And we c- I'm sure the listeners can generate a hundred more examples. (sighs) So, should I just do a bunch of psychotherapy and a bunch of zen meditation, and a bunch of, uh, study until I am second tier or third tier or whatever the fuck the developmental metric I want to look at is that means I am now whole enough and integrated enough that I can work in the world? No. That, that's ridiculous. Um, you can't even. Like, so many of the ways we learn about the problems is by engaging with them, and you couldn't only do it in study. And so many of the ways we learn about our issues is by engaging in the world and seeing, "Oh, I really did try to get too much ego credit there, and I'm reflecting on it, and I was an asshole, and, like, I need to work on that, and I wouldn't have seen it otherwise." Or, "Wow, that project failed and I was so certain. That's how I'm seeing my certainty issue," right? So as I heal and learn and grow, I can show up better in the world. But as I show up in the world, I also get to see those things if I'm looking for them. If I don't look for them, I'll always blame the world. Every failure was somebody else's fault, right? Every ... But if I'm looking for it, then I can see those things, and rather than just get crushed, "I'm a piece of shit. That's all there is to it," I can take it as, "Oh, this is, this was a s- some belief, trauma, pattern that created a self-fulfilling prophecy or whatever, but that I could shift," right? So I want to bring an empowerment where I will look at what in me was off. Not to just beat myself up and hate myself, not to pre- or not to pretend there was nothing in me off, but to be able to see it, look at it, work on it, and integrate it and grow past it. 
But similarly, uh, there's also this thing that we're showing up to the world with things that we're passionate about. It motivates our growth, because let's say I'm afraid of public speaking. Let's say I'm, like, catatonically afraid of public speaking. I can just avoid that forever and don't have to grow through it. But let's say I'm somebody like Jane Goodall or whatever, and I go and I'm working with the w- with the primates in the wild and I watch the poaching, right? And I'm so fucking broken by that, and it matters so much more than whether the people like me or not. I get up on stage and talk about it, and like, "We have to stop this poaching. We have to..." Because something bigger than me and my fear of public speaking is actually now moving me. And i- if when I get up on the stage to talk I'm still in the like, "Are people gonna like me or not like me?" place, I won't get over the fear. If I'm touched by something that is so much more important than that, I can actually transcend that, 'cause it's not about me. I'm talking about the topic, right? I'm talking about the issue. I don't even need to talk about it, get me the fuck off. Have somebody else talk about it. I'll just talk about it if there's nobody else who's doing it. So, I also find that, like, the, the hardest parts of our healing are hard, right? Like, we avoid those things for reasons. We don't notice them. They're in the shadow for reasons. And oftentimes, this is why, like, so many people only heal patterns when they have kids. We s- w- see this a lot is because there's something bigger than them for which they're willing to work, 'cause they're like, "Man, I'm fucking my kids up the way my parents fucked me up. I s- told myself I wasn't gonna do this. I'm repeating the same patterns. I see they're gonna get it." And that's the only thing that has them, like, double down on what it takes to shift that. 
So whether it's your kids or whether it's some other calling, there's a place where showing up to the world is actually the only thing that can make something more important than you, that can allow you to transcend the parts of self that were just too hard otherwise. So, what we want is a virtuous cycle between growing as a being and having who we are show up for what we care about, and where, as we show up for what we care about, it gives us insight about ourself, about the situation, and it gives us motivation. And as we grow and heal more as a person, we can show up for what we care about better.
- 1:12:15 – 1:27:47
Will Human Emotions Limit Civilisation’s Potential?
- CWChris Williamson
That's beautiful. Have you got any sense of whether you think on the whole, people tend to err more toward the side of showing up or, uh, more toward the side of working on themselves first? If you were to give most people a little bit of a, a push in one direction, would you say, consider the outside or consider the inside more?
- DSDaniel Schmachtenberger
They're just different groups of people. Right? You have a personal growth world, uh, and a kind of psychotherapy and healing world, and the s- call it Eastern enlightenment world that is very focused in that direction. You have a, uh, entrepreneurial and, um, activist and various types of, uh, action-oriented that's focused in that direction. Biases in both sides-
- CWChris Williamson
Yeah. Have you considered the potential that humans are just too at the mercy of our emotions or programming to be able to reach our civilizational potential? I was thinking about this when the most recent, uh, Pentagon release about aliens was coming about. And I had a conversation with a friend saying, "I don't think the aliens could be even 10% more emotional than us." By whatever criteria you want to class that, more emotionally reactive, because coordination would become so difficult if you were to turn it up to maybe 10 or 20% more, that you wouldn't be able to achieve shit. Like, have you considered that? That we might just be kind of bouncing off a glass ceiling that the creatures that we are, are so self-limiting that no matter how much we try to transcend our own programming that we are sort of destined to fail?
- DSDaniel Schmachtenberger
Um... Yeah. So this is th- there, there's a common question of, that I've received many times of like, "Do I actually get human nature?" And, and I think we-
- CWChris Williamson
(laughing) No, that wasn't... No, that wasn't... I wasn't asking whether you get human nature or not. (laughs)
- DSDaniel Schmachtenberger
Well, they're related questions.
- CWChris Williamson
Okay.
- DSDaniel Schmachtenberger
Right? It's a related question.
- CWChris Williamson
Okay.
- DSDaniel Schmachtenberger
And it usually comes in the form of, "I am dubious of two things. I'm dubious that people can be, uh..." Or I'll say it another way. "I, I'm concerned that people are too irrational and too rivalrous to do anything like this emergent coordination shit you're talking about. Um, that the level of rationality and the level of anti-rivalry," right? "So like w- wisdom and compassion or whatever it is that would be needed don't seem to be well-demonstrated across the population anywhere. So what kind of Aquarian nonsense is this?" Um, and, uh, so let's address that. It's a real politic critique, right? Or concern. Am, am I asking the same thing you're asking?
- CWChris Williamson
Not far off. Yes. I mean, I, I wasn't accusing you directly. It was more abstract, but yes, yes, you're right.
- DSDaniel Schmachtenberger
Yeah. Have I ever, eh, considered?
- CWChris Williamson
Yeah.
- DSDaniel Schmachtenberger
Um, so... I'll tell you the, the first part of how I approach this. So in the same way, I'll actually use it as an example of when I was saying a good way to sense-making is to do dialectic. So everything on the nature versus nurture arguments and the range of what people thought nurture could do were topics I really wanted to see what were all of the, the thinking and what was the basis for the thinking. And were there any kind of axioms that were unquestioned or new possibilities that could change the landscape from even those ideas. So, uh, one thing is when we look at say how violent versus altruistic or, um, rational or whatever metric we want to look at and however we assess looking at it across a population to get some kind of distribution. Um, first I would say I'm extraordinarily dubious of pretty much all social science of this type for a bunch of reasons. Um, one is it almost all started post Industrial Revolution and much s- more recent than that, and almost all of human history that conditioned a- us, uh, genetically and otherwise was in tribal environments. And those are so different. And just like we're saying even the nature of Facebook, engaging with it changes the patterns of what we think, feel, believe, react to. Um, and there's m- tests that Facebook has run of like, we can make people more depressed and happier and believe different things just by putting what's in the al- changes in the algorithm a little bit, right? So... post-ubiquitous capitalism and ubiquitous industrialization and ubiquitous nuclear family homes and a bunch of things, none of which were natural to the human evolutionary environment, but their conditioning that kind of won, and so it became an ubiquitous conditioning. We do the social science then and then pretend that that's not conditioning and call it human nature because it's ubiquitous conditioning. That's silly, right? 
Um, and then there's so few indigenous people left or whatever, we can just make them statistical outliers, um, e- even though they have very, (laughs) very different patterns on a lot of those things. Also... So that's one reason that I'm very dubious of the social sciences. And this is even like when they're trying to do a good job, not like, like the nonsense social science that was, uh, reifying why whites were superior in the early US based on bad interpretations of Darwin and phrenology and stuff, right? Um, but you can see from that stuff how easily bias influences something as complex as social science, complex and consequential. Um, the other thing is that there are a lot of things about people's behavioral dispositions that change with development, and development is not factored. We don't f- factor levels of higher stages of human development post just be- becoming 18. Even though they're very real things, we just put it all together un- under a bell curve. But, you know, when you look at the work of Piaget and then the kind of neo-Piagetian educational philosophers who were looking at human development and childhood and it's very clear, there's neurologic development and corresponding, mm, change in fine motor skills and logic skills and verbal skills, et cetera. And we get to like 18, school's over, and then development ends, right? And that's a fully developed person. That's...Gibberish, right? Like, that's not a fully developed person. Wh- So, what is development beyond that? So, you have a bunch of people who have worked on higher stages of development, and... Zach Stein's a good colleague of mine, worked on that very heavily and looked at the work of Kohlberg and Graves and lots of people who've worked on that. But it's, like, the complexity of someone's cognitive model, the development of their, um, moral models, the development of their aesthetic, the development of their capacity to perspective take, perspective seek, and perspective synthesize. 
Those things keep developing, right? Or, or they can keep developing. One can do things to develop them. And then at those d- higher developmental capacities, there are different behavioral dispositions. And this is not just typologically. They're typologically left or right or whatever. This is, um... But they're... So, we could say if the society was supporting more development of that type, you would have a totally different bell curve, right? And... But that's not a topic that's usually factored. So, there's a bunch of things like this where I would say the social science all needs to take in some grains of salt. One thing I have looked at is on the traits that matter most to a civilization that would work well, I've looked for positive deviance, outliers of the statistical norms on the positive side, to see is there an upper b- Is, is the, is the upper boundary that we think of really the upper boundary, or are there places where what we think of as the upper boundary is the median, right? Like, it's quite different. And so if you... I mean, there's heaps of examples, but if you look at, like, through much of the last few thousand years across lots of different cultures, uh, and different geographies, have Jewish families raised better-educated kids than the people around them m- much of the time? Yeah, they have. And, and so is there... Are there cultural dispositions that can lead to, um, higher qualities of education and correspondingly different qualities of ways of being, right? Different types of dispositions. So, then you have that for a long time, the Jews all as a diaspora pretty much didn't defect on each other, right? And the way the Jewish law is structured, their... It's kind of like a formal logical system. So, they're also getting very good at how to be able to think in formal ways, which makes them good at... which is why they became good at, uh, science and finance and other things that were thinking in formal logic was relevant. 
It's a really important example 'cause you could say, "Well, if what Jewish culture gave in terms of, um, the development of education and rationality and non-defection on each other could happen across the whole population, would that change things?" Yeah, it really would. What about the Jains? Do you have a religion where across a long period of time nobody hurts bugs or plants? Yeah, you do. How do... What about the violent kids in the society? What about the sociopaths? Across a huge population, the, the, the violence bell curve is completely different, right? You have extraordinarily low violence across the whole population. How can that be? Well, they're d- they're developed differently. Can you have a population where almost everyone is violent? Yeah, there's a few cultures where violence is ubiquitous, right? Um, and you can see in cultures where kids w- grow up as child soldiers, that you don't make it to adulthood without killing people. And so it's like the Janjaweed and the Jains are both possible in human nature depending upon conditioning. So, the idea that what we naturally are is the median of that is just gibberish. It's just not understanding how we create societal structures that create conditioning that support the societal structure. So, what if we had something that was conditioning non-violence and compassion more like Jains or Buddhists or Quakers and conditioning rationality more like Nordic bildung countries or Jews? And what if we had a few of those things and we brought them together into not just an educational but a cultural developmental system? Could we have... Is it within human nature, if rightly conditioned, to have higher potentials? Yeah, totally is within human nature if rightly conditioned to have higher potentials. What if in addition we created an economic system where we addressed perverse incentive? 
So, rather than the guy who externalizes the most harm, uh, to the environment makes the most money and then gets the most chicks and status and whatever, to actually all of the harms, the externalized harms are internalized to the cost. So, the guy who gets the most money is the one who does the most omni-benefit and no harm anywhere. F- Well, now there's no sociopathic niche to condition bad behavior and bad values in people, and doing the thing that's good for others ends up being good for you, actually is conditioning the values even from self-centeredness, right? It starts to bridge them in that way. Well, how do we make an economic system that rigorously internalizes externalities and addresses perverse incentive? That's a really deep question for changing what we would call human nature that isn't human nature. It's the nature of, of made-up human coordination systems, right? Does all property law... does all access to resource have to be at the level of individual private property? No. Can we do things that change that fundamentally? Uh, is every good fundamentally rivalrous 'cause it's scarce? No, digital goods made it very clear that you have things that are not only not scarce but anti-rivalrous. The more people that use them, the more valuable it becomes. But we still make them artificially scarce because of the artificially scarce dollars, 'cause of the artificially scarce materials economy. Can we make a materials economy that isn't artificially scarce by making it closed loop with enough energy to run it? Yeah, we can. So, the point is, do we see positive deviance of-You know, you, you look at a very wealthy population, old wealth families while they still have the integrity of how to do dynasty, or even just the kids going to the best prep schools in the US, right? And then who go to the best Ivy League schools and have all the best tutors. 
Do you have the same distribution of success in life of the kids coming out of Exeter and the kids coming out of an average public school? No, they're totally different. Well, what if everybody went to Exeter and had that corresponding life since they were little? Well, it would be totally different. Well, but we can't afford to do that. Yeah, so the, here's the thing, the idea of the dumb masses is class propaganda (laughs) because the upper class that has access to the things that develop them having more capacity is why they end up having more capacity, is a major part of why they have more capacity. And then the idea that some people like them need to be in positions to rule because the masses are too dumb is a self-ful- fulfilling prophecy because then we'll keep the masses dumb by not giving them better educational resources and other types of, you know, things that would create a, a difference there. So I, like, I actually think that the idea of the irrationality and the rivalrousness of the masses is one of the deepest parts of, like, propaganda zeitgeist of ruling classes forever 'cause it justifies the basis for rulership.
Episode duration: 2:04:41
Transcript of episode R2sIG6l4uU0