Modern Wisdom: 8 Impossible Thought Experiments - Cosmic Skeptic
EVERY SPOKEN WORD
155 min read · 30,627 words
- 0:00 – 1:35
Intro
- AOAlex O'Connor
If somebody told me that they were about to launch every single nuclear weapon on the planet unless you kill an innocent person, would you kill the innocent person? The point of these questions is, in many ways, to demonstrate that there is no answer to these questions, or at least that if you have an answer, there's no way to really settle the question in your favor.
- CWChris Williamson
Alex O'Connor, welcome to the show.
- AOAlex O'Connor
Chris Williamson, thank you so much for having me again, yet again.
- CWChris Williamson
Thank you for joining me here in Austin. How e- e- ph- philosophy graduate-
- AOAlex O'Connor
Mm-hmm.
- CWChris Williamson
... YouTuber, podcaster, and now wake surfing extraordinaire.
- AOAlex O'Connor
Yeah, my wrists feel as though they're about to come off. Um, I- I've, I don't think I've ever used this particular muscle before.
- CWChris Williamson
(laughs)
- AOAlex O'Connor
but-
- CWChris Williamson
That's a lie. We both know that that's a lie.
- AOAlex O'Connor
(laughs) Yeah, well, certainly not, uh, certainly not this-
- CWChris Williamson
Yeah, that side.
- AOAlex O'Connor
... voraciously.
- CWChris Williamson
Um, yesterday, you turned up to a boat trip wearing pretty similar outfit to the one that you're in today.
- AOAlex O'Connor
Yes. I- I'm- I'm trying to live out the philosophy that there is no situation in which you can't wear something resembling a suit. Maybe-
- CWChris Williamson
(laughs)
- AOAlex O'Connor
... you have to leave the jacket at home, maybe, at a push. I was just a little bit warm. But shirt, chinos, it's about as casual as it gets.
- CWChris Williamson
You- you- you did good yesterday.
- AOAlex O'Connor
I can't believe you've shown up to a philosophical conversation wearing a T-shirt and shorts.
- CWChris Williamson
Well, I mean, this is my uniform, you know?
- AOAlex O'Connor
Yeah. Well- well, we're hopefully, by the end of this, we're gonna be much more philosophically entwined, and you'll begin to understand how fun it is to be a bit pretentious about these matters, which includes
- CWChris Williamson
To always dress formally?
- AOAlex O'Connor
... dress codes. Quite right.
- CWChris Williamson
Okay.
- 1:35 – 4:54
What’s the Point of Discussing Ethics?
- CWChris Williamson
Uh, so the last time that we spoke on the show, we were talking about some moral quandaries and some ethical dilemmas, and I really enjoyed that. I like the opportunity to do thought experiments. It means that people can have their brains fried at home, as well as me. Problem being that I am the one that is publicly the most stupid, right? When these questions get asked and you say, "Well, what do you, wha- wha- what do you mean by kindness?" And then I have to try and think of something. Uh, so go gentle with me today, is- is my request.
- AOAlex O'Connor
I'll try my best, but there's, uh, I mean, the implication that I'm any better is- is a mistaken one. I think the point of these questions is, in many ways, to demonstrate that there is no answer to these questions, or at least that if you have an answer, there's no way to really settle the question in your favor. We don't really have a better grasp of the good or the just city than did Plato and the ancients. We haven't really progressed very much. And so, it seems a bit, uh, futile to be discussing this kind of stuff. But as they say, the wise of every generation discover the same truths. These, if there is such thing as moral truth, it seems to be something that's out there and graspable by individuals as they go throughout their life. Uh, it's not gonna put you in a better place than any of your ancestors, but it will put you in a better place than you were, uh, yourself a f- a few years ago. So, they're still worth asking and answering to see what you think about these things, but don't expect to become a moral expert. There was a wonderful question, uh, on an exam paper for, uh, wh- when I was studying philosophy and theology. Um, I can't remem- I think it was on the ethics paper. It must have been. And I can't remember the exact wording, but the question was something like, "Does studying ethics make you an expert in ethics? Does it, or does it make you a better person," or something like this. "And if not, what's the point?" Because, of course, you can study ethical theories. You can have an answer to every single ethical query. But is that going to make you a better person? In many ways, it might make you a worse person-
- CWChris Williamson
What's your experience?
- AOAlex O'Connor
... becau- you're more able to rationalize yourself out of a moral obligation. You're able to get away with things just by convincing yourself somehow that they're ethical, because you've got all of these ethical, uh, theories running around in your head. My experience is that I- I have a, I have a much better understanding of my own moral intuitions. It feels like I kind of get to know myself when I study what's actually going on in my brain when I think something is wrong. There's a very distinct feeling in your brain. It's distinct from anxiety or sadness or anger. It's distinct from propositions like, "This chair exists." So, it's a very specific feeling. It's like trying to pin down what that is, what its nature is, what it's grasping at. Helps you to know yourself a bit better, but I don't think it helps you to act upon them any more strongly.
- CWChris Williamson
Daniel Kahneman, who wrote Thinking, Fast and Slow, got asked by Sam Harris on stage, "After all of these years learning about cognitive biases and how irrational the human mind is, has it made you any less prone to falling prey to these things?" Daniel thought about it for a second and he went, "No, not really." You know, okay, well, it seems like the people that spend a lot of their time thinking about this stuff understand it a lot better, but that it does seem to be, uh, very ingrained. Just going back to the, the wise of every generation discover the same truths, which is a great quote,
- 4:54 – 13:24
Rediscovering Ancient Truths
- CWChris Williamson
why do you think it is that we need to rediscover the same truths? Why is it that, uh, uh, in the same way as technology, you know, we're not rediscovering the wheel, we're improving upon the wheel and iterating on top of it. Wh- wh- why is it that it does seem a little bit like the same questions get asked and an unsatisfactory answer comes back?
- AOAlex O'Connor
That's an, that's an important question. Um, I- I don't think it's always the case that it's unsatisfactory. Some people are perfectly satisfied in the ethical conclusions that they come to. It's just that they're not going to be universally accepted. Um, uh, I suppose, uh, with- with something like the scientific method, uh, you have this process of learning. You have this- this building block that you have to teach to the next generation, and then they build upon it, and that's the idea. And every generation gets taught those building blocks a bit c- more quickly for that reason. You- you learn a bit more at high school level, because science has progressed that the people at PhD levels are- are going a bit further and putting a bit more icing on the cake. Ethics can't really do this. I mean, it can. People are doing PhDs all over the world in very specific, uh, as of yet, sort of undiscussed ethical dilemmas and qualms and areas. But-... realistically, it's because coming to terms with ethics is very much, uh, in many ways a- a- a journey of self-discovery, and so i- it's kind of like how everybody in each generation will need to learn, uh, how to, how to live with themselves in, in their lives, and people are kind of coming to terms with the person that they are, trying to figure out how best to live their life, trying to figure out how to be happy. This isn't something you can just learn and then teach to your children, it's something they have to discover for themselves. It seems to be something a lot more personal. It's informed by experience, it's informed by intuition. If you're talking to somebody about an ethical issue, you might find that their views on an ethical issue are almost entirely dependent upon the experiences that they've had. At the very least, they'll be heavily affected by it. 
If you're talking about the, the ethics of theft with somebody who used to be homeless and used to steal in order to feed their family, they're probably gonna have a different idea of, of w- what it means to thieve, the ethics of thievery, and this kind of thing. Of course, ideally it wouldn't be this way because we'd be able to detach ourselves from experience and think hyper-rationally about ethics. But this is one of the reasons I've come to think that ethic- I subscribe to a view called ethical emotivism, that ethics is just an expression of emotion, which I think ties in very well with this, uh, observation. And so i- if that's what's going on when, when we're doing ethics, then it makes sense that it's not something you can just teach to somebody else in a book. It's something that has to be lived.
- CWChris Williamson
Why is ethics an expression of emotions?
- AOAlex O'Connor
Well, there are, there are, there are lots of reasons for this. I mean, the, the emotivist position is really not so much a, a metaethical theory of, of what g- good itself is, but rather what's going on in someone's head when they say that something is good, or when they say that something is bad. It's, it's a famously difficult thing to define. Um, the, the most famous case for emotivism, it's, it's a theory that's actually kind of gone out of fashion, uh, of, of late. It was originally put forward by A. J. Ayer in the 20th century in a groundbreaking book called Language, Truth, and Logic, which stated that the only things that can be meaningful are those which are either empirically verifiable, you can prove them by observation, or things which are just analytically true, like that there are no married bachelors. That, that's just tautologically true. You don't need to go and observe every single bachelor to know that they're unmarried. These are the only things that can be meaningful. And someone came along, and well, it was Ayer himself preempting the objection, comes along and says, "Well, what about ethical claims? These seem to be meaningful. People seem to mean something when they say murder is wrong, but it's not empirically verifiable that murder is wrong. What are you observing? What is it about murder? Where is this quality of wrongness within an action? That doesn't really make sense." But it's also not analytically true. It doesn't follow logically from murder that murder is wrong. It's not a tautology, so it's meaningless, right? And Ayer thinks, "Well, maybe there's something else going on. Maybe when someone says murder is wrong, what they're really saying is something like, 'Boo murder,' crudely." So for Ayer, writing murder is wrong is basically the same to write- a- as writing murder followed by an exclamation mark and an angry emoji. It doesn't add any propositional content, uh, it's just an expression. 
And this is emotivism, but there are other forms of non-cognitivism, the idea that ethical statements are not cognitive, that they're not, they're not true or false in the way generally thought of, because of course emotions, expressions of emotions can't be true or false. Uh, some people think that rather than being something like "Boo murder," it's more like, "Don't murder," that when you say murder is wrong you're expressing something like, "Don't murder." That can't be true or false. Don't murder isn't true nor is it false. Likewise, "Boo murder" is not true or false. It doesn't have truth value, it- it's just an expression. This is the emotivist position, and I just think that it offers a better account of what's going on in people's heads w- if they pay attention to, to the, to the basis of their ethical intuitions. It seems just to be at root some form of expression. The rationalization that goes on, the kind of, "Well, premise, premise, conclusion. Well, if we accept this theory, this entails this conclusion, and this is..." You know, sure, there's all that going on, but it's all based upon an expression of emotion. Look at how much of, look at how so much of ethical decision making, or I should say, like the, the solving of dilemmas, if you take, like, the utilitarian position that the best thing to do is to maximize pleasure, famous criticism of utilitarianism is, well, "Would you be in favor of, uh, killing five, uh, sorry, killing one person to harvest their organs because you've got five other people who need each organ in order to survive?" They need an organ transplant, but no one's there to give it to them. But you've got this one guy who walks in and he's got all the organs, so we kill the one person and give it to the five. Of course, there are answers to this. There are answers to why this might be wrong even on utilitarianism, but the basic idea is like, well, you wouldn't do that, right? Like that would be, that would be terrible. 
That would be horrible. And so the theory must be wrong. But on what grounds are you saying that because utilitarianism commits us to the view that we should be killing one person to save five... so? Like why is that a criticism of utilitarianism? Well, it's only because when you hear that example, when you hear what it leads to, you have this feeling of kind of like, "No, don't do that. Boo that." Something, something like that. And you might think, "Well, no, actually, no, the reason I don't like that is because, uh, it will actually contribute to, like, fear in society because people will be scared that they're gonna have their organs..." It's like, okay, like, so? Like why don't you like that? Ultimately, I think it all breaks down to something, if you pay attention to the, to the nature of the feeling in your head, it's something that belongs in the category of emotion. It's just something like "Ew" or "Gross" or "No" or "Boo," something like this. It's not one of those things. Common misconception of emotivism. Ethical claims don't map onto emotions, like anxiety and sadness and boo, rather...... they just belong in that category. But its own, it's its own unique feeling. I think ethics, imagine we lived in a world, Chris, where there was no word for anxiety. Just didn't have a word for it. It's like, okay, so someone feels this thing, anxiety, and they're trying to describe what it is that they're feeling. They're like, "Well it's kind of like, it's kind of like be, it's kind of like excited but sad and, uh, like bad at the same time." It's like bad, exci- i- they'd be kind of dancing around. They're saying, "It's not quite this, but it's a bit like this, a bit like this." It's kind of somewhere in the middle. And there's this very unique feeling that we have towards, in, in the way that we might feel anxious towards public speaking or sad towards the death of a friend. 
There's this very unique feeling that we have towards seeing somebody steal from a homeless man. What is it? Well it's kind of like, ugh, it's kind of like disgust. It's kind of like boo. It's kind of like don't. It's kind, it's kind of a bit like that but it, but it's not any one of those things. It's somewhere in the middle and we don't have a word for it. Well, I just think that we do have a word for it and that word is wrong, but that it belongs in that category of thought. It's just, it's just within that, uh, that, that, that context and, and that's how we should understand ethical statements. But that's my view. The last time we
- 13:24 – 24:46
Is Ethics About Minimising Suffering?
- AOAlex O'Connor
spoke you seemed to think that ethics was, uh, essentially a project of minimizing suffering. I wonder if that's still an intuition that you hold to?
- CWChris Williamson
That was the intuition I've got and I feel like you're now making me commit to some ridiculous thought experiment that you're about to put in front of me and say, "Well, it's interesting because if we're going to try and minimize suffering then..." Dot, dot, dot.
- AOAlex O'Connor
Well the premise of our conversation, Chris, was that you wanted me to bring along some ethical quandaries for us to work through together and hopefully finally, after thousands of years, put to bed. Um, I just think it's a good way to, to start thinking about ethics, to look at some, some, you know, case examples.
- CWChris Williamson
Let's get into it.
- AOAlex O'Connor
Uh, that there's a wonderful, uh, if we take the, this kind of utilitarian position of minimizing suffering, and the reason it's good to start there is because at least in a secular context most people start here. Most people think ethics is something about minimizing suffering or maximizing pleasure that is the thing that is right is the action which minimizes suffering or maximizes pleasure, maybe these are the same thing. This is the utilitarian position, at least one version of it, the crude utilitarian. So there's a, there's a wonderful thought experiment, um, which comes from Roger Crisp who is, uh, who has a wonderful, um, r- review and analysis of John Stuart Mill's Utilitarianism. It's an, it's a, it's, if you're trying to read utilitarianism by Mill, it's worth reading Roger Crisp alongside him. It's just the best kind of introduction to it and, and it's the, it's the book that was set for all undergraduates to read at Oxford as well, um, philosophy undergraduates. He, he comes up with this example which he calls the rash doctor. Uh, so imagine for a second that, that there's, there's a doctor and there's a, a patient that he needs to treat that's in pretty dire need, maybe they're, they're, you know, on the, on the brink of death or something, and the doctor has two options. They're like pill A and pill B, the red pill and the blue... No. Pill A and pill B, the pill A is one where if administered it has a 99% chance of failure. Call it a 99.9% chance of failure. It's just gonna kill the patient in terrible agony. But it has a 0.1% chance of restoring him to perfect health. That's pill one. Pill B, pill two, whatever, B1, I don't care, is a pill that is basically the reverse such that it has a 99.9% chance of success but it will only restore the patient to, let's say, 95% health. 
So it's all pretty good, you know, like a perfectly livable life, just not quite 100% but, but nothing that would be complained about, uh, and it has a 0.1% chance of failure and killing them painlessly. Okay, so these are the two options. Now the doctor chooses the first pill, the one that's gonna have this, this overwhelming probability of agonizingly killing the man, but it works and it restores him to 100% health. Did the doctor do the right thing is the question?
- CWChris Williamson
In retrospect, it depends on whether you could have run the experiment again. Can we do this again? Can we see what would have happened? Problem being that you're never gonna actually get to split test this experiment and work out whether or not you would've killed him with pill B. It ends up being probabilistic utilitarianism.
- AOAlex O'Connor
Exactly. And so what we end up with is a situation... I mean, I think most people would want to say that the doctor should choose pill B. It, it seems clear to most people, again, might be a clash of intuitions, but if you have these two pills, 99% chance of certain death, 1% chance of 100% health, versus 99% chance of, of like 95 or even like 99% health and only a tiny slim chance of killing the patient, surely the second one is the right one to do. But of course if we say that the right thing to do is that which maximizes pleasure or that which minimizes suffering then we'd be committed to the view that if he chose pill B even if it works, if pill A would have worked, he did the wrong thing. Which seems weird. Seems like we wanna say that there's some sense in which you should choose the second, and as you quite rightly identify, we should be probabilistic about this. This is where you can distinguish between actualist utilitarianism and probabilistic utilitarianism. So maybe what we should do is not what actually maximizes pleasure but what will probably maximize pleasure. But then that's a little strange because if the reason why we're trying to maximize pleasure or minimize suffering is because we believe that there's just something about suffering, there's something, there's this moral truth of the universe that minimizing suffering is the right thing to do, w- w- like what right do we have to add this probabilistic qualifier to it except because it's kind of practically difficult to, to, to swallow that pill?... if you will. It's like, okay, well we, we're just kind of letting our, our practical considerations what, override our, our considerations about the very nature of good and pleasure itself. That doesn't seem quite right. Can we just say that what is good is what will probably maximize pleasure? That doesn't seem like a, a very steadfast sort of, uh, rule of the abstract universe.
- CWChris Williamson
Not very precise, is it?
- AOAlex O'Connor
So, maybe then we need to distinguish between the, the criterion of the good and the decision procedure. That is, the criterion of the, of, of right, the, the thing that makes something good is still what actually maximizes pleasure. It's just that the best way on the whole to achieve that is to adopt in our decision procedure, that is when deciding how to make, uh, ethical choices, to take a probabilistic approach. So, probably maximizing pleasure isn't the criterion of good, but it is the procedure that we use to make the decision about how to get to the criterion of good, which is still actual maximization of pleasure. But you can see already that it, it can't be as simple as kind of, "Well, the right thing to do is that which maximizes pleasure." I mean, there's a whole other problem with this, which is that, of course, if, if you want to be a utilitarian that, that kind of crudely decides in any situation the best thing to do is what's gonna maximize pleasure, even probably so, it means that every time you go to make a decision, you have to do this kind of moral calculus, a hedonic calculus, and figure out what's going to, what's going to have the best effect here. But what if it's the case that the act of doing the hedonic calculus is actually more harmful? Well, then you shouldn't do the hedonic calculus, but that means that you're kind of then not acting like a utilitarian.
- CWChris Williamson
What would be an example of that?
- AOAlex O'Connor
Well, just for instance, I mean, if you were kind of... I don't know, if, if, if you were... If you had to make a, a quick split ethical decision-
- CWChris Williamson
Like an EMT...
- AOAlex O'Connor
Sure.
- CWChris Williamson
... that's got some people by the side of the road and you need to walk out.
- AOAlex O'Connor
Yeah.
- CWChris Williamson
And by taking the time to do the hedonic calculus, more suffering is...
- AOAlex O'Connor
Yeah, like, y- y- maybe you don't even have time to do it there. It's like, if the right thing to do is always to kind of analyze the situation and see what's gonna maximize pleasure, in that situation, by doing the analysis, you run out of time and the patient's dead.
- CWChris Williamson
Yes.
- AOAlex O'Connor
Um, also maybe even in kind of mild cases, 'cause an interesting consequence of utilitarianism is that there's almost no such thing as an amoral action, because everything seems to have some minimal effect on pleasure and suffering. You know, the position that I'm sat in seems to at least minimally affect the pleasure I'm feeling, the suffering that I'm having, you know, the, my, my tone of voice towards you. Like, every single little minute decision seems to be something that has some minimal effect on pleasure and suffering. And so sure, you could, you could say that every single time I do anything, I'm gonna do a hedonic calculus, but then you basically become paralyzed. It becomes very difficult and slow to do absolutely anything. And so, in that second case, it's not, it's not like the first, where it's obvious that just doing the calculus makes you run out of time. In this case, like, there's nothing in principle stopping you from doing it. You could just try to make sure that every single time you do anything, you think carefully about the, the effect on pleasure. But, uh, that just doesn't seem like a very good way to live. It s- it seems like it's quite harmful to a person, and overall might actually be more annoying to people, might cause more suffering. And so... Okay, so what if it's the case that living as a utilitarian is wrong by a utilitarian standard, and should we be utilitarians? Well, if we should, then utilitarian, th- the utilitarian ethic dictates that we don't always live by utilitarianism in this way, so let's not. So, it kind of self-defeats. Now, there are more ways around this, of course. So, this is where you get something like rule utilitarianism, which says that the best way, uh, the, the, the way to kind of get to the good, the way to maximize pleasure is to adopt rules, which, if generally followed, will maximize pleasure. And so you come up with rules like, don't murder, don't thieve. 
And so even in a situation where it might actually minimize suffering for you to thieve in this particular situation, it's like because we've already abandoned this idea of judging every situation on its individual merits, we say, "We don't have time for that. We're just gonna, we're just gonna have a rule, and if we generally follow this rule, pleasure gets maximized." And so even in a situation where thieving, say, would actually minimize suffering, you should still not do it because of your allegiance to this rule. But then that seems weird too, right? Because now you've got a situation where you might well know, 'cause you've done the hedonic calculus, you've worked out that yeah, in this situation, if I were to, you know... A famous example might be if you're a sheriff of a town and there's an innocent man, but the whole town thinks that he's guilty and they're gonna cause an absolute riot and just burn the city down if he's set free, but you know that he's innocent. It's like, well, sending that man to jail even though he's innocent will minimize suffering in this instance, 'cause it prevents the whole city burning down. It's like, okay, what should we do here? Well, you might think-
- CWChris Williamson
There's a degree of injustice that's going on that feels like it's outside of this sort of utilitarian outcome.
- AOAlex O'Connor
Yes. So, justice is one of the biggest criticisms of, of utilitarianism. The idea that there seems to be this thing called justice. You have this right that is impenetrable even by the, the persuasive force of, of great deals of suffering. Even that has its limits, of course. Most people would say that you could, you know, kill an innocent person if it was gonna stop a nuclear war from going off that destroys the entire planet. But it's like, why? Did you believe in rights in the first place? If you actually believe that there are the- these things called rights which are genuinely inviolable, then you have to commit yourself to the view that that right can't be violated even in the situation where, you know, it, it's gonna prevent a disaster. And if you say, "Well, no, no, no. Then, uh, okay, in that situation, it'd be okay to violate the right," there was no right in the first place. It was just this rule that you made up that you think on the whole is gonna minimize pleasure, uh, m- minimize suffering, but you can see in an obvious case where this is definitely going to maximize suffering, you just violate the right.... it betrays this idea of a right as nothing more than one of the rules of rule utilitarianism.
- 24:46 – 28:09
Reductio Ad Absurdum
- CWChris Williamson
Is there a... Is there a way to try and have broad rules that, for the majority of cases, are useful, um, and then ignore the outliers? So I know that the last time we spoke, you explained the reductio ad absurdum, where you try and do something which shows that if you take this particular ethical theory to an extreme case, something kind of weird or bizarre happens. Uh, the entire town getting burned down to put one man in jail, as an example. Is there a way that people have tried, in ethics, to say, "Well, look. On average, most cases are going to fall within this bell curve of normalcy. There might be some outliers, but that doesn't necessarily disprove the fact that, overall, this seems to be an optimal way to do things."
- AOAlex O'Connor
Well, yes, of course there, there have been attempts, and the reductio ad absurdum is a useful approach, just means reduction to absurdity. It shows, well, let's take this logic, see where it goes, and see if it leads to absurdity. The reason why people do a reductio is because if you put forward this principle... The, the issue with something like utilitarianism is it's often put forward as an objective moral theory, that is, it is objectively true that this is what should be done. You could be kind of a subjective utilitarian that thinks that pleasure is only subjectively good, or something like this. Mill himself certainly thought it was objectively true, in the way that, uh, the fact that I can see a table is evidence that it's there. Um, or that, rather it's a little more subtle than that, but similarly, the fact that I desire pleasure proves that it's desirable, um, in the way that the table being visible proves that I can see it. So, he thinks that it's objective, and the problem is if you have an objective theory, if you're saying this is just objectively true, even if you find a single example, no matter how convoluted, that proves it wrong or that makes us think it's wrong, it must be wrong. Imagine if we did the same thing with mathematics. Imagine if we had a mathematical formula. Indeed, in history, we've had this. We- we've got, like, Newtonian physics, is a fam- famous example. Very good, you know? Newton really had it going for him. Got us to the moon and back. But when it comes to, like, things moving closer to the speed of light or something like this, it doesn't work. Okay, so we could just kind of say, "Well, you know, Newton grav- gravity is kind of, it's, it's true enough, or it kinda works," and it is, in practice, in terms of its, its practical import. Yes, it's good enough to, to help us live, but if we're talking about the actual truth of it, it's not like, "Yeah, well, I guess it's, it's, it's kind of true."
It's like, no, if, if something proves it wrong, it means it's false. If we, if we had a mathematical formula that we thought was true, but we showed an example of it punching out an actually obviously false answer, we wouldn't just be like, "Oh, well, it's right most of the time." We'd be like, "This is how science is done." You try to disprove a theory, and if you find one example of something disproving it, the theory must be wrong. And so if we're going to try to objectify, uh, e- ethics in this way, we have to hold it to the same standards. Doesn't matter how much of an outlier it is. If, if you've proven it wrong, you've proven it wrong.
- CWChris Williamson
Lay it on me. Give me, give me something that's gonna make me look stupid.
- AOAlex O'Connor
(laughs) Um, so here's a, here's a fun way to try to nail down the intuition as
- 28:09 – 41:50
Is it Right to Harm People for the Greater Good?
- AOAlex O'Connor
whether you believe, uh, whether you think like a utilitarian or whether you think like a rights-based deontologist, a person who believes that there are these things called rights that people have that are immune to amounts of suffering and pleasure. And again, we can get into that more, because, of course, we've kind of hinted at these examples. But the idea is that if the thing that matters is suffering and pleasure, then if harming an innocent person to save many people maximizes pleasure, then we should be able to do it. But people wanna say, "No, there's this thing called rights." So they seem to contradict each other. Mill tries to offer an analysis, a bit like what I just did, of saying that rights are actually kind of a construct that comes out of pleasure, but it's, it's generally thought that these, uh, contradict each other. So there's a, there's a wonderful, uh, question that I once approached on, on my YouTube channel, and I wanna read it word-for-word to give it credit to the place it came from, which is a, a wonderful quiz called Morality Play. Um, it's... I can't remember the website's name, but I'll send it to you and maybe you can link it in the description. But I think it's the first question. I'll read it word-for-word and see what you think of this. "You are able to help some people, but unfortunately, you can only do so by harming other people. The number of people harmed will always be 10% of those helped. When considering whether it is morally justified to help, does the actual number of people involved make any difference? For example, does it make a difference if you are helping 10 people by harming one person rather than helping 100,000 people by harming 10,000 people?" In other words, is the moral analysis the same? The proportions are exactly the same. You're always kind of, uh, you're always saving 10 times the amount of people that you're harming. 
But is there a moral difference in, let's say, killing one innocent person to save 10 and killing 10,000 to save 100,000? Even if you think both are wrong, even if you think both are justified, are they exactly equally justified?
- CWChris Williamson
Are they equally justified or wrong? So my intuition is that the bigger numbers feel more wrong. It feels like there's more overall suffering going on. I- i- if you were to say you have the choice between suffering for 10,000 people but pleasure for 100,000 versus suffering for 1 and pleasure for 10, the 1 for 10, to me, seems more acceptable than the bigger number.
- AOAlex O'Connor
You say there's more suffering going on, which there is, but there's also much more pleasure going on, that the point of the thought experiment is that these equally balance out. So in, in both cases, the balance of suffering and pleasure is precisely the same.
- CWChris Williamson
... proportionately.
- AOAlex O'Connor
Yeah, it's equally kind of canceled out, and so yeah, you increase the suffering a lot, but you also increase the pleasure, so that the balance remains exactly the same. So if what you care about is the minimization of suffering or- or the getting the- getting the best balance of pleasure over suffering, then these two ought to be the same, right?
- CWChris Williamson
(laughs) Yes, they should be, but I'm obviously logically inconsistent.
- AOAlex O'Connor
Well, you- maybe you feel like it's worse to harm more people to save more people.
- CWChris Williamson
I think we've had this discussion before that, um, the avoidance of suffering rather than the pursuit of pleasure is an interesting question, and also, is the pursuit of pleasure simply the absence of suffering?
- AOAlex O'Connor
Mm.
- CWChris Williamson
There seems to be a good bit of evidence that suggests that humans aren't actually pleasure seekers, they're suffering minimizers.
- AOAlex O'Connor
Yeah, so you can kind of reject the grammar of the question by saying that... uh, I mean, the implication in the question is that the proportions remain the same, but is it worse. Maybe you could just say the proportions don't remain the same. One way to- uh, to establish this conclusion is to say that, like David Benatar does, the famous antinatalist who- who wrote a wonderful book called Better Never to Have Been that has a wonderful discussion on the nature of suffering and pleasure and their asymmetry. It's not just an argument as to why you shouldn't have kids, it's- it contains a lot of wonderful reflections on these- on these topics. Uh, and he thinks that suffering just counts for more.
- CWChris Williamson
Yep.
- AOAlex O'Connor
Would you take five minutes of the worst suffering imaginable if afterwards you got five minutes of the greatest pleasure imaginable? Hard to say, but most people say no, it's kind of not worth it, in a way.
- CWChris Williamson
Do you remember when we went to the Life Lessons Festival a couple of years ago, and we were sat down... it was a canteen style thing at some disgustingly ugly building in the middle of London, and-
- AOAlex O'Connor
The Barbican.
- CWChris Williamson
That was it.
- AOAlex O'Connor
The dreaded Barbican.
- CWChris Williamson
Yeah, it... some sort of brutalist architectural nightmare. It had to be-
- AOAlex O'Connor
Objectively evil.
- CWChris Williamson
Y- uh, correct. Yeah, exactly. And, uh, we were sat down at the canteen, it was sort of a school tables style thing, long benches. We were having this exact discussion about, uh, the relative amounts of suffering and pleasure. It's before you became a full-on nihilist, but I think you would-
- AOAlex O'Connor
(laughs)
- CWChris Williamson
... it was a gateway drug to your nihilism. And, uh, people kept on coming and going. People would sit down next to us, recognize you or recognize me or- or- or just sit near us, and I don't know whether you noticed, but there was a- a- an increasing, um, sort of, like, no-go zone, that people sort of came, sat down, looked, didn't like the conversation, and then sort of shuffled away or just left very quickly.
- AOAlex O'Connor
Yeah, well, it can be morbidly fascinating, but it's also quite depressing to think about. Uh, Benatar's book, it is- is... uh, it- it... the first chapter, or maybe the second, uh, i- is arguing that even if your life is mostly pleasure and just a little bit of suffering, it's still not worth beginning, it's still not worth having a- bringing a child into existence that's- that's gonna have mostly pleasure and only a little bit of suffering. But then the next chapter, after convincing you of that conclusion or trying to, the next chapter is basically him saying, "But even so, your life definitely is way more suffering than pleasure and I'm about to prove it to you." It's- it's not the name of the chapter, but it- it's... the essence of it is why your life is going a lot worse than you- even you think it is, and he kind of puts this real emphasis on- on the suffering. Uh, and it does seem like maybe there's a- there's an imbalance here. It does seem that there- there's a- there's a difference in the way that we treat them. For example, uh, I can- I can unconsensually inflict suffering upon you if it's gonna prevent greater suffering. If you're like- if you're, like, unable to speak to me or something, there's something going on with your- with your communication, and I need to break your arm, 'cause if I don't, maybe you're like drowning in- in some small little, like, cave, and- and your- your- your arm's stuck and I need to break your arm to pull you out. I'm justified in doing that unconsensually because suffering to prevent more suffering is fine. But if I could inflict suffering to bring about some great pleasure in your life, if I were to break your arm, and by doing so give you, like, an encyclopedic knowledge of philosophy that would just be perfect for your podcast, even if, like... I mean, even if in theory you might actually choose that if given the choice, if it's unconsensual, I don't have the right to do that in the same way. Why? 
Why can I inflict suffering unconsensually to prevent greater suffering, but I can't inflict suffering unconsensually to grant some really, really great pleasure that might even outweigh the suffering? It seems-
- CWChris Williamson
It seems-
- AOAlex O'Connor
... to be an imbalance here.
- CWChris Williamson
... it seems like this is because of the asymmetry between how we view pleasure and how we view suffering.
- AOAlex O'Connor
Quite. But of course, you can just adopt thought experiments... uh, adapt thought experiments. That's the- that's the wonderful thing about them. So if you think suffering counts for more, just imagine kind of whatever the proportions would look like, so just- just make it so that the balance is actually the same, so maybe it's like-
- CWChris Williamson
So it would be 1% instead of 10%.
- AOAlex O'Connor
Or may- maybe, yeah, maybe, like, killing one to save 10 is roughly the same as, like, I don't know, killing f- like, 500 to save 10,000, 'cause maybe you'd need to slightly adapt it to actually-
- CWChris Williamson
You get a discount when you s-
- 41:50 – 1:06:01
Moral Responsibility Vs Ability to Act Differently
- AOAlex O'Connor
uh, turn. I wanna get your, your views and, and calibrate your intuitions on the nature of, of the relationship between moral responsibility and the ability to have acted differently. Most people think, uh, that generally speaking, if you are to be held morally responsible for something, you need to have been able to act differently. Um, if, if you couldn't help but commit a particular action, then it's, it's difficult to hold you morally justified, uh, morally, uh-
- CWChris Williamson
Responsible.
- AOAlex O'Connor
... responsible for that action. Uh, there's an interesting example that's often given in the discussion of free will, uh, it's discussed by Sam Harris in his Free Will book, I know that for a fact, of... And this is a real case, so it was a man who was basically exhibiting pedophilic tendencies. He was just sexually attracted to children. And I can't remember if he acted upon it or not. I think he may have done, and so he was-
- CWChris Williamson
He did with the nurses in the place where he was being held after a little while, after he submitted himself for psychological evaluation, psychia- psychiatric evaluation.
- AOAlex O'Connor
Yeah. Of course, when the evaluation is done, it's discovered that there's this great tumor that's pressing against the part of his brain that deals with impulse control. Okay, how does this change your moral assessment of this person? You might actually start feeling sorry for them, because it- it's as if I've kind of... If I were to, like, prod your brain, Chris, in such a way that gave you the same disposition, I'd be doing a great evil to you. You'd be a victim there. It's not, it's not your fault. It's like, okay, cool, um...
- CWChris Williamson
And the fact that there isn't a prodder in this situation makes it kind of unique, 'cause there's no first mover that you can point to, to say that they're the person that caused this to happen.
- AOAlex O'Connor
Yeah, so there's- there's no one, there's no one kind of to blame here, in the way that kind of, you know, somebody might be victimized by cancer, but there's- there's not really someone to blame for them developing it. It's just... But it, you know, it's- it's something that we, we feel sorry for people who un- undergo this. It's like, okay, so this- this person has a, uh, has a brain tumor that's basically turning them into a pedophile. Okay, do you feel sorry for this person? I mean, I- I seem to. I- I- I would say yeah. I mean, this is a horrible situation to be in. And they remove the tumor and their disposition goes away. And then a little while later-
- CWChris Williamson
It comes back again. (laughs)
- AOAlex O'Connor
...the disposition starts coming back. He starts getting a bit pedo again, a bit noncey, and then they discover that the tumor's come back. Okay, so we say this person, uh, let's start with kind of the- the- the attractive quality, the- the- the fact that this person has this- this sexual attraction. Certainly, it's not their fault that they, that they were this way, 'cause they had this thing in their brain that was, like, causing them to think this, that was through no control of their own. But of course, this is just how sexual attraction works anyway. It's just this thing in your brain that makes you feel a particular way that you can't control. You don't get to choose what you're attracted to.
- CWChris Williamson
The fact that your inhibitions have been lowered-
- AOAlex O'Connor
Yes.
- CWChris Williamson
...is, it's just a gradation, right? It's not a difference of kind. It's simply a difference of degree. Presumably, everybody has some degree of inhibition and some degree of sexual attraction-
- AOAlex O'Connor
Mm-hmm.
- CWChris Williamson
...and th- those are pointed in different directions. I feel like it was his, uh, stepdaughter that was the young girl that he- he- he'd sort of made some movements towards. So, he annihilated his own marriage, torpedoed his own marriage to this lady who- who had this daughter, uh, maybe twice, in fact. I feel like she took him back after the first tumor, and then it happened again, and then she let him go, but it was- it was two. So, I mean, even that's interesting. Is there a number of times that your non-conscious inhibition reduction should be allowed by a person?
- AOAlex O'Connor
Well, this is where I think it's very useful to adopt something like, uh, Susan Wolf's real self view. That is, the difference between determining whether somebody is, like, responsible... And we- we remove the moral element, because we kind of remove freedom here. In fact, we should get there in a second, because we should- we should make sure kind of everyone's at the- at the same point here. It's like, okay, so we can probably agree that if somebody finds themselves sexually attracted to children, you can't hold them morally responsible for the sexual attraction. It- it, like, you can't choose what you're attracted to. If anything, you feel sorry for such people.
- CWChris Williamson
There's a great study that, uh, a neuroscientist did, where he got, um, straight men, straight women, gay men, gay women, people that are attracted to kids, people that are attracted to animals, put them all into an MRI and, uh, put an arousal response meter around them, which is a- a basically a cock ring for men and like, a- a moisture meter or something for women. Uh, put them in there and showed everybody every different type of attractive image and- and- and video that they could, and you would think that, especially in a situation like this, that people that were attracted to kids would maybe try and sort of change, th- they would adapt what was going on, maybe out of embarrassment or something like that. Uh, and it turns out that you can show them absolutely everything under the sun, and they don't get any response. And this is, this is just generally a fascinating intuition when it comes to people that are attracted to kids, that I asked him, "Do people get to choose what they're attracted to?" He said, "No." You go, okay, that makes the moral judgment of people who have the attraction, not act on it, people who have the attraction, makes it a fascinating thought experiment.
- AOAlex O'Connor
Mm-hmm.
- CWChris Williamson
It's like one of the most interesting thought experiments, I think.
- AOAlex O'Connor
Yeah, I mean-
- CWChris Williamson
I think there is.
- AOAlex O'Connor
...most people have generally accepted in, uh, in- in other contexts that sexual desire is- is amoral. You- you can't be held responsible for a mere sexual desire. You can only be he- he- be held responsible for acting upon it. Most people, most people think this. Um, it's one of the- the great arguments, uh, in the- in the history of the- the liberation of homosexuals, has been pressing the point that you don't get to choose to be this way. Of course, you know, with homosexuality, it's a bit easier, 'cause you could say, "Well, even if you did, there's still no problem with it." It's like, yeah, even if you did, even if it were possible to choose to be a homosexual, it wouldn't be wrong to do so. Um, but one of the, one of the great points that's pressed is that, but it's even worse, because you, it's not like you get to choose this. So most people have accepted that in most- most contexts. But thinking along these terms, we- we- we begin to think about, like, inhibition. And so let's think about somebody who commits a moral crime, and let's imagine a similar kind of situation. Let's- let's say that whatever the, whatever the moral crime may be, take something that you think is immoral, and they're committing it because there's a brain tumor that is pressing against the part of their brain that deals with inhibition. It's now not just affecting something like attraction or desire or whatever. It's affecting their actual ability to suppress the id, to suppress the basic, uh, sort of pleasure-seeking, crude, animalistic tendencies that they have. It's like, okay, is that person responsible? If I were to go into your brain personally and like, on purpose-... 
like, remove a part of your brain or, or press against a part of your brain in such a way that I knew would make that part of your brain that, that, that holds you in moral reprimand with yourself stop working, such that next time you wanted to do something immoral, you, your brain just literally wouldn't be capable of stopping you, because I'd j- I'd just removed that part of your brain. It's just not there. Then you go and commit that immorality. It's my fault. It's not yours. You're a victim there. But again, if you find someone who just naturally doesn't have, uh, i- isn't able to control their inhibitions, it's just the way that their brain is, the way that they were brought up, which they had no control over, it's their genetics, maybe they got a brain injury that's undiscovered or something, who knows, but whatever it is, if somebody's not hardworking, if somebody's lazy, it's like, okay, what if just their brain is just designed in such a way that they can't overcome that laziness? Can they be held morally responsible for this kind of stuff? In the same way that we wanna say that if there was a tumor pressing, uh, a- against the part of your brain that dealt with kind of your a- a- attractions, we, we'd say you're not responsible for them, and that gives us reason to think that, you know, generally speaking, you can't be held responsible for attraction. It's like, if a tumor pressing, uh, uh, against the part of your brain that dealt with your inhibitions made you act in particular ways, then if we, if we said that, we can say the same thing. We can say, well, that's because it's, the only reason you're acting this way is because of something going on in your brain over which you have no control. It's this, it's this tumor thing that's doing something in your brain. It's like, that's how all action works all the time.
- CWChris Williamson
Tuna, tumor. (laughs) Tuna or not, if you've got tuna in your brain, uh, whether you have a tumor or not, at what point does a tumor become a tumor?
- AOAlex O'Connor
Exactly. Something...
- CWChris Williamson
At what point are you-
- AOAlex O'Connor
Yeah.
- CWChris Williamson
... morally responsible for the way that your brain is made up? You didn't choose your parents. You didn't choose to be born at the time that you did. You didn't choose any of the upbringings that you have.
- AOAlex O'Connor
Yeah, and of course, so this isn't quite a moral dilemma as much as it's just an argument against the existence of free will. It's like, any decision that you make seems to be a result of brain activity over which ultimately you have no control. I, I press this quite strongly. I think that there just is no free will in a- any actions. But at least in kind of moral, the, the moral arena, people should be able to see that there's, there's a problem here. Like, nobody strictly chooses to be lazy. They just kind of are. Or maybe they kind of choose to be lazy in that they, they act in particular ways that, that bring about that character, but why did they choose to act in those particular ways? Uh, th- their brain just kind of was of a psychological constitution that made them do that. And so, if we're gonna accept this in the, in the language of desire, why not in the language of action as well? Why not in the arena of action? How can we say that anybody's actually morally responsible for anything at all?
- CWChris Williamson
Wasn't there a guy who had a, uh, uh, he noticed himself getting very aggressive, went up a bell tower, started shooting people after he'd shot his wife and kids and then finally shot himself, but shot himself in a way that didn't destroy his brain because he said that he needed the post-mortem to have a look. He could tell that something was wrong. Uh, this was Sam Harris as well. And you'd think, "Well, I mean, that's morally reprehensible." But then you go in and find that there's a tumor the size of a golf ball in his brain, which was pressing on the amygdala and making him incredibly angry and incredibly aggressive and stuff. There is something very, very odd when you think about, uh, was it wrong to kill those people? Yes. Is that person responsible for killing those people? Uh, k- uh, kind of, yes. I- i- are there gradations of responsibility? Yes, it seems like there are. It seems like he is somehow less responsible than a version of him that did that without the tumor. We go, "Okay, well, what if, in a different version of this universe, that guy had a worse upbringing, or a more aggressive father, still no tumor, does aggressive father guy get more of a pass somehow? Is he less culpable?" Should we mediate the sentences that criminals are given based on the past that they have?
- AOAlex O'Connor
Well, well, sometimes people, people think that we should, right? Like, you, you begin to feel a bit more, or, uh, uh, at, at any rate, you, you wouldn't be surprised to see, like, a lawyer, like, a, a defense attorney arguing in defense of someone who's committed a crime and plead guilty, saying, "You know, this person has a horrible life, terrible upbringing-"
- CWChris Williamson
It's the context.
- 1:06:01 – 1:17:28
Unfairness of Imbalances in Wealth & Intelligence
- AOAlex O'Connor
is next? Um... Okay. There's- there's something else we can ask, uh, that- that's along the same lines. Let- let's talk a bit more about this, this idea of kind of, uh, things that you have that you- you aren't really responsible for. Um... Do you think it's unfair that people will get admitted to good colleges, like Harvard or Oxford and Cambridge, because they have a lot of money in their family? That is...
- CWChris Williamson
Yes.
- AOAlex O'Connor
Yeah. The- they're able to pay for, like, top-tier tuition that other people don't get. They're able to pay to go to private schools. They basically pay their way into these colleges.
- CWChris Williamson
So, you're talking about money being used to give the person that is applying a step ahead, not money that is being used to get in the side door by backhanding it to some administrator?
- AOAlex O'Connor
That's right.
- CWChris Williamson
Right, yeah. Uh, still, my intuition is yes.
- AOAlex O'Connor
Seems- seems unfair. It seems like maybe it's not the kind of unfairness that we could kind of like legally rectify because that would have a wealth of implications for, you know, uh, individual liberty and, uh, property and spending of money and this kind of thing. But it seems at least sort of undesirable that people should be able to, should be able to do that, because, of course, they didn't choose to be born with a lot of money. And so, yeah, they get this wonderful tutoring. They get a- a big house with no worries and a stable family, and they're able to study in peace and calm while their maid does the washing up for them, whatever. It's like, this is unfair, that for that reason, they end up going to Harvard. For that reason, they get a high-paying job, and the cycle just continues.... okay? Why then is it more fair for somebody to get into Harvard or to Oxford or to Cambridge because they're clever? That person didn't choose to be born with, you know, a high IQ. They didn't choose to be born in a situation, eh, eh, eh, in, with the kind of upbringing that made them interested, that made them want to read. They didn't choose the desires they had and the interests. They didn't choose to take an interest in physics when they were, like, seven. They just did. But by doing so, they end up going to Harvard, and by doing so, they end up getting a well-paid job. And so, why is one more fair than the other? We wanna say that sort of paying your way into college this way is unfair, because they're getting in because of the amount of money they have, not because of their merit, not because of how intelligent they are. But why is it any more fair to let someone in based on their merit or intelligence?
- CWChris Williamson
Well, when you realize that merit and intelligence is an endowment genetically as opposed to an endowment financially, go, "Okay. There's, it feels like there's..." Uh, I think the intuition is there's something more "self-y" about the, uh, the, the brains and the conscientiousness. Again, also wildly heritable. Um, I, I had this discussion a little while ago, it was very interesting, talking with a behavioral geneticist who is, uh, quite progressive, so she's on the left side of the aisle. Behavioral genetics has been very much adopted by people that are on the right. Uh, and I was asking, "How do you square this circle between wanting, um, equal access to opportunity, even perhaps a little bit of equity as well, in terms of equitable outcomes, when you know that people are entering this race in different ways, huh?" If you were to flatten out all of the opportunity inequalities, if you were to get to a stage where everybody's, uh, level of preparedness was exactly the same, what you're left with, the differences that occur are now genetic. I mean, that's even more brutal. You're saying to somebody, "The reason that you didn't get into college-
- AOAlex O'Connor
Yes.
- CWChris Williamson
... is exclusively because your parents have shitty genetics."
- AOAlex O'Connor
Quite.
- CWChris Williamson
They had the same diet growing up, they had the same access to sunlight, they were given the same priming, all of that stuff. So, okay, we flatten society down to give everybody equal opportunity of, of, of, uh, learning and, and, and food and nutrition and stuff. You okay? That leaves all differences now down to nature. How ugly is that to look at? That doesn't seem like a very fair world.
- AOAlex O'Connor
Yeah. It, it kinda makes it almost even worse. I mean, if you don't get into college and someone says, "Well, hey, man, like, you know, it, it's, uh, it's, it's your upbringing. It's because y- it's because you didn't have the money. Like, if you'd have had a bit more money, better resources, maybe you could've made it," it's like, "Oh, man, that sucks." But, okay, but if someone says, like, "Hey, man, you didn't get into Harvard and there's nothing you could've done about that. You are just too stupid," that seems so much worse.
- CWChris Williamson
Yeah. You're not wired for this.
- AOAlex O'Connor
Yeah. And so, like, the... (laughs) In other words, when, when somebody says, you know, "This person paid their way in, you know, they didn't get in, in because of their merit, because of their intelligence," it's like, why are you saying that with an angry face? Why, why is that any better or worse? It's like, well, okay, maybe a college's job is just to pick those who are just actually, you know, best suited for study, because we want people in the, the high-paying jobs that are gonna be efficient, because it's gonna, you know, help the economy or whatever, so, like, fine. But people who pay themselves into good tuition are, by the time they get to college age, like, better suited to study a course. They're gonna get better grades. They're going to do the job better. It's like, there, there seems to be, like, an unfairness about this. Sure, there's an unfairness about this, but there's an unfairness about the intelligence that either you're born with, if it is genetic, or if it's not genetic, even the kind of upbringing and surroundings that you have and the interests that you develop, you don't get to control this kind of stuff. So, how is that any more fair? So, in other words, if we're gonna criticize one, why don't we criticize the other? It's another slight ethical qualm. And there are different interpretations you can take of this. You could either say, "Okay, then maybe it just is unfair to let people into college based on their level of aptitude and so we should just be basically letting anybody into college for any reason," which is kind of the approach that's being taken, not by, like, most universities, like, individually, like the, like the Harvards and the Oxfords of the world, they still have interview processes and they're still selective, but there are so many universities now that basically anybody can go to a university if they want to. And this most people think is a good thing. 
It's like, yes, uh, it, it kind of doesn't matter where you're at, doesn't matter what you wanna study, doesn't matter how intelligent you are, it do- doesn't matter how apt you are, like, there's gonna be a course that's gonna be suitable for you and you're gonna be able to go and have this experience. This is great and it seems, it seems kind of more fair. So you can, you can take this approach of saying, "Well, yeah, maybe we should just extend that logic and say that the Harvards and the Oxfords of the world need to abolish the interview process and basically just select people at random or something."
- CWChris Williamson
Tyler Cowen literally said this the other day. He said that he feels like the selection process isn't necessarily selecting for things that are, are useful long-term. Um, that you're selecting for people who have got great rote memorization, but also are incredibly orderly.
- AOAlex O'Connor
Mm.
- CWChris Williamson
They're also the sort of people that will stick to rules. And, uh, he is a disruptor, uh, a- and he wants to see more people that would break the rules, that would probably struggle with homework, that would not-
- AOAlex O'Connor
Yeah.
- CWChris Williamson
... turn up on time, that would do a lot of the things that universities want to see. But the problem being that that undermines the person's ability to perform in the university and what you end up with is trying to change the definition of a university. It's like, what, what are the outcomes that we want here? He wants to have people that are prepared to move humanity forward, and, and businesses in interesting and novel ways. You go, "Well, maybe that's not the job of a university." So, maybe he needs something which is not a university now to prepare people to do that. You go, "Well, maybe you're trying to retrofit an existing establishment to now cr-..."... produce people in a different way, to the way that you actually-
- AOAlex O'Connor
Yeah.
- CWChris Williamson
... want them to be.
- AOAlex O'Connor
Crucial. I mean, the, the, the d- the defining factor here is going to be, what is a university? Like, what's it for? Is it, uh, is it like a societal tool to make people apt for jobs? Is it, uh, a tool to give people a, a life experience? Is it, uh, like, is it a qualification machine? Like, what is it? Is it, is it something for kind of private individuals who are interested in academia to go and find somebody to study under for the, for the sheer love of knowledge? Well, that seems to be how they started, but that's not what they are anymore. They seem a lot more kind of institutionalized, embedded into society as a whole. Like, that's why we have governmental student loan schemes because the government sees universities as, like, uh, an important part of society, when they were founded originally kind of as a, just a way for people to, to learn stuff, just for the, for the sake of it. Like, the, the nature of university is always changing, and maybe we're just beginning to see that change. But of course, that's only one response to this problem. You could just go the other way as well and say, "Well then, yeah, if it's fair to admit people based on their merit, it's fair to admit people based on, you know, how much money they've had to, to put into private tuition," and, and things like this. And if how much somebody, uh... The problem with doing that, 'cause put in those terms it doesn't seem so bad. It's like, yeah, okay, uh, uh, a college should have the right to choose someone who's most apt, and sure it's a bit unfair that someone's paid for tuition, but they've still been, you know, they've, they've still had the teaching and they are actually better now, so maybe it makes sense for the college to accept them. The problem with that view is that, of course, uh, the amount of money that somebody has can become a reliable statistical indicator of how apt somebody is. 
If you've got a bunch of applicants and you don't have time to interview everybody, you could think, well, those with more money probably paid for private tuition, probably gonna be more apt, and so a college would have warrant to start accepting people without even in- even interviewing them, just on the basis of their bank account. That seems problematic. But why? If you can't do it on the basis of bank account, why can you do it on the basis of intelligence? I- if we get into this idea of, of saying that you're actually equally responsible for either, you are as responsible for the amount of money that you have in your bank account when you're born as you are for the intelligence that you have when you're born, there seems to be a bit of a problem here. There's a, there's a book by, uh, the Harvard professor Michael Sandel called The Tyranny of Merit, which seeks to undermine the entire idea of meritocracy on these grounds. It says like, yeah sure, we don't live in a meritocracy, but, but so much of politics is basically saying we, you know, this is aristocracy or this is people buying their way into politics and the implication is like, we want meritocracy. You know? This isn't, this isn't meritocratic, it's ar- aristocratic. And, and Michael Sandel's like, "Okay cool, so let's, like, imagine a perfectly meritocratic society that everyone wants." It's like, it's actually still got a bunch of problems. It's still completely unfair. It's still just as morally arbitrary as aristocracy is. If there's something wrong with aristocracy, there's something w- wrong with meritocracy as well. It's a, it's a pretty kind of depressing revelation, but it's, it's a book that's worth reading if people are interested in this line of thought because we haven't really gone deep enough. There are probably people listening thinking like, "What the, like, what are you talking about? What are you talking about? 
Like, the amount of money that you have being, like, the, like having the same justification as how, like, intelligent you are, so getting into college." It's like, I hope most people can understand why we're getting at that, but if you wanna go further, The Tyranny of Merit will, will help to explain it. I also had him on my podcast, so if people wanna go and listen to that, then they're welcome to do so as well.
Episode duration: 1:35:12
Transcript of episode _qU-v01Ulm4