Paul Bloom on Empathy
In 2016 psychologist Paul Bloom wrote a book titled Against Empathy: The Case for Rational Compassion (a naming decision he still wrestles with). In the book, as in his career and in this Social Science Bites podcast, Bloom deconstructs what is popularly meant by empathy. “Everybody seems to have their own notion,” he tells interviewer David Edmonds, “and that’s totally fine, but we end up talking past each other unless we’re clear about it.” And so he outlines several widely used definitions — think compassion, for example — before offering several more scholarly ways of viewing empathy, such as “cognitive empathy” and “emotional empathy.”
A key to understanding his work is that Bloom is not actually against empathy, at least not in general, even though he tells Edmonds, “I think empathy is — in some way — a great cause for our worst behavior.” But the use of what he terms “emotional empathy” concerns him because, as he explains, it’s not evenly distributed or applied, and thus allows harm to occur under the guise of benevolence. “Empathy is sort of vulnerable to all the biases you would think about. This includes the traditional in-group, out-group biases — race, nationality, religion. It includes attractiveness — it’s easier to feel empathic for somebody who’s cute versus someone who’s ugly.”
Bloom and Edmonds also discuss how empathy leaches into the realm of artificial intelligence, where what might be judged empathetic responses from AIs can devolve into a humanity-extracting feedback loop.
In his work as a professor of psychology at the University of Toronto, and as the Brooks and Suzanne Ragen Professor Emeritus of Psychology at Yale University, Bloom studies how children and adults make sense of the world, with, as his website notes, “special focus on pleasure, morality, religion, fiction, and art.” He is editor of the journal Behavioral and Brain Sciences, and has written a number of public-facing books, including 2016’s Against Empathy, Psych: The Story of the Human Mind, and The Sweet Spot: The Pleasures of Suffering and the Search for Meaning.
To download an MP3 of this podcast, right-click this link and save. The transcript of the conversation appears below.
David Edmonds: If I have empathy, if I can put myself in the shoes of others, feel their pain, surely that will make me kinder, more compassionate. But Paul Bloom, a psychologist at the University of Toronto and best-selling author, has reservations about empathy. Recently, he’s begun thinking about the implications of his views for the world of AI.
Paul Bloom, welcome to Social Science Bites.
Paul Bloom: I’m a huge fan of your podcast. I’m delighted to be here.
David Edmonds: We’re going to talk today about empathy, and perhaps also at the end, empathy and AI. But can we start by just getting a quick definition of what empathy is?
Paul Bloom: I think we better. Everybody seems to have their own notion, and that’s totally fine, but we end up talking past each other unless we’re clear about it. I see sort of three main meanings. Some people use empathy just to mean kindness, caring for people. You know, I show empathy meaning I value other people. And in that sense of empathy, I think the world could use a lot more of it.
A second sense of empathy is what psychologists sometimes call cognitive empathy, and that’s the ability to suss out what somebody is thinking or feeling. And this sort of empathy is kind of morally neutral. If I’m a nice guy and want to make you happy when I buy you a nice present, cognitive empathy is really good. If I’m a bully or a sadist, cognitive empathy could be used for terrible means. If I know what makes you tick, I could cause you pain.
The sort of empathy I’m most interested in, most critical of, is what sometimes is called emotional empathy. I put myself in your skin, I feel what you feel. What Adam Smith calls sympathy. So, putting yourself in another person’s shoes, that’s a different sense of empathy. And I think that’s in some way the most interesting one, because it’s the most controversial.
David Edmonds: OK, so before we focus on that type of empathy, just to be clear, the difference between cognitive empathy and the kind of empathy that you referred to just now is that one is understanding somebody’s feelings and the other is feeling somebody’s feelings. Is that right?
Paul Bloom: That's exactly right. You can certainly see how you could have cognitive empathy without having emotional empathy. If I'm a bully and I torment my victims, and it makes them very unhappy and ashamed of themselves and makes them feel terrible, that's cognitive empathy. I might know how to do it, what button to press, but I might lack emotional empathy, where their pain doesn't resonate with me. I don't feel their pain in the slightest. In fact, I just feel pleasure watching them squirm around. One way to think about certain sorts of psychopaths, con men, seducers and so on is high cognitive empathy, low emotional empathy.
David Edmonds: Right. And what’s the neuroscience of this? Is it possible to put people in an MRI scanner and actually observe empathy?
Paul Bloom: You can put people in MRI scanners, or whatever you're using these days! Yes, cognitive empathy and emotional empathy have overlapping parts of the brain. Maybe that's not surprising, because normally, if I'm listening to you and caring about you, I'm trying to figure out what's going on in your head and, if I'm a normal person, feeling some of what you're feeling. But they have distinctive neural signatures, as does compassion, or caring, the first sort of empathy, which has its own distinctive neural signature. I don't want to overstate it and say we could put somebody in a scanner and tell what sort of empathy they're feeling. But I don't think that's far from the truth.
David Edmonds: And is this something we’re born with? Is it innate, or is it something we’re taught?
Paul Bloom: I think the foundations for all of this are hard-wired. I think they're built in, they're part of our evolution as a highly social, highly moral species. But, of course, there's an enormous amount of development. Take cognitive empathy, which is where psychologists have done the most developmental work. Even a 1-year-old, I think, has some rudimentary understanding that the people around him or her have minds, have feelings.
But of course, the ability to figure out somebody’s motives, to understand the nuances of their thoughts, is something which undergoes long development, and even in adults, some people are much better at it than others.
David Edmonds: What about demographic gaps? Is there a difference, I’m thinking, particularly in gender and sex?
Paul Bloom: I think for all the sorts of empathy, women tend to have somewhat more than men. They're better at social intelligence, a sort of cognitive empathy. They tend to be more compassionate, and they tend to show higher levels of emotional empathy. And this is a real and robust difference, but as is often the case with these psychological differences between genders, you're not going to find it unless you look at, like, 100 people. So, it's real, but it's fairly subtle. One way of putting it is that there are a lot of men who are very high in empathy and a lot of women who are very low.
David Edmonds: Right. And any sense whether that is nature or nurture?
Paul Bloom: The reflexive answer to that is always "yes." It is some combination of both. But I take it you're asking whether I think it has an innate component, and I think it does. I think the different evolutionary pressures on males and females, in our species and others, leave women more predisposed toward nurturance, because they are, in most of history, the primary caregivers, and they have to take care of babies. And men tend to require more aggression and more violence. And I think aggression and violence often involve a diminishing and shutting down of empathy. You can't beat me to death with a club if you're busy feeling my pain.
David Edmonds: What about how one measures this? Obviously, there's a scale; some people are more empathetic than others. Is there a scientific way of judging degrees of empathy?
Paul Bloom: There are many scales. They don't tend to be very good. The popular ones tend to blur together these different sorts of empathy. So they tend to ask you questions like, "How much do you care about other people?" which is, you know, compassion. They ask you questions like, "How good are you at sussing out what people are thinking?" which is cognitive empathy. And they ask you questions like, "If you see somebody crying, do you start to tear up yourself?" which is emotional empathy. And often they don't do a good job of pulling them apart.
When I was at Yale with two then graduate students, Matt Jordan and Dorsa Amir, we created our own scale that’s supposed to just look at emotional empathy separate from everything else. But the truth is, the science of measuring this is not strong.
David Edmonds: You’re famously against empathy, at least the feeling empathy that we talked about earlier, rather than cognitive empathy. That sounds like being against motherhood and apple pie. What could be wrong with empathy?
Paul Bloom: Against kittens? I wrote a book called Against Empathy, and I often wonder whether giving it that title was the best thing I’ve ever done or the worst thing I’ve ever done. It’s had both positive and negative effects on my career. I’m only interested in emotional empathy in my negative claim. So I think compassion is great. And I also think understanding what goes on in other people’s heads can be used for evil, but if you’re going to be a good person, you better understand other people. You understand what makes them tick. Otherwise you’re not a very effective good person.
It's the emotional empathy I'm struggling with, and I'm against it, but not in general. So for instance, it could be a great source of pleasure. You know, we both have children. One of the joys of having a child is to sort of experience something you've experienced a thousand times before, for the first time, all over again. That's great. Emotional empathy, I think, is a great ingredient for fictional pleasures. For a while you can be Tony Soprano. I think this plays a role in sports. It plays a role in sex. I think it's just great overall. I would not blot it out.
But I think it's terrible from the standpoint of morality. The main reason is that our emotional empathy zooms us in on the suffering of those people who are like us, who look like us, who speak our language, who are our friends, who are our family — and is famously indifferent to other people. And this leads to huge distortions. We see this in our laboratory studies, but also in a lot of real-world examples, where I'll let a hundred strangers die horribly to save somebody I feel this enormous empathic pull for. And something I wrote about when Against Empathy came out, but which is more and more, I think, part of contemporary politics, is the use of identifiable victims, people who suffer, as a way to spur aggression and hatred and xenophobia. I think empathy is in some way a great cause for our worst behavior.
David Edmonds: This is not just in-group versus out-group in terms of ethnicity or religion. It's about distance as well. It's about the fact that we can't empathize with people on the other side of the world.
Paul Bloom: Absolutely. So empathy is sort of vulnerable to all the biases you would think about. This includes the traditional in-group, out-group biases — race, nationality, religion. It includes attractiveness — it’s easier to feel empathic for somebody who’s cute versus someone who’s ugly. And of course, it includes distance, although there’s some debate in the field as to whether it’s distance per se or anonymity. But in any case, I kind of know intellectually that my neighbor-across-the-street’s life has just the same value as the life of somebody in far off, you know, Africa. But at a gut level, it’s easy to feel empathy for my neighbor. I could see him. There he is, you know, he waves to me. And somebody who is just an anonymous stranger, at an emotional level, they don’t count at all.
I think part of the moral program of a good person is to make them count, and so shut down your empathy a bit and try to work without it.
David Edmonds: Right. So I was going to ask you about what the implications of this are, because you talked about all the upside of empathy, and it sounds like there’s a very serious downside, but it means we get morality wrong. We become misguided about what needs doing. So where do we end up? How do we temper empathy or direct it in the right way?
Paul Bloom: It's not an easy question. Some people I respect a lot say, "Look, empathy isn't perfect, but it's the only game in town. If you don't have these emotional pulls, you're not going to do anything for anybody." I think that's too pessimistic an analysis of human nature. You and I are both very familiar with effective altruists, and those are people who devote their lives to doing the most good, even if the most good involves helping people who they don't see, who they don't really care about antecedently. It even involves helping animals, even unadorable animals. I think effective altruists are an existence proof that this can be done.
Now you could say, "Well, there aren't many of them." But we've managed to sort of make racism not respectable. And I say, "Oh, OK, maybe I have a bias against people not of my skin color, but I shouldn't have it. I should override it." I think maybe we could do the same thing with the empathic pull of adorable people and close people and so on. That's my hope, at least.
David Edmonds: Let me tell you something weird about those people, though: they are judged worse than people who have a surfeit of empathy and who allocate their charitable works or their money towards causes that they are much closer to. In other words, the people who are able to step outside those various biases are judged more harshly than those who have empathy and have the biases that you object to.
Paul Bloom: I have a colleague at Toronto, George Newman. He has a study on tainted altruism, and he finds that if you take two guys, where one guy just sits on his ass and does nothing, and another person goes and helps people, but helps them in kind of a rationalist, maximizing way, then when asked, people say, "I like the guy who sits on the sofa all day."
David Edmonds: Right.
Paul Bloom: This is not an optimistic view of humanity; I accept that as a problem.
David Edmonds: I want to move on to empathy and AI, but let me ask you a linking question. We've been talking about how you and I empathize with others. Do we need people to empathize with us? Looking at it the other way, how important is it for human beings to have other people feel our feelings?
Paul Bloom: I think in the context of intimate relationships, we want empathy. Now, what kind of empathy we want is sort of complicated, and it's something I've been very interested in. It's kind of a subtle question. We definitely want people to care about us. I want my wife, my children, my friends to care about me. We want to be understood. Do we want emotional empathy? I think sometimes we do. An example, which I actually got from my wife: if she's really mad at somebody, it's not sufficient if I say, "Oh, I love you. I hope you thrive." It's not sufficient if I say, "I understand you're very mad at this person." She wants me to get mad, too. She wants me to share the anger. And because I'm an emotionally stunted Canadian, I sort of sometimes say, "Well, I understand you. I get why you're mad. Stop being mad."
But I think sometimes some people want their feelings to be shared.
David Edmonds: I can certainly understand your viewpoint, being an emotionally stunted Englishman myself. So when it comes to AI: obviously, some people find it difficult to make friends. Some people can't afford psychotherapists. We all get old, and we become lonely because we can't get out of the house. Is there a role for AI there, in terms of companionship? And might that satisfy us?
Paul Bloom: I think so. My view here is unsettled, and I'm trying to make sense of it, but I think there are serious problems with establishing relationships with AI, and I want to get to them. But before we do: I think there are people who are just suffering. They are lonely, sometimes they are old, they have nobody around them, and they are miserable. And I think that denying people access to a machine that can make their suffering go away is cruel. And I'm not talking about a tiny proportion of the population. There are a lot of very old people in the world, and a lot of them have nobody. And a lot of them have nobody because they may have dementia, because they may have personality disorders; they are very difficult to deal with. And not everybody has, you know, a million dollars to hire somebody to hang out with you. And not everybody has devoted family. For these people, AI could be a godsend.
And I’m going to go on to disapprove of it, but if you said your primary emotional relationship is with your dog, I would disapprove just a little bit. “Yeah, maybe. What about a person? Come on.” But I wouldn’t take away a lonely person’s dog.
David Edmonds: So it’s better than nothing?
Paul Bloom: It’s better than nothing. That’s a much more succinct way of putting it. It’s better than nothing.
David Edmonds: I can see how it's so easy to anthropomorphize AI. I do Duolingo — learning German — and I have to talk to this avatar called Lily. I have to do it every day to keep my streak, and sometimes I don't want to do it, and I'm really cross with Lily, and I'm grumpy with her. And at some level, I know it's just AI, and yet I find it very difficult to overcome the fact that I have a kind of relationship with her, and, as I say, I can be in a bad mood with her.
Paul Bloom: I'm one of these many people who say "please" and "thank you" to ChatGPT, even though apparently it uses up like a million gallons of water when everybody's saying "please" and "thank you." I say, "Your answer is sort of inadequate, but I know you tried." And I think, as they get better and better (and they're already very good; now they have voices, and the voices are getting better), soon I'll be able to look at an AI, and it'll be this charming person, like I'm looking at you, and then it's going to be irresistible. And this brings us to the dangers of that.
David Edmonds: OK, so you said it was better than nothing.
Paul Bloom: Yes.
David Edmonds: Why does it fall short of ideal?
Paul Bloom: I think there are two sorts of worries. One is sort of practical. I wrote a New Yorker article about this, about loneliness, and the worry is that AIs could cure loneliness, but it's not necessarily a good thing to cure loneliness. For a young person, for a person who has social contacts, loneliness is a signal that you're messing up. It's the signal that you should do better: get out of the house, go talk to people, listen to them, be nice to them, be charming. It's very painful. As an awkward adolescent, I remember the pain of not connecting with people, but it's a good pain. If you took away that pain, I would never have become the wonderful bon vivant you're talking to today. (My friends are listening to this and saying, "No, he's still a difficult, awkward person.") But the nightmare scenario for me is that future versions of ChatGPT become available to young people who find them irresistible, and as a result, they never come to realize that sometimes they should apologize, sometimes they should listen, sometimes they should shut up, because the sycophantic nature of AI means that everything you do is fantastic. It's hugely dangerous, in a sense.
That's one objection. The second objection is more, I don't know, I'll say it to you, a philosopher: philosophical, metaphysical. I have two grown sons. If I were to learn that one of them had abandoned normal relationships with people and that his deepest sexual and romantic partners are AIs, I would feel ashamed for him. I would feel that's a terrible way to lead a life — you're in love with your phone. Go find a person.
David Edmonds: This is like Robert Nozick’s experience …
Paul Bloom: Yes
David Edmonds: Robert Nozick has this machine that you can plug into which gives you the experience of reality, but it’s not real. And the objection is, well, it might make you happy, but it’s not authentic.
Paul Bloom: That's right. Imagine Lily gets better. You're fluent in German, you develop this deep and abiding relationship with Lily. You become incredibly close and everything. And you say, "Well, what's wrong with that? Why are you prejudiced against the AI?" And the source of my prejudice is that right now, these are not conscious beings. It's as if you're having a dream. There's nothing meaningful, nothing real, about it. Now, there's nothing wrong with a pleasant dream, and there's nothing wrong with doing something for fun, but if it became a central part of your life, I'd say you're living your life wrong. And I know that's very judgmental, but I think it's judgmental in the right way.
David Edmonds: Your first objection included the idea that it might be too sycophantic. Well, there's an easy fix to that, a technical fix: you just get the AI to scold you every now and again, to mirror human interactions in a more genuine way.
Paul Bloom: It is, from a technical point of view, I think, a fairly easy fix. But there's a bit of a paradox here. I work a lot with ChatGPT, and I noticed that when it was giving me comments on my work, it was all "genius" and "wonderful." This isn't good for me. I'm not getting good comments. So you could change the instructions, the internal instructions (there's a word for that), where you go in and say: before you respond, read this list of commands. And my commands were, "I don't need to be praised." "Don't praise me." "Be honest." "It's important for me to learn where I go wrong." And then it would say things like, "Boy, this work is not very interesting. It's dull and derivative." And I kind of changed it back. I think in the long term, we benefit from AIs that push back, just the way we benefit from friends and husbands and wives who push back. But in the short term, I think we want a rush of validation. In some ways it's a similar appeal to things like drugs and pornography: we might recognize they're bad for us in the long run, but in the short run, the buzz they give us may prove irresistible.
David Edmonds: Paul, let me just finish with one final personal question, which is: how do you assess your own levels of empathy?
Paul Bloom: It may be surprising for people who have read my book, but in some way, the book was an exercise in self-help. I am, as people who know me will tell you, overly empathic. I tend to get really upset at the pain of others, and I tend to have a very local morality where, honestly, I find it hard to motivate myself to help people who are starving across the world, but I'll do a lot for a friend or a student. And I actually think, intellectually, I should be more fair, but I am emotional and empathic in that way, and that keeps me from doing it. So, in fact, the book is almost written for myself, saying, "Do better."
David Edmonds: Paul Bloom, thank you very much indeed.
Paul Bloom: Thank you for having me.
