Carol Tavris – Mistakes Were Made

August 3, 2007

Carol Tavris is a social psychologist, lecturer, and writer whose books include Anger and The Mismeasure of Woman. She has written on psychological topics for the Los Angeles Times, the New York Times, Scientific American, Skeptical Inquirer, and many other publications. A Fellow of the American Psychological Association and the Association for Psychological Science, and a member of the editorial board of Psychological Science in the Public Interest, she is also a fellow of the Committee for Skeptical Inquiry. Her new book is Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, co-authored with Elliot Aronson, one of the most distinguished social psychologists in the world.

In this wide-ranging discussion with D.J. Grothe, Carol Tavris explains “cognitive dissonance,” and how it can lead to self-deception and self-justification. She talks about the ways that reducing dissonance leads to real-world negative effects in the areas of politics, law, criminal justice, and in interpersonal relationships. She also explores what dissonance theory says about confronting those who hold discredited beliefs, what dissonance theory may say about religious and paranormal belief, and the role of the scientific temper in avoiding the pitfalls of cognitive dissonance.

This is Point of Inquiry for Friday, August 3rd, 2007.

Welcome to Point of Inquiry. I’m D.J. Grothe. Point of Inquiry is the radio show, the podcast of the Center for Inquiry, a think tank advancing science, reason, and secular values in public affairs. Before we get to this week’s guest, here’s a word from this week’s sponsor, Skeptical Inquirer magazine.

Where can you turn to find others like yourself who appreciate critical thinking? Turn to Skeptical Inquirer, the magazine that separates fact from media myth. It’s published by the Committee for Skeptical Inquiry. Find out what genuine science has to say about the extraordinary and the unexplained. You’ll be surprised. Subscribe to Skeptical Inquirer today. One year, six challenging issues, for $19.99. To subscribe or to request a sample issue, just call 1-800-634-1610, or visit the Point of Inquiry website, pointofinquiry.org.

My guest this week is Carol Tavris, a social psychologist, a lecturer, a writer whose books include Anger and The Mismeasure of Woman. She’s written on psychological topics for the L.A. Times, the New York Times, Scientific American, Skeptical Inquirer magazine, and many other publications. She is a fellow of the American Psychological Association and the Association for Psychological Science, and a member of the editorial board of Psychological Science in the Public Interest. She’s also a fellow of the Center for Inquiry’s Committee for Skeptical Inquiry. Her new book is Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. It was coauthored with Elliot Aronson, who is one of the most distinguished social psychologists in the world. Welcome to Point of Inquiry, Carol Tavris.

I’m very happy to be here today.

Professor Tavris, as a magician, one of my big interests is the topic of deception and self-deception. I’m talking about things like charlatans out there who’ve come to believe that they actually are legitimate — you know, psychics who begin to believe their own spiel, or faith healers who talk themselves into believing that they’re really healing people. Your book, in a more general way, touches on some of these psychological dynamics going on in our heads that let us get away with stuff like that, that let us live with ourselves. Your book is about the topic of dissonance theory, the theory of cognitive dissonance. Let’s start off by you telling me what that is.

Cognitive dissonance is that state of discomfort we feel when one thing we believe is contradicted by evidence, or when we do something that is inconsistent with our view of ourselves, or when we hold two ideas that are mutually contradictory. Every smoker knows this problem. If you smoke and you know it could kill you, that’s being in a state of dissonance. And you have to do one of two things: you have to dismiss or ignore the evidence that smoking could kill you, or you have to stop smoking. So dissonance is a motivational state. It’s really uncomfortable. It’s the discomfort you feel when something you believe really deeply is challenged — suddenly somebody comes up and says, you know, this belief you have in how to raise your children? Let me show you this fabulous new study showing that you’re wrong. What is it that most people do? They don’t say, oh, thank you for this wonderful new study showing me that I’m wrong. They say, oh, piss off and take your stupid study with you. So what science does — the reason that science is so important and interesting, and so contrary to the way the human mind is designed — is that it forces us into dissonance. Science is the mechanism that requires us to test our beliefs against the evidence. And if the evidence doesn’t support what we believe, we have to change our beliefs. But in general, that’s not how the mind is designed. The brain, we now understand, is pretty much wired to look for information that confirms what we believe and to ignore or forget or overlook information that disconfirms what we believe. When we look at the behavior of people that we all think are hypocrites or frauds or charlatans, as you just said, we say, how can they live with themselves? How can they live with the knowledge of the scam they’re perpetrating on somebody? And sometimes scam artists know exactly what they’re doing. They don’t have any dissonance to resolve.
They think people are just easy marks, and why not take advantage of them? But for the most part, the problem is the well-meaning people who have a deeply embedded belief that they have psychic powers, that they can tell where the body is buried. Their way of processing information is to exclude all evidence indicating that they’re wrong. That’s how they can sleep at night.

So you say it’s especially uncomfortable for us if the ideas causing this cognitive dissonance are about who we are as a person. This cognitive dissonance about one’s self-image is what’s leading to the self-deception. You’re saying no one wants to admit — maybe we’re even hardwired to be unable to admit — how bad we actually are.

Well, it’s more like this. When there are two dissonant beliefs — you know, I smoke and I know it’s bad for me — that’s bad enough. But when the dissonance is between your belief that you are a good, smart, competent person and the fact that you have just done something stupid, mean, cruel, or incompetent, now you have to reduce that dissonance in one of two ways. What most people do is justify the bad, foolish, stupid, cruel, harmful behavior in order to preserve their belief that they are good, smart, and competent. That’s how self-justification works. And that’s why, over time, in the aftermath of a decision, it can lead us into disaster: with every step we take justifying an action we’ve taken, a belief we hold, a decision we’ve made, we roll along justifying it, and then it becomes harder and harder for us to say, boy, was I wrong; boy, did I really screw up here — and change direction.

Professor, before we get to bigger social questions regarding cognitive dissonance and self-deception, I want to ask how all this plays out in my normal, day-to-day world. Some of the happiest people I know are people who, sure, might be going around offending other people, even getting told off regularly. But when they get told off, they respond by saying something like, oh, that guy who just told me off — he must be having a bad day. In other words, they don’t actually accept responsibility for their actions when they upset people. They make it the other person’s fault, not their own. On the other hand, I know really miserable people — absolutely miserable human beings, I’m sorry to say — who seem excellent at accepting responsibility for all their negative actions. They’re so good at it that they actually kind of seem to go out of their way to feel bad about who they are. So here’s the question: isn’t it better to be the person who’s happy-go-lucky, even if you’re a little ignorant about how much harm you’re actually doing in the world?

That’s a terrific question, and you’ve got two issues kind of smuggled together here. One is the question of taking responsibility for a mistake — it’s not that that makes you happy or not happy. There are people, deniers and blamers, who never take responsibility, just in the way that you describe. And it makes them happy, because they don’t have to take responsibility and they don’t have to feel bad. And then there are people who reduce dissonance not by justifying what they did, but by blaming themselves. That’s really at the heart of your question. Instead of saying, well, I had a right to insult that person — he deserved it, he started it, it’s his fault — they say, yes, it’s entirely my fault; everything is my fault; in fact, everything I do in the world is my fault. You bet that will make people unhappy. And there certainly are people who reduce this dissonance by blaming themselves too much. They go in the wrong direction, let’s say. So the goal here is to understand that when we make any decision — whether it’s buying a car or buying a house, taking a new job, deciding whether our beloved partner’s behavior was their fault or not their fault — every one of those decisions we’re going to justify in some way. We’re going to have to reduce that dissonance in some way. And there are more helpful, productive ways of reducing dissonance, and there are unhelpful and self-defeating ways of reducing dissonance. For example, in every relationship we’re always faced with the question of how to think when our beloved partner does something dorky, stupid, thoughtless, or mean. And happy couples do the thing you were just suggesting: they don’t blame the partner, they forgive the partner. They see the flaw as something the partner couldn’t help — as a rare thing, not part of the beloved’s wonderful, adorable nature. Other people resolve the dissonance in a different way, and they say, my God, I made a terrible decision to be with this person.
This person’s behavior reflects their character down to their toes, and I’ve really made the wrong decision here. So in this way, in our daily relationships — with our children, our parents — how we see our own behavior and other people’s behavior is going to have really powerful consequences for how we get along with them. And by the way, the people who do say, honey, I was wrong, I am so sorry, I made a big mistake — those people are not more likely to be miserable. They’re happy; they get a terrific response from the people around them. Don’t you like it when your friend says to you, D.J., you know that argument we had? Let me tell you, you were absolutely right, and I was totally wrong.

I’m not sure how much I should add. 

Man, we all like that. We all do. We all do.

So you’re saying admitting the truth about ourselves, even when it’s a negative thing, can actually be socially advantageous, even if it goes against the belief that most of us have that we’re decent, honest, good people. You know, people tend to like themselves and think they’re maybe even much better than they actually are.

Yes, you probably know that study, too — that 80 percent of people think that they’re better than average. We’re all from Lake Wobegon.

You know, we are all better than average. People put an enormous amount of psychological energy into trying to protect themselves from feeling that they did something wrong, or that they were mistaken, or that they harmed another person. And the irony is that very often, when people stop resisting that realization — when they’re able to say, wait a minute, I’ve thought all along that I was right about this belief (it could be something like hormone replacement therapy; it could be a medical problem or a psychological issue, a scientific one or a personal one) — when people finally say, boy, I was wrong, it’s often a tremendous relief, both to themselves and to the people around them. And I want to draw a distinction here between cognitive dissonance, which really is hardwired — the brain really is designed to distort the way we perceive information, to keep our views consonant and to keep us working for the things we believe — and how we think about our mistakes, which is not hardwired. I think our culture is particularly mistake-phobic, if you will. We’ve somehow learned that to admit a mistake is to admit that we’re stupid. And that is strictly a learned cultural notion. Plenty of individuals, and plenty of communities and organizations, don’t think this way. Where people learn that they can admit a mistake and it doesn’t mean they’re stupid, you create a world in which it’s more possible for people to say, you know, we really got off on the wrong track here; I finally have to face the evidence that I was wrong; now what can we learn from this, and how can we improve things? Certainly in medicine, in psychotherapy, in our companies, in the law, we really hope that our leaders and our professionals will be able to face the evidence when a new and better method comes along.

I’d like to let our listeners know that you can get a copy of Mistakes Were Made through our website, pointofinquiry.org. If you listen to the show regularly, you know that I’m always raving about various books, but this is one of the half dozen or so since we started Point of Inquiry that I’ve given to all my friends and family. So if you’re at all interested in what makes people tick and how to avoid some very human pitfalls, you need this book. Professor Tavris, to get back to our discussion: I found it very interesting that you talk about how many people in prison are proven innocent by DNA science, yet despite that, many prosecuting attorneys refuse to admit their innocence. You say that one reason there are so many wrongful convictions is that not only the arresting officers but the attorneys, even the judges themselves, experience cognitive dissonance, and they just can’t admit that they were wrong.

Yes, that’s exactly right. You know, we all tend to divide the world into good people and bad people. And what we try to show in this book is that some of our greatest problems come not from bad people, but from good people who do bad things in order to preserve their opinion of themselves as good people. And you can see this very clearly in the legal system. Once the cops get a suspect and think that they’ve got the guilty person, they blind themselves to disconfirming evidence, to dissonant information.

Right. You say that investigators are actually trained to believe that anyone they interrogate is guilty, which then justifies the amount of lying or manipulation they might use to obtain a confession.

Exactly right. And we see this, of course, with the present administration’s justification of interrogation techniques — torture — because, after all, if we have detained these people, they must be guilty. Of what, we don’t know; but if you’re here in prison, you must be guilty, and that alone justifies what we do to you to get you to admit that you did something wrong. Now, of course, most of the time the people you’ve arrested are guilty; statistically, they probably are. But that’s not how our system is supposed to work. You are supposed to prove that this person is guilty beyond a reasonable doubt. And the danger of the confirmation bias is that once the system starts rolling along, information suggesting that the person might be innocent is systematically excluded. And it’s because the cops think they’re doing a terrific job. They pride themselves on their ability to know if somebody’s guilty by how they sit, by how they squirm in the chair. They pride themselves on how their experience has enabled them to really know who’s guilty. Same with prosecutors: once we go forward to prosecute this case, we’ll do everything in our power to get this person put away. So now you’ve put this person in prison for ten years, and DNA exonerates the guy. What do we do? Do we say, my God, I was wrong, I was incompetent in this case? No — we say, well, OK, he might have been innocent of this, but he surely is guilty of many other terrible things, so I was justified in putting him in prison. They have all kinds of justifications: OK, so he didn’t rape this woman, but he was there with another guy who raped her, and he probably did so too, only he didn’t ejaculate. In this way they protect themselves from the awareness that they — these good, competent prosecutors — put an innocent person in prison, because the justice system doesn’t make mistakes.
It’s interesting that in Canada and in Britain, both judicial systems are now becoming more and more aware of the prevalence of wrongful convictions and have taken steps to reduce the likelihood of those convictions, and to overturn them later if new evidence comes along. Our system still seems terrified of realizing how many wrongful convictions there are.

And you think maybe it’s part of the culture that our culture just doesn’t reward admitting mistakes? 

That’s right. That’s our culture in general, and the criminal justice system in particular. As we say in our book, as one prosecutor points out, it’s as if our criminal justice system doesn’t realize that any system run by human beings is going to make mistakes. And so if we really care about justice, we must have as many procedures in place to free the innocent as we do to convict the guilty — and we don’t.

Let’s switch gears a bit and talk about science and politics, kind of in the context of the self-deception you’re talking about. You don’t really touch on it in the book, but I’m interested in your take on the following: the politics surrounding climate science that says global warming’s real. Well, there are a lot of people who challenge this, maybe for their own political or financial reasons. Yet I don’t imagine that there were a bunch of people in some smoky back room saying, OK, we’re planning to forever doom the world, the future of our planet, but at least we’re going to make a buck — so how do we spread this misinformation? My point is that they actually seem to have started believing their anti-global-warming propaganda. So here’s the question, Professor Tavris. How do you break through your own self-deceptions and rationalizations when you have so many other people agreeing with you that you’re right, people justifying your position? The same might apply to a question about the president and the Iraq war, for instance.

Exactly right. The most crucial antidote to the confirmation bias, to our habit of surrounding ourselves with like-minded people, is to make sure that we don’t — to make sure that we are surrounded by enough good friends and colleagues and coworkers who will disagree with us. The prototype for this is in Doris Kearns Goodwin’s brilliant book about Abraham Lincoln; Team of Rivals is exactly the title of her book. Lincoln appointed to his cabinet many who disagreed with him about how best to pursue the question of ending slavery in America — people who disagreed with him, people who had called him all kinds of names before the election — because he wanted to expose himself to other ideas about how to proceed. And in fact John Kennedy, after the disastrous Bay of Pigs fiasco, as it’s always called, learned from that that he had created a situation of groupthink, in which dissenters felt stifled and did not feel free to say their actual opinion. Groupthink — we all know it. We would ideally like to surround ourselves with people who agree with us. But, of course, the danger comes when we’re in a position to make decisions affecting the lives of thousands or millions of people. In George Bush’s case, it’s an administration that has carried groupthink to staggering new heights, where loyalty is rewarded, but not creativity or dissent. Great leaders understand the importance of permitting dissenters to speak, of gathering all information about a problem before you make a decision — being as open as you can to all possibilities, and gathering information about all of them, before making a decision. In the same way, before you buy a car, if you’re choosing between two cars, you will find out everything you can about both of those cars. After you’ve chosen one, you will stop reading information about the car that got away.
That is why it is really crucial — whether it’s in science or politics or business or in our private lives — to make sure we have people around us who will tell us that we’re about to really screw up. It’s not easy, but the more people appreciate how important it is to get those dissenting opinions, because they improve the quality of our own decisions, the better off we are. Now, in science, the whole point is to force the scientist to hold up his or her beliefs for empirical examination — to open them up and see if they fly.

The great psychologist Robert Abelson once had a graduate student who was struggling because he would not let his hypothesis go. He kept working on it and working on it and doing studies, and all of them were failing, and he wouldn’t let it go. And finally Abelson said to him, are you going to finally admit what’s wrong with your hypothesis, or are you going to go into print and let everyone else tell you?

That’s the goal in science: to be able to test our beliefs and let them go if they aren’t supported. And you don’t have to be a scientist to think scientifically. You can be anybody who understands how exciting and exhilarating it can be to let go of a bad idea for a better idea.

So self-deception — you’re saying that it’s not the case that it can only be corrected by the self, by the person deceiving himself; it can be corrected by the people around him or her. Is there any way of piercing through it from the outside, if you’re not in that circle? We just mentioned President Bush and the war in Iraq. Regarding that war: the Iraqis didn’t greet us in the streets with flowers, there were no WMD, oil revenues didn’t pay for the war as it was suggested they would, and the mission wasn’t accomplished, as Bush said it was when he was on that carrier. So even in the face of overwhelming evidence to the contrary, people firmly stick to their deeply held beliefs. Here’s the question. How do you get through to possibly self-deceived leaders if you’re not part of their inner circle — the circle that might even be able to puncture certain beliefs with dissent?

All right, you’ve just asked the crucial question: how do we puncture the protective cocoon of self-justification? And the answer is, most of the time we will not be able to, especially when the self-justifying person has so much invested in a course of action — especially when that course of action has caused enormous harm, difficulty, and trouble to him or herself or to everybody around the person. The more that person has invested in a course of action, the harder it’s going to be to break that cocoon of self-justification, precisely because the realization is so devastating: I just did something in my medical practice that caused the death of my patient. I, as president of the United States, made one of the most disastrous military decisions in our nation’s history, causing the deaths of thousands and thousands of people and costing upwards of a trillion dollars. What’s predictable about George Bush — and we predicted this last year. Actually, one political scientist said, if Bush were a rational politician, he would realize he has lost the country’s support for the war, and therefore he would change direction, because otherwise he’d lose both houses of Congress — and what rational politician would risk doing that? And the answer is, of course: not a rational one, but a rationalizing one. And so students of self-justification like Elliot Aronson and myself predicted that Bush would do just what he did — more of the same. That’s self-justification in action. What do the rest of us do about it? Well, the rest of us do just what the country is beginning to do, which is to speak out against the war and hope that our opposing politicians — and, more important, Republican politicians — will start standing up on their hind legs and forcing this president and this administration to follow the law, to follow the dictates of Congress in the conduct of the war, thinking of alternatives and of ways of getting out. Certainly this conversation has begun.
When you have a leader who is as self-justifying as this one, the dissent has to come primarily from his own party, if he’s going to be influenced at all — or perhaps from an overwhelming majority of the opposing party, if he could be influenced at all.

That’s right. He is not going to change. He is too far in, and he is not enough of a statesman or a leader to say, I was wrong. We’re not going to hear that from him. And it’s interesting — you know, politicians left, right, and center have been writing “I was wrong, I made a mistake” speeches for Bush. He’s not listening.

We don’t often get into politics on the show — at least not electoral politics, or talking about it in the way that we just were. But I appreciate you touching on it, given your background as a social scientist and this conversation about self-deception. What an object lesson our current administration is.

But keep in mind, it’s not just about Republicans or Democrats; it is about self-justification. There have been plenty of self-justifying Democratic presidents. We happen to be talking about this one, a Republican, who in my view — and in many people’s view — has made disastrous decisions for this country. But the mechanism of self-justification is not limited to Republicans.

You’re saying, in fact, that it’s universal — it’s not even just an American thing, it’s kind of a species thing: male, female, everywhere, all of us engage in it. In that context, let’s talk about religious and paranormal beliefs, since we touch on those subjects on the show quite a bit — their connection with cognitive dissonance, or dissonance theory, whatever it’s called these days. It seems like atheism is all the rage, with these big-time bestselling books out there by scientists and public intellectuals writing against religious beliefs. Do you think that the fact that religious people — I mean, there are groups of the religious who don’t demand proof for their beliefs — do you think this is a way of alleviating their cognitive dissonance? The same could be asked about those who believe strongly in unsupportable beliefs in the paranormal, like UFOs or faith healing, Bigfoot, ghosts, et cetera. They’re kind of just believing because they believe.

Yes. Well, I think the more important a particular belief is to us, the more strongly we will ignore or reject evidence suggesting that we’re wrong. OK, so what are the most essential beliefs that people hold? Their religious beliefs, their political beliefs. Certainly many scientists have held beliefs deeply that have taken a few hundred years to overturn — it’s not that scientists always think scientifically either. But religion is the big one, of course, because religion is central to many people’s feeling of what gives them meaning and purpose in life. When you have a belief that is that central to your meaning, you are going to defend it at all costs — people of all religions do. What’s interesting to social scientists is how people reduce the dissonance: my religion says this, but now how do I deal with that? So evolution is a good example. Most religious people believe in evolution and feel no discrepancy, no dissonance, between Darwin and their religious views — and, of course, fundamentalist Christians do. So you have to start with what the religious belief is, and then what the disconfirming kind of information is that comes along with it. For example, consider the massive dissonance that would be evoked by: we believe in a God that is looking after the chosen people — and then, the Holocaust. How do we explain the Holocaust? How could God have permitted such a devastating act of genocide? What good, loving God who cares about his people would have permitted the Holocaust to occur? That’s massive dissonance.

Yeah. Given the all loving nature of the creator, how could he have let that happen to the chosen people? How can you justify that? 

Exactly. So what’s interesting is, students of self-justification, of this theory, would ask: how would you predict that people would resolve that? Would they become less religious, or more religious?

Well, you don’t hear stories about rampant atheism in the death camps. This is kind of a heady subject to talk about, but you hear about some people’s faith being strengthened.

Exactly right. Exactly right. It’s counterintuitive — not the obvious reaction — and that’s the power of dissonance. So strong is the need to believe in God that when a terrible thing happens, what most people will do is not lose their faith in God but reduce the dissonance by saying something like, as I heard in one temple, God is responsible for the good in the world; human beings are responsible for the evil. Or: God is testing his chosen people’s faith — which is, of course, the Christian view as well, the Christian response to the enormous suffering that many people experience in their lives. How could a good, loving Jesus Christ permit this to happen in my life? Well, he’s testing my faith. So there are many ways of coming to accept the sad, tragic, horrifying experiences that people have and to maintain their belief in God. In fact, many people would say, well, having a religious belief is so important to keeping people feeling safe and protected, and to giving them meaning in life, that anything that happens that doesn’t seem to be consonant with their belief in God is reinterpreted to make it consonant. You hear this after a terrible plane crash, a terrible disaster, where the survivor says, God was looking after me.

Right — God wasn’t looking after all the people who didn’t survive, but God was on the side of the person who did. Both you and Elliot Aronson are scientists, social scientists. And you mentioned a few questions ago how it’s part of the spirit, the attitude, of science to change your own beliefs when they’re discredited, to surround yourself with those who kind of keep you honest despite your own commitments to this or that view. But most people out there — maybe I have a low view of people in general; I don’t think I do — but most people don’t share that scientific spirit. So do you think that pointing out to people just how wrong they are to believe weird things, whether it’s an extreme religious belief or trying to discredit their belief in this or that paranormal claim — do you think pointing out to them that they’re wrong will help them overcome their self-deceptions? Or will it just make them more entrenched? I’m thinking again of these bestselling skeptical books, like Richard Dawkins’s. I guess I’m asking, do really forceful, compelling arguments actually get through to people who believe nonsense?

Well, D.J., as someone who has read our book and understands cognitive dissonance, you would know exactly what the prediction is.

Right. I guess that’s what’s called a leading question. 

Very good. That is exactly right. And actually, this is one of the things I think is so important for scientists to understand about how cognitive dissonance works, because when they go around saying, oh, look how foolish you are to believe such and such a thing, what they’re doing is putting the other person into a state of dissonance: I am a smart, capable, wise, kind person, and you’re telling me that I believe something that’s stupid and wrong? Well, the hell with you. So understanding dissonance is actually important in understanding how to persuade other people, because you can’t do it by making them feel stupid for holding such and such a belief. It works for us, too, right? When someone says, D.J., let me tell you about something. Here is something you’ve always believed in, and I’m going to tell you how wrong you are. You’re not typically going to be open-minded about it. Unless it’s something you don’t believe strongly or care very much about, in which case you would welcome new information about it. 

In fact, that’s maybe a bias some skeptics have against some paranormal claims. They don’t even look at the evidence. They just dismiss it out of hand. So it works on both sides of the fence. 

It does. I think it’s really important for skeptics and scientists to avoid that tone: we know what’s right and you don’t; we’re smart because we’re scientists and you’re not; oh, we have the bat and ball and you don’t. OK. Not only is that tone off-putting to somebody that you’d like to persuade, it won’t be persuasive. It’s a tone that makes the other person defensive and even more likely to protect their particular views. It’s important in an argument, whether the argument is about creationism or about medical information, the importance of good medical research as opposed to alternative medicine, for example, to understand what the purpose of the belief is to somebody who holds it. Why do they want to hold it so tenaciously? What does it mean for them? You have to understand that before you can just go along and tell them that the evidence doesn’t support their view. I think that this is really important for scientists to understand. For example, I was talking recently to an attorney who’s been very involved in civil liberties cases and religious issues, and he says, you know, the issue is how the issue is framed between science and religion. If you ask the general population a question that forces them to choose between science and religion, they’ll pick religion, because they think that’s the right thing to do, because they care about religion and because many people are religious. But if you make the evolution argument not one in which you have to choose between believing in evolution and believing in God, but say instead: this is an issue that divides religious people. Some religious people believe in evolution and some reject it. What are the reasons for accepting or rejecting it within a religious framework? See, then you aren’t making religious people feel they have to reject science. 

I’m with you on all that. But what if it’s true that it is science versus religion? Should we pretend that it’s not just because that has a better PR spin? 

No, no, no. The job of scientists, in my ideal society, is to educate the general public about what the scientific method is: its great beauty, its revelations about human behavior, about medicine, about the body, about the cosmos, about the world, revelations that are awesome in the real meaning of awesome, and informative and helpful, and that make things better for us. Science is not just its findings, its particular findings that apply right now. Science is a way of thinking. It’s a method that keeps us moving toward improvement. Do we want our doctors to wash their hands before they do an examination? You bet we do. Do we want psychotherapists to give up disastrous methods of therapy that are harmful and have been known to tear up families? We badly want them to give them up. Do we want our best interrogators to know exactly how it can be that an innocent person will confess to a crime they didn’t commit because of some method you’re using that you think is scientific but isn’t? We want them to have the best methods. So what scientists can convey, I think, is the joy of the process of discovery. For scientists, dissonant information is exhilarating. It’s something you learn from. It’s not threatening to your ego. Ideally, of course; sometimes it is. But ideally, it’s not. And if scientists can convey that: here’s what we do. We aren’t just debunkers, just people who tear down things you believe. We are creators. We provide new ways of seeing these problems and new solutions. Scientists do best to stay on their territory, what science is and what it can do, and not try to take religion away from people who aren’t going to let go of it. 

I want to finish up, Carol, by talking about some of the specific scientific proposals you make for overcoming self-deception, not just in terms of people’s deeply held beliefs like religion or the paranormal. You don’t touch on that a whole lot; that’s not the focus of your book. I mean the general pitfalls of cognitive dissonance. You say that one of the reasons people get into the position of justifying their harmful behavior, of deceiving themselves about their beliefs, is because they don’t really know that much about how to resolve conflict. So when they get in a fight with someone, it kind of backs them into a corner and makes them more dogmatic about what they believe, about their behavior or whatever, than they normally would be. In other words, they make their opponents all bad and themselves all good. 

Well, I would say that one enormous benefit of understanding how cognitive dissonance works is that as soon as you get it, really get it, you actually start seeing it at work in your own mind and in other people’s. We have heard this over and over from people reading the book, saying things like, it was like looking into a mirror, and now I suddenly see something I’m doing that I never realized I had been doing. One man, for example, told me, I was thinking about memories I had of my father, and I realized, just as you said, I was justifying my view of events, of conflicts with my father, and not even thinking about it from his point of view. Dissonance is unlike optical illusions in this way: we know the eye will create many illusions, but we can’t do anything about an optical illusion, whereas we can do something about our mental illusions once we understand how they operate and see them at work. We can stop ourselves in our tracks and say, is this decision really the wisest one? Is my interpretation of this experience the right one? Is there another way of seeing this? Can I get out of my little self-justifying spiral here? Is there something I should be looking into that I’m not? And once people get into the habit of seeing things that way, it’s really something you can apply in your own life almost immediately. 

We work on campuses across the country; we have this network of rationalist campus groups. And it amazes me that on every college campus there are all these, you know, campus groups: the Baptist campus group, where all the Baptists hang out with just themselves; the Muslim students with just the Muslim students; the science club; these atheist clubs where the atheists only hang out with the atheists. You say in the book, as one of your proposals, that we should consciously widen our circle of acquaintances and friends to include people very unlike ourselves, to reduce that us-versus-them mentality. You touched on this earlier in terms of surrounding yourself with advisers who disagree with you. I found that advice spot on. 

Well, those are two different things: people unlike you and people who disagree with you. Here’s the thing. People like to hang out with other people who are like them. If you’re a chess player or a cello player or something, you’re going to want to be with people who have shared interests and attitudes like yours. If you care about women’s rights, if you have a political agenda of any kind, you’re going to want to be with people who share your agenda. Nothing wrong with that. But if you narrow your world to people who only see things the way you do politically or religiously, then you won’t learn anything, you won’t expand your horizons. You won’t learn anything about the great diversity of opinion among Muslims or among Jews or among Christians or whatever the group might be, or among atheists. 

Last question. If someone has the guts to do what you’re saying, to examine all of his or her self-deceptions, what’s next? I kind of got at this earlier, but do you really think that the truth at all costs, no matter how painful, if we admit it about ourselves, will set us free? 

That’s a beautiful question: will the truth set us free? I’m in favor of truth, and I think very often it does set us free. It can be painful to acknowledge. Gee, maybe I was partly responsible for that rift in our family. Gee, maybe I did something with my patient or my client that made their problems worse rather than better. Now, there’s two aspects of this that are crucial. One, for our own sake, is facing the responsibility for the things we did that were harmful or wrong. But second, there is no purpose in doing that, we might as well go on justifying it, if we don’t then learn something from it so that we don’t keep repeating it. That is the whole purpose of trying to shatter that shell of self-justification: so that we don’t keep doing it. It’s a cliche that we should learn from our mistakes, but we can’t learn from them if we don’t admit that we made any. When we can see the complexity of a memory that we hold, for instance, that so-and-so did this and it was all their fault, well, maybe it wasn’t. What happens then is a richness of understanding, a way of letting go of grievances and blame, and of healing many of the wounds that we’ve been inflicting on ourselves. You acknowledge the mistake, you bring some compassion to yourself, you understand that the thing you did was wrong, but then you don’t repeat it. So all of those, I think, are elements of the process. 

Thank you very much for joining me on Point of Inquiry, Carol Tavris. 

It’s been a pleasure, D.J. Thank you so much. 

You’ve seen the headlines: Bill seeks to protect students from liberal bias. The right time for an Islamic reformation. Kansas school board redefines science. These stories sum up the immense challenge facing those of us who defend rational thinking, science and secular values. One adviser to the Bush administration dismissed what he called the reality-based community. Who could have imagined that reality would need defenders? The educational and advocacy work of the Center for Inquiry is more essential than ever, and your support is more essential than ever. Show your commitment to science, reason and secular values by becoming a friend of the center today. Whether you are interested in the work of the Committee for Skeptical Inquiry and Skeptical Inquirer magazine, the Council for Secular Humanism and Free Inquiry magazine, the Commission for Scientific Medicine, or Center for Inquiry On Campus, by becoming a friend of the center you’ll help strengthen our impact. If you’re just learning about CFI, take a look at our Web site, w w w dot center for inquiry dot net. We host regional and international conferences, college courses and nationwide campus outreach. You’ll also find out about our new representation at the United Nations and important national media appearances. We cannot pursue these projects without your help. Please become a friend of the center today by calling one 800 eight one eight seven zero seven one or visiting w w w dot center for inquiry dot net. We look forward to working with you to enlarge the reality-based community. 

Thank you for listening to this episode of Point of Inquiry. To get involved with an online conversation and to debate some of these topics, or really any other topic under the sun, go to our online discussion forums at CFI dash forums dot org. Views expressed on Point of Inquiry aren’t necessarily CFI’s views, nor the views of its affiliated organizations. Questions and comments on today’s show can be sent to feedback at point of inquiry dot org or by visiting our Web site, point of inquiry dot org. 

Point of Inquiry is produced by Thomas Donnelly and recorded at the Center for Inquiry in Amherst, New York. Executive producer is Paul Kurtz. Point of Inquiry’s music is composed for us by Emmy Award winning Michael. 

Contributors to today’s show included Debbie Goddard and Sarah Jordan. And I’m your host DJ Grothe.