This is Point of Inquiry for Friday, July 17th, 2009.
Welcome to Point of Inquiry. I’m DJ Grothe. Point of Inquiry is the radio show and the podcast of the Center for Inquiry, a think tank advancing reason, science and secular values in public affairs and at the grassroots level. My guest this week is Zachary Shore. He’s associate professor of National Security Affairs at the Naval Postgraduate School and a senior fellow at UC Berkeley’s Institute of European Studies. He’s served on the policy planning staff at the U.S. Department of State through a fellowship from the Council on Foreign Relations. He’s the author of Breeding Bin Ladens: America, Islam and the Future of Europe, and also the book What Hitler Knew: The Battle for Information in Nazi Foreign Policy. He joins me on the show to talk about his new book, Blunder: Why Smart People Make Bad Decisions. Welcome to Point of Inquiry, Professor Zachary Shore.
Thanks for having me.

Professor, your book is, in the best sense of this term, a self-help book. And I know that can sometimes be an insult; I don’t mean it that way. It’s a self-help book helping people overcome these pitfalls of thinking that you identify. Have you gotten a sense that this book has actually helped people, or is it, you know, kind of an intellectual exercise for you?
Actually, it’s funny you say that, because it looks like a self-help book, but I wrote it as a history book. I’m a historian by training, and it looks at decision making at all levels: on the individual level, at the business and organizational level, and then on the international level, dealing with our foreign policy and military affairs. You might think of it as self-help for nations in need. But it really does deal also with our everyday decisions, from international relations to romantic relations. It covers a lot of territory. And yes, I have found a lot of people writing to me and telling me that this actually has helped them. I’ve received a surprising amount of email. One fellow wrote me about his relationship, that it had ended after eight years, and asked my advice on what he should do. You know, I’m not a therapist. It’s not something that I can really deal with.
But what I try to do in the book is step back from what most psychologists and neuroscientists and sociologists and other social scientists would say and ask the question: what can we learn about decision making from a historical perspective, by looking at people in everyday contexts, with all the real-world pressures they have to deal with, rather than the experiments someone would do in a lab? This is a very different way of thinking about decision making.
Right. I’m glad you drew that distinction. Of course, it’s not a pop psychology book, and most people think of self-help as being in that category. The book, though, certainly is about critical thinking and how to avoid what you call cognition traps. And as you mentioned, you’re not coming at it from cognitive neuroscience or psychology. But as an historian, do you think your perspective on history reveals more about these blunders than, you know, say, a logician or critical thinking advocate or a psychologist or a relationship expert? In other words, you’re kind of, in a sense, out of your field. You’re not just talking about history, but you’re applying lessons of history to the workaday world.
D.J., I’m really glad you said that, because actually it’s precisely the field of history that tries to understand why people behave the way they do. The social sciences do the same thing in many respects. The fields you just mentioned, psychology and neuroscience and so on, are also concerned with how people think. They just have different methods and approaches for getting there. And what history and historians try to do is not just figure out what happened when, but why people acted as they did. It’s really like being in a murder mystery or any kind of detective story, but the villains and the victims are all gone. And they lived in different circumstances from you, at different times, and you have to get into the minds and hearts of these people you’ve never met. And so the way we do it is by looking at all kinds of archival evidence and whatever records we can get our hands on, looking for corroborating information and piecing together what they were thinking at the time. And so, yes, it’s a different approach, and it does reveal a great deal more. Let me say this: all fields have something to contribute to this. And historians have for too long been too quiet about the contributions they can make to an understanding of how we think.

Mm hmm.
I consider myself, Professor, a smart person, though I also joke that I pride myself on my humility. Right. But even if I’m a smart guy, in my 36 years I’ve made some impressively bad decisions, and sometimes I tell myself, well, I made those bad decisions, you know why? Well, it’s precisely because I’m so smart. It’s, you know, intelligence as a disability. This is the joke the Mensa types often use to backhandedly compliment each other. You know, it’s like the guy in The Princess Bride who kept changing his mind about drinking the drink with the poison in it. You overthink things. And you think that’s a reason, actually, why some people make bad decisions? It’s not just self-deception to say, oh, smart people make bad decisions. You think sometimes bad decisions are made precisely because we’re smart?
I’d put it this way. It’s not that we’re smart or foolish when we make bad decisions; it’s that we’re too rigid, too rigid in our thinking. And that’s what these cognition traps are. What I tried to do is identify in each chapter a different rigid mindset that we tend to fall into. And then I give historical and contemporary examples of how decision makers got locked into those mindsets and how those mindsets in turn led to blunders. So we get a wide range of cases, and I draw on not just history, but also folklore and literature and film and even poetry, anything to get us through this really complicated realm of the mistakes we all make. You know, you’re not alone, D.J. It’s something we all do. And what we’re trying to do is figure out what are those wretched mindsets we fall into.
So let’s get into some of these rigid mindsets, these blunders that you show us how to avoid. First, you talk about kind of this natural characteristic we all have to not want to seem weak to our fellows. You’re calling it exposure anxiety. The psychologists may have other names for the same thing. How do we avoid exposure anxiety? I guess, first, tell me more about what it is.
Right. Okay, so let’s start with what it is. To lead in, I talk about an old Japanese legend of a man who is renowned for his table manners, and he’s invited to a village one time for a big feast. Everyone gathers around, and they’re watching so attentively every step he takes: how he holds his chopsticks, at what angle, every detail. And at one point, he accidentally puts a little too much pressure on one chopstick while lifting a piece of tofu to his mouth, and the tofu shoots out from the chopsticks and lands with a splat on the lap of a person sitting next to him. So, of course, not wanting him to be embarrassed, the others assume that this must be the right thing to do, and they all start flinging tofu at each other. I think what the Japanese are trying to suggest in that story is that we have a very strong aversion to being seen as weak or flawed, and also a tendency to cover up for other people who might be seen as weak or flawed. And what we tend to do is make the situation worse by reinforcing a mistake rather than correcting it. So rather than saying, I didn’t mean to do that with the tofu, instead we just exacerbate the problem. And you see this actually in all kinds of international and individual relationships. Take, for example, as I discuss in the book, the Israeli prime minister, Ehud Olmert, and the Israeli attacks on Lebanon in the summer of 2006. We even have speeches by him when he went before the Israeli nation and explained why they were doing this. He said, we cannot allow our enemies to think that we are weak. We must respond with a firmer hand. And this is the great tragedy of exposure anxiety. It’s that statesmen, decision makers, even bosses, and all of us in all kinds of relationships tend to react even more strongly when we’ve made a mistake, and reinforce it rather than trying to correct it.
When people would usually respect us more if we got up and said, hey, we’ve made a mistake. Let’s redirect our course.
But isn’t that a kind of a central idea in international diplomacy?
Kind of a core principle also in, you know, Sun Tzu’s Art of War, or, you know, political science, political philosophy: that when you’re negotiating, you don’t want to negotiate from the weaker side.
You want to appear, or actually be, strong. And, you know, I mean, if you read a history of Kissinger’s involvement geopolitically, there’s a whole kind of school of thought: be strong, don’t be weak. You don’t have a beef with that?
No, the argument is not that you should try to be weak, and it’s not that you shouldn’t negotiate from a position of strength. It’s that when you have made a mistake, it’s usually wiser to correct it, and you’ll be in a stronger position. You’ll be dealing from strength if you’re actually correcting mistakes rather than exacerbating and perpetuating them.
So the way that we avoid exposure anxiety is just what, to admit when we’re flawed?
Admit our mistakes, right? Is that it?
Recognize it. Have people around you who are not just yes-men, but who will tell you, look, we’re on the wrong course; we need to redirect and change track. And soberly assess the situation and say, okay, you know what? We have made a mistake. We’ve got to correct it. It’s remarkably hard for people to do. It sounds so simple, doesn’t it? Right. But it’s so rarely done.
Is there an episode in history or in statesmanship, a success story, where someone avoided the blunder of exposure anxiety and kind of counterintuitively admitted a flaw or a mistake or a weakness, and it paid off?
There are, and I’m really glad you raised this, because it’s so easy to just say in hindsight, oh, these people were so foolish. I did not want to do that in this book. So instead, in every chapter, I not only point out the cognition traps and the people who fell into them, but I also contrast them with people in comparable circumstances who managed to avoid those cognition traps and the blunders that followed. And I try to figure out: what did those people have that the others didn’t?
And in this case, one of the things I cite is Kennedy’s inaugural address, which was a beautifully worded statement of how to avoid exposure anxiety: that we should never be afraid to deal with people we disagree with, never be afraid to talk with them. This is not a sign of weakness, but a sign of strength. And of course, the irony is that Kennedy’s administration had its blunders, of course, in the Bay of Pigs and other incidents as well. So there’s no guarantee that you will always avoid them. But the one thing that does help is a greater cognizance of cognition traps. If you can be aware of them, you’re more likely to avoid them.
Another one of these blunders to avoid, one of these cognition traps, is what you name causefusion, which is confusing causes, right? You know, confusing one cause for another. Is this just the old logical fallacy, the old post hoc ergo propter hoc thing?
Now, I’ll tell you why this is such a complicated chapter. People have told me that this was one of the most satisfying chapters in the book, but a bit of a challenge to get through, because it’s not what we usually think. I think we tend to get the idea that correlation does not equal causation. I think we get that. But what I find through these historical case studies is that something very different is happening more often. And I’ll try to walk you through that. One is that we see the causal flow, but we miss important links in the causal chain.
So, for example, we see A and then C, and we assume one led to the other, but we miss important links in between, and we don’t realize that that’s what is going on. For example, in our treatment of cancer, scientists looked at what are the things in carrots and other vegetables that help fight cancer. And they found things like beta carotene. And so they said, all right, let’s extract beta carotene from these vegetables and put it in pill form. And you know what happened? Taking those pills actually increased the rate of cancer in some cases, because what they had done was identify a certain chain of causation without seeing the whole mesh of links that beta carotene had to function within in the whole eating process: the vegetables going into the body, being digested, with other things going on. You find this happening in international relations and all kinds of relationships, where people miss key links in the chain.
Right. And like, you went into the end of the Cold War to kind of illustrate causefusion. Lots of folks try to argue that Ronald Reagan almost single-handedly ended the Cold War with the whole, you know, tear down this wall, Mr. Gorbachev approach. And yet you think that this itself is an example of causefusion.

We love reductionist answers by nature.
We just love them because it’s so easy. We like to get a nice bite-size solution, some answer we can hold onto and say, ah, that’s the cause of this complex event. Usually complex events have very complex causes. There were so many other factors, especially in the end of the Cold War, and in any complex event, you have to see what the full chain of events was. In this case, there were the actions of Gorbachev, the Eastern Europeans and the pope and many others, which were actually far more important than the actions of Ronald Reagan. Not that Reagan wasn’t an important factor. But to reduce a complex event to a single causal explanation is a mistake, and one that leads to too many blunders in our decision making, because if we think that a war was caused, or a war ended, because of a single causal factor, we’re very likely to get things wrong the next time we try to get into a war or out of one.
You don’t go into this in depth in the book, but I’m just curious if you think this enters into the public policy discussion, or the hand-wringing, not just in public policy but kind of in world politics and the war on terror, the stuff about Islam, the religion, being the cause of terrorism?
Yeah, of course. That’s another example of reductionist thinking. Actually, I have another book, before Blunder, called Breeding Bin Ladens: America, Islam and the Future of Europe. And that one deals a lot with this attempt to reduce everything to Islam.
This is a topic we yammer about on the show occasionally, and we’ll have to have a conversation about just that one topic. We had a guest on here recently who kind of laid all of the problems of terrorism at the feet of radical Islam. And I have to confess, I’m persuaded by a lot of those kinds of arguments. So I’m interested in your possible corrective to that line of thinking.
Radicals of any kind will cause problems. Absolutely. But it’s not the religion of Islam itself that is causing problems in international relations.
Well, yeah. So we’ll unpack that in a future conversation; we’ll plan on it. Tell me about David Karp, the professor of sociology at Boston College, in terms of causation. He was also a depressive. How does he illustrate for you the problem of causefusion? I think I might have said causation before, but I mean causefusion.

Karp is a really interesting character, because not only was he a professor of sociology, but he suffered from depression for decades, and he always reduced the problem down to a simple solution, like taking medication. And these things would work for a time, but then the depression would flood back in. Finally, he realized he had to address the problem altogether differently. We’re now coming more and more to understand that the causal flow here may be just the opposite of what we think. There is a common perception that first there’s a chemical imbalance in the brain, and then you get depressed. So if we could just redress the chemical imbalance in the brain, the depression would go away.
That’s kind of the medical model of mental illness. You know, it’s just a brain imbalance, and if you jimmy with the brain chemicals sloshing around your noggin, you’ll get better.

And more and more people are discovering it can work the other way, and that treating it that way can in fact be more effective. I give the analogy: imagine if you looked down at your palm and realized it was spurting blood, that somehow it had been cut, and you went to your doctor, and the doctor said, oh, yes, your wound has been caused by a chemical imbalance in your hand. And I can prove it to you. Look, there are all these extra white blood cells and red blood cells rushing to the site of the wound. That’s the imbalance. We have to redress that. You’d look at the doctor and say, you’re crazy, right? Because you know that you must have cut your hand first. You had a wound inflicted, and then came the chemical imbalance, rather than the other way around. And in the same way, many psychologists and others who study this problem, medical doctors and medical science, are finding that other causal flows are possible. First you get the wound. Then the depression comes.
And the chemical imbalance in the brain is a consequence rather than a cause.

And by wounds, you mean things like actual physical wounds in the brain, or just things happening in someone’s life?
You know, oftentimes they’re things happening in someone’s life that are perfectly reasonable and plausible causes for one to become depressed. We have a very, very odd and I’d say twisted way of approaching depression. I’m not suggesting, by the way, that it isn’t possible to have a chemical imbalance causing depression, or that drugs can’t be very helpful; they can be. All I’m suggesting is that that may not be the typical way depression works. It may not be the most common. It may not be the only way of thinking about this. And more and more people are studying this and realizing, actually, if we think about it in a different causal relationship, we can be more effective at treating it. I think that’s the most important thing.
Mm hmm. I’d like to let our listeners know that you can get a copy of Blunder: Why Smart People Make Bad Decisions through our website, pointofinquiry.org. Professor Shore, you decry unidimensional thinking in your book when you talk about what you call flatview, kind of the flat-view approach, which is another blunder. Things are a lot more complex than we think they are. And that’s, you know, so obvious it’s kind of a truism. Why not just throw our hands up in the air in kind of this agnosticism and say, well, we can never understand things completely because things are very complex? Or is it actually possible that we can go about figuring out complex things like markets or economies or whole nations? You’re dealing with really big effects of these blunders in your book. You’re not just talking about the guy who’s having a hard time with his girlfriend. You’re talking about nation states, you know, these big movements in history.
Right. I talk about flatview, a way of seeing the world in just one dimension, where you’re either with us or against us, you’re either capitalist or communist, good or bad. It’s a very, very unfortunate way of thinking that many statesmen, and many of us, fall into. This is a book against reductionism. And we can all say, oh, okay, that’s obvious. But it’s remarkable how often we fall into this trap of flatview. Let me give an example. When we think about globalization, this is a pretty big and complex issue. But the discussions we have about it tend to frame globalization as either a net good or a net ill: either it’s happening and people are really profiting from it on the whole, or it’s making things much worse. And I think we have to look at it in multiple dimensions to understand what’s happening, because usually these things are occurring simultaneously. Globalization makes some people richer in some areas, while in those same areas it can make those same people, though they may be richer, less secure, because it’s also drawing in people and increasing the number of slum dwellers in the world, thereby making areas less safe. It’s an issue that has to be looked at from a perspective that it’s not good or bad; it’s having multiple complex effects, and we have to address them all. If we start by having a flat view of globalization, that it’s either good or bad, then we have policies that get us on the wrong track. We’ve got to be able to address both the good and the ill of it, and the fact that it changes over time and in different places, in different contexts. Right.
So as a historian, it sounds like your big push is accepting ambiguity. It’s neither black nor white, you know; you answer hard questions like philosophers do: yes and no. There’s no simple answer. And that works as a college professor, you know, when you’re talking about these confusing things in American history or world history or military history that professors deal with. But when you’re living in your everyday world, you want simple answers. You want clear paths.
Right. Thanks for letting me point that out. Of course. Yeah, there’s no need to throw up one’s hands. Here’s the thing: we have to start out by accepting the complexity and the ambiguity, and then we’ll be in a better position to come up with ways of working around them and dealing with them. And I don’t think of the solutions I propose as necessarily cure-alls, right? They’re not. But they are ways of dealing with this ambiguity and complexity that I think will give us a chance of avoiding blunders and making wiser decisions.
Professor, I want to get into some solutions to all this stuff. But I should say for our listeners, you cover all kinds of other blunders, other reasons smart people make bad decisions. Would you mind just recounting a few of those, and then we could talk about some of the solutions?
Sure. Well, in addition to exposure anxiety, the fear of being seen as weak or flawed, and causefusion, confusing the causes of complex events, the two we’ve just talked about, and flatview, seeing the world in one dimension, there’s also infomania, which I find is prevalent. Infomania is an obsessive relationship to information, and we have two types of infomaniacs: the infomisers, people who hoard information and never share it with people who could help them avoid a blunder, and then the infovoyeurs, people who actively shun information they need to avoid a blunder. Then there is mirror imaging, which is the one term that I’ve not coined; it’s something historians and others often talk about, where we think that the other side will think and act like we do. Then there is cure-allism, which is believing that you can have a one-size-fits-all solution. That’s where we take a solution that actually has been effective and worked well in one area, and we start applying it to places where it could never work. We see a lot of that in the handling of international financial crises; the World Bank and the IMF are very much guilty of this type of thinking, as many have pointed out, including Joseph Stiglitz. And then I end with static cling. That’s what I call the refusal to accept a changing world. It’s where you just can’t recognize that a fundamental shift has occurred in the environment around you, whether it’s clinging to an old belief of racial supremacy or an inability to accept that something like climate change might be occurring and to take measures to address it. It’s getting locked into an old way of thinking we’ve come to cling to.
If you had to highlight one of these blunders as the most devastating kind of blunder people make, could you? Or is it the whole package? We make them all, not just one or the other.
We do make them all. And what’s interesting is that in the penultimate chapter, called Cognition Trapped in Iraq, I show how in most blunders you’ll get two or three of these cognition traps coming together. But in Iraq, in America’s experience there, at least the first five years of occupation, we see all seven of these most common cognition traps coming together to undermine our success. So it’s probably impossible to say that one cognition trap is more destructive than the others. We’ve fallen for them all, and you have to be able to avoid them all. And I guess this actually leads in very nicely to the idea of what kind of solutions we can have to avoid blunders. The first one is to be cognizant of these cognition traps. If you just ran down the table of contents of this book and had nothing else, you’d have a cognition trap checklist. Each time you were developing a policy or a way of approaching a relationship or any kind of complex problem, you’d ask, am I guilty in this case of exposure anxiety or cure-allism or causefusion? And just the mere fact of trying to be alert to it could help you to avoid it. But there are two other things that really helped the decision makers in this book who avoided blunders. They did two things more often than others: they showed much greater empathy and far more imagination. In empathy, they got not just into the minds of the people they were dealing with, but their hearts as well, and tried to figure out what they were really feeling. Take, for example, a doctor in Jerome Groopman’s book How Doctors Think, really a good read. It shows how Dr. Groopman, the author of the book, had damaged his wrist in some way, and no doctor could figure it out.
They all just wanted to rush him into surgery, or they had quick, slapdash solutions, until finally he found one very open-minded younger doctor who not only x-rayed his damaged wrist, but also his healthy wrist, and very carefully compared them to try to figure out what the difference was. And then he took x-rays in different positions, with both hands holding cups and throwing balls and so on, different pictures. And by doing that, he was able to actually determine the problem and solve it. That doctor was the only person that Jerome Groopman saw who had real empathy for his patient, who would sit down and take the time to figure out with him what was happening inside Dr. Groopman’s body. This is so important because it works in statecraft as well. When President Eisenhower was being pressured by all of his advisers, including Richard Nixon, then vice president, to go into Vietnam in order to defend the French at Dien Bien Phu in 1954, when they were under attack, Eisenhower said, no, I’m not going to do it. And the reasons he gave were extremely revealing. He put himself in the minds and hearts of the Vietnamese people and said, you know what? They are going to look at us as more colonizers, just like the French. And it’s not going to be beneficial to us or anyone, really, if we pursue this. He kept us out of Vietnam. His successors, unfortunately, did not have the same kind of empathy.
It seems also that this empathy you’re talking about, you can’t have it unless you have imagination, unless you’re able to imagine yourself in the shoes of the other person.
Exactly right, D.J. And the imagination goes even further than that, because it’s not just about imagining what the other person thinks and feels. It’s imagining creative new solutions. What the good decision makers in this blunder book had in common is that people often presented them with a fixed set of possible options: A, B and C. They said, here are your choices, and C is maybe the least bad of the three; it looks like it’s the best one. And the good, wise decision makers said, you know what? I don’t like any of those. I’m going to come up with D. And they devised some totally new way of thinking about the problem, and a new approach to it. That took a lot of imagination: to see that what was being presented to them was not all there was, that they could imagine and envision something completely new. Those are the things I think we could work on and build on. I suggest some ways that we could do that, building our empathy and imagination, and we could make better, wiser judgments in all our lives.
So to finish up, you’ve focused on empathy and imagination as really two of the big things that are required to avoid these blunders. And throughout our conversation, you’ve stressed other things, you know: not being reductionistic, being flexible in our thinking, in other words, rejecting these rigid ways of thinking that lead to the blunders. You have to be okay with ambiguity or uncertainty. If someone applies all this, then, aside from buying your book, which I recommend, I enjoyed it a great deal, do you think it’s actually possible for people to clean up their thinking and largely avoid bad decisions in their lives? In other words, is it kind of, you know, a 12-step program for idiocy or something? You just follow the steps, the suggestions, I mean, in the book, and you’re not going to make blunders?
I know it will help. No one should come along and tell you that they have a cure-all solution that will solve all your problems and help you to never make another mistake, another major blunder, again. That’s ridiculous. No one has such a thing. All we can do is make ourselves more alert and attuned to the types of rigid mindsets we have, and that will make a difference. One fellow wrote to me and said, you know, we have a business here, a computer software company, and the other day we all got fixated on a particular problem when our server went down. Had we done a cognition trap checklist, we would have realized that we were absolutely focused on the wrong thing. We suffered from causefusion, because we couldn’t imagine that a different flow of causation could have occurred. We had it completely backwards. The thing we thought was the consequence was actually the cause. And he said, now that I’ve read your book, we’ve instituted this in our company, and we don’t assume a certain direction of the causal flow. And it’s actually saving us time and money.
Well, Professor Zachary Shore, thanks so much for joining me on Point of Inquiry.

It’s absolutely my pleasure.
Where can you turn to find others like yourself who appreciate critical thinking? Turn to Skeptical Inquirer magazine, the magazine that separates fact from media myth. It’s published by the Committee for Skeptical Inquiry. Find out what genuine science has to say about the extraordinary and the unexplained. You’ll be surprised. Subscribe to Skeptical Inquirer today. One year, six challenging issues, for $19.99. To subscribe or request a sample issue, just call 1-800-634-1610 or visit the Point of Inquiry website, pointofinquiry.org.
Thanks for listening to this episode of Point of Inquiry. To get involved in an online conversation about today’s show, join us on our discussion forums at pointofinquiry.org. Views expressed on the show aren’t necessarily CFI’s views, nor the views of its affiliated organizations. Questions and comments on today’s show can be sent to feedback@pointofinquiry.org.
Point of Inquiry is produced by Thomas Donnelly and recorded from St. Louis, Missouri. Point of Inquiry’s music is composed for us by Emmy Award winning Michael Whalen. Contributors to today’s show included Sarah Jordan and Debbie Goddard. I’m your host, DJ Grothe.