Christopher Burns - Deadly Decisions

January 16, 2009

Christopher Burns is one of the country’s leading minds on modern information management. He has been a news executive and consultant to government and the private sector for thirty years, advising clients on emerging information management technologies and the evolution of the information economy. His previous positions include vice president of the Washington Post Company, senior vice president of the Minneapolis Star and Tribune, and executive editor of United Press International.

In this interview with D.J. Grothe, Christopher Burns talks about the biology of the brain, the behavior of groups, and the structure of organizations, and how each can lead people to make bad decisions. He discusses the paradox that in the age of information, it may be more difficult to make good decisions. He describes “false knowledge” and how to choose the right information to pay attention to. He emphasizes the value of skepticism in making good decisions, and of respecting ambiguity and uncertainty. He uses the example of the sinking of the Titanic to explain the concept of “information errors.” He discusses how groups naturally discourage dissent, and how this harms the information system, citing examples from the operating room and the airline cockpit. He details ways of organizing that lead to better decision-making. And he talks about the political domain, and how to address challenges to good collective decision-making in a democracy, contrasting the Bush and Obama administrations.

This is Point of Inquiry for Friday, January 16th, 2009. 

Welcome to Point of Inquiry. I’m DJ Grothe. Point of Inquiry is the radio show and the podcast of the Center for Inquiry, a think tank advancing reason, science and secular values in public affairs and at the grassroots. My guest this week is Christopher Burns. He’s been a news executive and an independent consultant to government and the private sector for 30 years, advising clients on emerging information management technologies. He’s an expert on information management. His previous positions include vice president of the Washington Post Company, senior vice president of the Minneapolis Star Tribune and executive editor at United Press International. Welcome, Christopher Burns, to Point of Inquiry. 

Well, thank you very much for inviting me. 

Christopher, your main point in this book, Deadly Decisions, is that many of the worst decisions we’ve made as a society and as organizations have had to do with our information systems, not with a failure to use them. Right? And here’s the paradox: this is the age of information, when it’s supposed to be easier than ever before to have right at our fingertips all the information we need to make the right decisions. You’re not really seeing it that way? 

No, not at all. I started off thinking, along with everybody else, that the information age meant terrific new technologies. And then I realized that the big mistakes we’re making are not related to technology. They’re related to our ability to figure out what is true and what is not true. As more and more information becomes available to us and we’re trying to make these decisions, we can’t tell any longer what information is true and what is not true. And the problem we’re having is partly that the mind isn’t wired to handle this much information, so we make all sorts of reasoning mistakes. The second problem is that when we work together as groups, groups make mistakes. And finally, we are organized in ways that make it more difficult, rather than less difficult, to get to the right decision. 

So let’s take each one of those. First, let’s just talk about separating the wheat from the chaff. Here you are, one of the nation’s leading minds on modern information management. So how do you tell what information you should be ignoring and what information you should be taking seriously? 

Well, there are a few alerts. One of the things everybody does: we tend to hear the things we want to hear and not hear the things we don’t want to hear. We tend not to deal with ambiguity very well, so if somebody says that something may or may not be the case, we tend to decide quickly which it is, and that’s what we remember. There are a number of individual reasoning steps that we go through that lead us to false conclusions. And through practice and through discipline, we can learn to be more skeptical about the information we hear, we can learn to be more respectful of uncertainty and doubt, we can learn to listen to people who are warning us about something we don’t want to hear. All of these are characteristics of big information disasters like Three Mile Island or the explosion of the shuttle Challenger or the failure to hear the warnings about 9/11 or even the financial disaster. 

I want to talk about some of those specific deadly decisions, or the ignored warnings that you just mentioned. But talk to me about false knowledge. How does the notion of false knowledge enter into your argument? 

Groups develop ideas about how the world works and they believe those ideas. And when information is brought into the group that is contrary to those ideas, it’s ignored. To take the 9/11 example, the group at the top of the United States federal government believed that the real threat to the United States would come from nations, and they didn’t believe that any real threat would come from a group of people living in caves in Afghanistan. This very strong feeling that our national security threats were state based made it difficult, in fact impossible, for the counterterrorist people to say, listen, there are al-Qaeda people living in Miami and Denver and they’re taking flight lessons. And Wolfowitz, among others, would say, well, what danger could these people possibly present? “Who cares about a little terrorist in Afghanistan” is his direct quote. So their concept that our national security threat was state based stood in the way of their understanding an alternative: another threat to our security came from individuals who didn’t wear uniforms, who were not from a state. 

So here we had all the information we needed that the attacks were likely imminent. We knew, or at least some information gatherers knew, that there were threats, but the information was ignored because of these preconceived commitments. Right? 

Exactly the same thing happens to companies. It happens to every organized group. A company believes that its technology is terrific, that the market is as ready now as it was 10 years ago, and they feel proud of what they’ve accomplished and optimistic about their future. 

And that tends to prevent them from hearing somebody coming in from the field, a salesman, a new engineer, a person who just joined the company, saying, you know, there is better technology than what you guys are using. This concept gets in the way of our understanding. 

Some of this seems to me like Monday morning quarterbacking. In retrospect you can say, oh, I see how they made these bad decisions; they weren’t paying attention to this stuff over here. But when you’re in the throes of it, how do you know what’s the right stuff to pay attention to? 

I don’t know that you can know the right stuff. I think what you can do is be skeptical. I think you can say, I am going to listen to warnings even though I don’t like what I’m hearing, because the very fact that this person is standing there in front of me in this room telling me this means that there must be something real. I think we can learn to trust ambiguity and uncertainty and say, well, we don’t know whether that’s true or not. I think we can learn to take information much more seriously and to hold each other accountable for reporting accurately what happened. Don’t simplify it. Don’t tell a story in a way that makes you look good. It’s not a matter of spotting the truth when it comes to you. It’s a matter of disciplining your truth detection skills. 

I asked you about false knowledge. How is that connected with information errors? What I’m trying to get at is how you argue in the book that the sinking of the Titanic was really just what you’re calling an information error and nothing more. 

Well, the Titanic is an especially poignant example, because we had a lookout who couldn’t see very well. 

He hadn’t had an eye exam in five years, and, wait, he wasn’t even given a set of binoculars. So he couldn’t see that far anyway, even though it was a very clear night. 

But the second thing was, if someone had done the arithmetic, they would have realized that the ship was at that point traveling so fast, with such momentum, that even if the lookout had seen the iceberg, they wouldn’t have been able to turn in time. And this was a case where the information systems they were relying on were lagging behind the technology. The ship was bigger than anything they had experience with. It was going faster. It had more momentum. But the lookout system had not caught up. It was still a middle-aged man with bad eyes standing on top of a crow’s nest without binoculars. And a lot of this suggests the problems we’re getting into now: the financial system has become so complicated and so abstract that the tools we’ve used to regulate it can no longer keep up. Very simple mechanisms to measure risk for banks, for example, which were designed four or five years ago, were simply unable to keep up with the bundling and the re-bundling and the abstracting of all of these mortgages. So the tools we relied on to tell us when we were getting into danger were just out of date. 

Earlier you were suggesting that part of it is the human brain. You talked about these three levels: individual critical thinking problems, then people in groups, and then the way that we organize ourselves. Well, when you blame the human brain, you know, most people assume that it’s the human brain that helps us make right decisions. But you blame it for leading to these information errors. 

Yes. I think there isn’t a release 2.0 coming for the human brain. It has limitations. When you confront a person with too much information, the quality of decisions starts to decline. 

Instead of continuing to get better. There’s no way to do a software update for the mind. 

No, there’s not a new operating system coming down the pipe. So we have to learn to live with what we have. The brain is actually wired to learn, not to unlearn. Synapses close; they don’t open again. So once a person becomes wedded to a concept of the world, it’s very, very difficult, from a purely mechanical standpoint, for the person to unlearn what he knows. You have to persuade him that what he knows is no longer accurate. 

You have to take him through a process. So there are thresholds in the mind. The pupils of the eye dilate at a lovely sight. The nostrils expand at a pleasant aroma. We block out sounds. There are all sorts of things about how the brain works and how the nervous system works that tend to bring us information we want to hear and keep away information that we don’t want to hear. And that is one of the fundamental problems we have as individuals in trying to figure out what is a true description of the world. 

You say, though, that it’s not just the human brain individually, not just the structures of how we think about stuff; it’s actually exacerbated when people get into groups. Committees, communities, companies, those kinds of organizations actually lead people to make bad decisions. 

Yes, they do. Groups do two things in particular. As information goes up through a group from level to level, the people passing the information along tend to leave out things that are uncertain or unflattering, and they add things that they think will help clear up the picture. In the course of this, the information changes. Sometimes it becomes radically different. 

It’s like that old game in preschool or kindergarten, you know, operator, where the message goes around in the circle and changes as it goes. 

Exactly. It’s the same thing. And you could see that in the attack on 9/11. Vice President Cheney was told that planes were headed toward Washington, and he ordered the Air Force to shoot down every plane in sight. And they tried to execute the order. But as it was passed up the ranks, questions were asked and concerns were raised in the Air Force chain of command. When the command was finally issued, it said: if you see a plane in the air, take down its tail number. 

So an important message got changed along the way. A group tends to conform information to the concept it believes. 

So when you have multilayered organizations, the truth has a hard time getting to the top. The second thing that groups do is discourage dissent. Of course, they punish people who are whistleblowers. But even in teams like airplane crews or especially hospital surgical teams, doctors and pilots will say, you know, shut up, I don’t want to hear your second opinion. I talked to a doctor who had been an OB-GYN as a young man, and he said that when he grew up, doctors were throwing scalpels at each other in the middle of deliveries. It got so bad that the National Transportation Safety Board made a study of what was causing airplane accidents. And they discovered that the problem of the crew communicating with their leader, with the pilot, was a leading cause of airplane crashes. And they developed a special training program called CRM, Crew Resource Management, which taught the crew to speak unemotionally to the pilot, and it taught the pilot to listen to the crew. And when this skill was applied to surgeries at Boston Children’s Hospital, medical error in the operating room went to zero. 

So regarding medical error or pilot error, the kind of group decision making in planes: it’s not just the name of the game that, because doctors and pilots are only human, error is inevitable. 

You’re right. It isn’t because they’re just human. It’s because we live in a complex world. Flying a 747, or an Airbus A320 like the one that Sullenberger put down in the Hudson last week, is a complicated project, and you have five or six people handling lots of information simultaneously. You can’t sit in the pilot’s seat and say, I’m going to make these decisions, the rest of you shut up. You have to say to your navigator, get me a course to the Hudson River. You have to say to the radio guy, tell the people at the tower what we’re doing. 

You have to say to the person in charge of the cabin, make the decisions necessary to keep those people safe. 

So you’re saying that Sullenberger putting the plane down in the Hudson was a function of good collective decision making. It wasn’t just one guy. 

Exactly. And you know what? It turns out that Sullenberger was one of the people who taught crew resource management at his airline. One of the things that he did was go around and train other pilots to do this with their crews. So although I spend a lot of time talking about the mistakes we’ve made, that was a classic case of a wonderful piece of information management, a wonderful piece of team decision making. 

So this suggests that there are smart ways of organizing people that make us less prone to information errors. In broad brush strokes, what are some of the ways that you’re arguing are the best ways to organize ourselves to make those good decisions? 

It depends on the kind of work you’re doing. But I’ll tell you about a project that the Department of Defense did many years ago. 

They wanted to know what kind of labs produced the best inventions. So they looked at a hundred research and development labs and asked what the successful ones had in common. Is it budget? Is it people? Is it strong leaders? What is it? And the answer was that it was organization. The teams that were run by a single person didn’t do as well as teams where everybody listened to everybody else. 

The teams that were collegial, that made decisions together, were much more innovative, much more successful. They were behind schedule and they were sometimes over budget, but they did what they were supposed to do. If you’re flying a plane, you don’t want to do exactly that. You want to distribute the job and trust everybody to play their position, to use a soccer term. So we can organize for it. Here’s the final thing that I think is most important. We have a tendency to think that one person at the top makes a decision: that people gather data, they pass it up to the next level, the next level analyzes the choices, they pass it up to the top, and the top person decides. But that doesn’t happen any longer. 

The person running the information technology department makes decisions that restrict the choices that the president of the company has to make. The person doing market research makes decisions that restrict the choices that the sales people have. 

So we aren’t in a world where one person makes a decision anymore. The idea of a lone hunter disappeared 20,000 years ago, when people realized that if you’re going to kill a woolly mammoth, it takes the whole village to track it, hunt it, kill it, skin it, harvest it. We’re at that point now with information. The complicated things can’t be decided by one person. We have to figure out how to work together as teams, how groups should learn to exchange information, how organizations should be structured so that uncertainties are respected, people are able to contribute what they know, and we are able to recognize our preconceptions and not let them block us from seeing the truth. 

All of this is good advice, but frankly, it all seems really pessimistic. When you look at democracy, when you look at our society, it consists of these kinds of institutions that you’re saying are so prone to bad decisions. You look at the current state of affairs around the world, this kind of impending world economic collapse, the financial crisis we’re in; all of that’s a function of the kinds of bad collective decision making that you’re talking about. So where’s the hope? How do we apply some of this stuff to get out of this? Well, you see that I’m pessimistic. 

Yeah, I think you can get pessimistic if you read too much of this stuff, and I’ve been pessimistic about it, too. But in the end, I’m optimistic. 

I think organizations are learning to work in a more decentralized, more collegial manner. Walk around the campus of a big corporation now, Microsoft is a good example, and you find 50, 60, 100 buildings. It’s almost decentralized by function. You also find things like crew resource management now surfacing in operating rooms, where doctors are told to respect the opinion of nurses. Pilots are told to listen to their navigators. 

Do you see it being applied to democracy as a project? 

I do. I think our democracy is not working very well right now. I think it’s too layered. I think there’s too much cross interest at the congressional level. And when you look at the statistics, you see that Congress doesn’t do what people want it to do, and the numbers are pretty clear. What do we do about that? We can decentralize some functions, like road building and dams and so forth. We can make legislators more responsive by putting in term limits. There have been suggestions by students of democracy that we should have a council of a thousand citizens that takes up the big issues and talks them through. I think there are a lot of possible modifications to our working democracy that would allow us to tackle larger and more complex problems. 

I don’t want to put you on the spot, but with Obama coming into office, do you think that there’s less of a likelihood for deadly decisions to be made? I mean, on the one hand, it almost sounds like, you know, we’ve got to start from scratch, that everything’s just so messed up. 

No, I think the Obama administration seems more willing to listen to multiple opinions, which is a crucial piece of making the right decision. 

Whether that will play itself out into better national programs, I don’t know. 

But it’s a very good start. I do think that a key is that you get very good people, you put them in the job and you trust them to do their work. The opposite, and I’m trying to say this positively, would be a very intentional, purposeful, directed program to get something done. So you always have this competition between getting it done and doing the right thing. 

You see both of those things in tension. 

Yes, I do. Doing the right thing requires us to talk to each other, to deal with uncertainty, to figure out what the right thing is, to have a lot of discussion: a team of rivals approach where, you know, people of different points of view talk it through. Getting it done usually requires somebody to say the time for talk is over, here are the four things we’re going to do, you go do these things. Everybody running a group, whether it’s a four person committee or a 40,000 person company, is always in that tension, the tension between figuring out what the right thing to do is and then actually getting it done. 

And I think, to simplify it beyond the point of truth, the previous administration was focused on getting certain things done, maybe to the exclusion of getting those things done right. 

Yes. I think for the Bush administration, it was sometimes more important to get it done, and get it done quickly and efficiently, than it was to make sure that they were doing the right thing or doing something that many people agreed with. Whereas I think the Obama administration seems to be tilted the other way. It is a dialectic; you have to go back and forth. But getting it done for a long period of time without thinking about what you’re doing, without listening to alternative ideas, without leaving your building and walking around in the outside world, that’s a very dangerous thing now. 

We can’t see or feel the things that we’re controlling anymore. We’re moving into a world of nanotechnology and molecular medicine and Internet intelligence that is very complicated, just beyond the ability of one person or even a small team to make these decisions. And we’ve just got to learn to do them differently. And I think we can. 

Christopher Burns, I really enjoyed this discussion. Thank you for joining me on the show. 

Thank you for inviting me. Good questions. Lots of fun. 

Where can you turn to find others like yourself who appreciate critical thinking? Turn to Skeptical Inquirer, the magazine that separates fact from media myth. It’s published by the Committee for Skeptical Inquiry. Find out what genuine science has to say about the extraordinary and the unexplained. You’ll be surprised. Subscribe to Skeptical Inquirer today. One year, six challenging issues, for $19.99. To subscribe or request a sample issue, just call 1-800-634-1610 or visit the Point of Inquiry website, pointofinquiry.org. 

Thank you for listening to this episode of Point of Inquiry. To get involved with an online conversation, go to pointofinquiry.org. Views expressed on this show aren’t necessarily CFI’s, nor the views of its affiliated organizations. 

Questions and comments about today’s show can be sent to feedback@pointofinquiry.org or submitted by visiting our website, pointofinquiry.org. 

Point of inquiry is produced by Thomas Donnelly and recorded from St. Louis, Missouri. Executive producer is Paul Kurtz. Point of Inquiry’s music is composed for us by Emmy Award winning Michael. Contributors to today’s show included Sarah Jordan and Debbie Goddard. I’m your host DJ Grothe. 

DJ Grothe

D.J. Grothe is on the Board of Directors for the Institute for Science and Human Values, and is a speaker on various topics that touch on the intersection of education, science and belief. He was once the president of the James Randi Educational Foundation and was former Director of Outreach Programs for the Center for Inquiry and associate editor of Free Inquiry magazine. He previously hosted the weekly radio show and podcast Point of Inquiry, exploring the implications of the scientific outlook with leading thinkers.