Dan Kahan – The American Culture War of Fact

February 14, 2011

Why do Americans claim to love science, but then selectively reject its findings when they’re inconvenient? And why do some cultural groups reject certain types of scientific findings (about, say, harm to the environment), whereas other groups reject different ones?

Yale law professor Dan Kahan is doing some of the most cutting-edge work right now when it comes to figuring this out. Kahan is trying to resolve what he has called the “American Culture War of Fact” by determining how it is that our core values (whether we are “individualists” or “communitarians,” “hierarchs” or “egalitarians”) can sometimes interfere with our perceptions of reality.

Most intriguingly—or, if you prefer, disturbingly—Kahan has found that deep-seated values even determine who we consider to be a scientific expert in the first place.

His results have very large implications for how to depolarize an array of scientific issues, and how to communicate about controversial science in general.

Dan Kahan is the Elizabeth K. Dollard Professor of Law at Yale Law School. In addition to risk perception, his areas of research include criminal law and evidence. He has served as a law clerk to Justice Thurgood Marshall of the U.S. Supreme Court (1990-91) and to Judge Harry Edwards of the United States Court of Appeals for the D.C. Circuit (1989-90).

Today’s show is brought to you by Audible. Please visit audiblepodcast.com/point to get a free audiobook download. This is Point of Inquiry from Monday, February 14th, 2011.

Welcome to Point of Inquiry. I’m Chris Mooney. Point of Inquiry is the radio show and podcast of the Center for Inquiry, a think tank advancing reason, science, and secular values in public affairs and at the grassroots. At the outset of our show, I want to remind you that Point of Inquiry is sponsored by Audible. Audible is the Web’s leading provider of spoken audio entertainment, information, and educational programming. The site offers thousands of books for download to your computer, iPod, or CD, and today it’s willing to give you one for free. To participate, all you have to do is go to the following website: audiblepodcast.com/point. Again, that’s audiblepodcast.com/point. Let me make a recommendation. Download a book we just featured on the show: Seth Mnookin’s The Panic Virus: A True Story of Medicine, Science, and Fear. I checked; Audible has it. You can get it right now. Just head over to audiblepodcast.com/point and click a few times. I think you’ll be glad that you did. My guest this week is Yale law professor Dan Kahan. I wanted to have him on because he’s doing some of the most cutting-edge work right now when it comes to figuring out why people sometimes reject the scientific facts that you and I might consider obvious. Kahan is trying to resolve what he has called the American culture war of fact, and he’s doing so by determining how it is that our core values sometimes interfere with our perceptions of reality. His results have very large implications for how to depolarize an array of scientific issues, and also how to communicate about controversial science in general. Dan Kahan is the Elizabeth K. Dollard Professor of Law at Yale Law School. He studies risk perception and also criminal law and evidence. He served as a law clerk to Justice Thurgood Marshall of the Supreme Court in 1990 and 1991, and also to Judge Harry Edwards of the United States Court of Appeals for the D.C. Circuit in 1989 to 1990.
Dan Kahan, welcome to Point of Inquiry.

Thanks for having me.

You study something that is sometimes called, anyway, the “cultural cognition of scientific consensus.” It’s a bit of a mouthful. Can you explain what that is?

Sure. Actually, the way to think about it is that it’s really a kind of answer to a puzzle. And the puzzle is: how do we know how the world works? How do we figure out whom to trust to tell us what to believe on all kinds of matters that obviously have an impact on our lives, but that we’re not in a position to figure out for ourselves? We have to have some way, basically, to get the message. Scientists are collecting lots of data; how are we supposed to know what it is they’ve found out, and whether we can trust it? Cultural cognition is a theory that connects that process to certain kinds of group commitments and values that people have, commitments that basically create networks of confidence and trust. So people basically get the message from the people who are part of their group. And usually the groups (they’re different groups; we have different measures for them) converge; they come to the same understanding about what scientists are saying, and they’re right. But some of the most interesting conflicts we have are ones in which the groups people are relying on to help them get the message about what science can tell them are getting their wires crossed.

And what are the actual groups that we’re studying here? 

Well, we use a framework that characterizes people based on their values. It was developed by an anthropologist named Mary Douglas, and it characterizes people’s preferences or values along two different dimensions. Both are pretty intuitive. One is how individualistic or group-oriented they are. The other is how much they like stratification, ranking, those kinds of orderings, and how much they instead prefer a kind of level playing field without a lot of authority. So those are pretty basic measures of the outlooks people have. And then we use those to formulate predictions about what kinds of perceptions of risk they’re going to have.

So let’s apply this, then. We’ve got these different values, and then they get their wires crossed about something scientific.

Sure. I mean, let’s take a look at an example of the basic mechanism. If you’re somebody who has very individualistic values, then among the things you prize about life are spontaneity, private ordering, individual initiative. To believe that those kinds of things are producing harm for society creates tension; that’s not a happy thing to believe. So you’re predisposed not to accept that kind of information. If you’re somebody who’s more egalitarian, you’re probably already somewhat suspicious that too much private ordering, commerce, and industry are the sources of social disparity. You’re more inclined to accept that those activities can actually cause harm for society. So these groups start out with these different predispositions, and when there’s evidence available on one side or the other, they’re drawn towards it. And if they perceive that an issue can only come out in a way that is contrary to their values, that’s going to create tension. Something like climate change can be seen that way. If climate change means, you know, “Sorry, the party’s over for individualists; we’ve got to cut back,” well, that’s a threatening thing to believe. You can imagine, though, that there will be ways in which that group wouldn’t have to understand the information that way, in which they wouldn’t have their wires crossed, as it were. Think about something like nuclear power as one of the potential responses to climate change. Nuclear power has associations that are positive for individualists: the idea of almost permanent, inexhaustible supplies of energy to fuel those private orderings. If you believe that accepting climate change means that’s the solution, you’re not as threatened by it. You consider the information in a more open-minded way.
The wires cross when the groups don’t have stories available to them that make it possible to see the information in question as not putting them in a position of tension or conflict with their basic commitments.

You know, I was just at a conference, and this finding of yours about nuclear power came up in a panel on, I guess it was, communication about climate change.

This was a science blogging conference. And I guess we were trying to get precisely right what it was that you had done to show this. You showed people two different kinds of newspaper articles.

Sure. Well, we did one study that involved showing people newspaper articles that presented what amounted to the Intergovernmental Panel on Climate Change findings on climate change: that that kind of change is occurring, that humans are causing it, that it’s going to have bad consequences. The experimental component was that we varied what we represented the report as recommending as a solution. For one group of subjects, the newspaper article emphasized the need for more carbon emissions; for the other, it said society should explore alternatives to fossil fuels, including nuclear power. And then we compared what the individuals in the two groups believed about the information the newspaper report was attributing to the scientists. The group that got the version of the article that said nuclear power was one of the solutions was more likely to believe that climate change is happening, that humans are causing it, and that it’s detrimental. And the group that got the version of the report that said we need more carbon emissions was even more skeptical about climate change than a control group, one that just read a report about the need for a stop sign on Elm Street. So there’s an example of giving people more information actually making them resist more, making them draw the opposite inference, because of what we would understand as the threatening implication of that information.

And I think you may have misspoken. You said “more carbon emissions,” but I think you meant more...

I’m sorry, I meant limits, limits on carbon emissions, right? OK.

So they get the scientific information packaged as saying it supports a regulatory goal of interfering with the economy, and they don’t like the science. They get the science packaged as saying it supports a free-market goal, and then they like the science.

Sure. Well, here’s another example of this. We haven’t done the study; this is a hypothesis, but it’s the way this theory of risk communication invites you to think. I work with somebody, Mike Jones, who does a lot of work on narrative. What he shows is that people understand information better if there are characters, if there’s some kind of drama or tension they can identify with and a happy solution, but that people with different values have different narrative scripts: different kinds of characters that are appealing, different kinds of drama, different kinds of happy endings. There is an individualist narrative script into which climate change beliefs fit perfectly well. We are essentially the species that manages to defy Malthusian limits. Just when we get to the top of the curve and are about to crash down, we shift the curve over; we do something to change our environment. An example would be something like cholera. You used to think that the maximum density of a city was capped at a thousand people or so; people would die of cholera or something like that. Now cities are many times that number, because we did something, basically, to get rid of our own wastes. Well, we’re at that time again, at a point where we have to do something to accommodate the byproducts of our own ingenuity. And that requires not cutting back but doing more of the same, some kind of technological response. Geoengineering then fits into that narrative. It’s not “the party’s over”; it’s more of the same. Now, of course, “more of the same” might be the bad message for the other group. But there’s no natural antagonism between scientific information and values.
The problems you see in society arise when the stories we have get kind of depleted and take on these valences that really put the groups in tension.

And I guess another timely issue where this happens, one that’s not quite as explicitly scientific, though there’s certainly a lot of economic and social science research on it, is one I know you’ve looked at: gun control. And this same individualist-egalitarian split happens there, right?

Sure. The basic idea is that for people who have certain hierarchical values, there are lots of social roles where having a gun enables people, particularly men, to occupy valued roles: father, protector, hunter, and so forth. They have the biggest stake in having guns understood to be licit and in having access to them. The idea that guns are dangerous is therefore something they’re more likely to resist. Egalitarian people tend to believe that guns are connected to racism or to sexism. Communitarians think that guns are a kind of gesture of hostility towards strangers. These outlooks do then predispose people to look at information in ways that confirm what they already tend to believe. The amazing thing, and we didn’t do a study of this, but I’m sure somebody will, because it’s the same story every time you have a shooting tragedy, is that I’m pretty confident that after something like the shootings in Arizona, you’re not going to see any changes in public opinion. Half the people draw the conclusion, the inference, that this is because we have too little gun control: somebody like Loughner got a gun. But the other half say, well, we have too much gun control; if people had been able to defend themselves in this situation, it wouldn’t have happened. And they actually notice many more instances of people using guns defensively. So you see the same thing there. People might agree that they want a policy that promotes safety, but because of their values, they disagree on what policy can do that.

And I guess something our listeners might be wondering is just the basic question: what is this kind of research doing at Yale Law School? How did you become interested?

Well, you know, Yale Law School is the site of all kinds of interesting research. But really, it’s the kind of unanticipated, you know, serendipity of being at a university. The people I did this work with included my next-door neighbor, who was in the psychology department, Geoff Cohen; he’s now at Stanford. Don Braman, one of my collaborators, was an anthropology student when I first met him, and he’s since gone on to an academic position. We wanted to do work to try to understand why, when people seem to agree about what the goals are of all kinds of public policies and what they want to accomplish with law, they nevertheless end up divided, and not in random patterns, but in patterns that reflect their values. Now, if I can make one point: there’s a tendency here, and this is a different kind of bias, to focus obsessively on the bad cases and ignore what I would call the denominator. These are pathologies. If you think about it, we’re not a society that resists science. There aren’t really identifiable groups with a lot of influence out there that just say science doesn’t have anything going for it, that scientists don’t know what they’re talking about. Ordinarily, when we get the word from scientists, we fall into line. Think about something like the H1N1 vaccine. The CDC says to do it; people don’t polarize on that. They don’t question it. They practically trampled the old people and the schoolchildren who were supposed to get the vaccine first in the rush to get it. But when they hear about something like the HPV vaccine, which is for schoolgirls so that they don’t catch the human papillomavirus, a sexually transmitted disease that causes cervical cancer, there’s cultural conflict. Why is that? Why do some issues provoke this kind of conflict and not others?
The cases that don’t provoke conflict actually outnumber the ones that do. The ones that don’t are the ones in which these groups I’m talking about have their own narratives, their own ways of certifying things. None of them is crazy; they wouldn’t exist, they wouldn’t last very long, if they were. And so naturally they converge on the best information. These other cases are ones in which, usually as a result of misadventure, though sometimes fortified with a certain amount of mendacity and interest, the groups end up with understandings that just can’t be reconciled.

Let me drill down a little bit more into the groups themselves, the terms you use. You called them, I think, individualists, hierarchs, egalitarians, communitarians.

And I think one thing, again, our listeners may be wondering is, you know, why don’t we say liberals and conservatives? Why don’t we say Democrats and Republicans? After all, on global warming, if you look at Democrat and Republican, you can pretty much predict what their views are going to be on the science of that topic.

Well, there’s overlap between these categories and ideological measures. We find that, in fact, the measures we use are a lot better at predicting how people are going to respond to various kinds of risks when those people don’t have a lot of political sophistication. This is something that’s actually pretty well established in the public opinion literature: liberal and conservative work best for people who tend to be plugged in.

And on a lot of these issues, something like the HPV vaccine, you’re not going to get the traction with liberal and conservative that you can get with these measures. But there’s really no need to choose. Very likely the same dynamics that explain what’s going on in our model are explaining why you see liberal-conservative, Republican-Democrat division. The key is to understand that people’s values do interact with the way in which they try to make sense of the world. On some issues that’s going to be a harder thing to uncover than on others. But there’s no real reason to put the two frameworks into conflict.

It is the case, though, that there are some divisions among Republicans that you can pick up by emphasizing individualism and communitarianism, divisions that evade simple liberal-conservative or Democrat-Republican labels. But I’m not partisan, as it were, about that.

Mm hmm. Well, I get what you’re saying about nobody being anti-science. You know, I’m, broadly speaking, liberal, so I’ve disagreed with George Will on what he writes about climate change. But then he’ll write a column about how much he loves scientific research. Or Newt Gingrich: again, you know, he sort of loves science, and I’ll say, well, but you did this and this and this, and he’ll still say all these pro-science things. So I get how nobody is anti-science. The question I have, though, is: when people do split because of their cultural differences over what science says, are they splitting over what information is true? Over who’s an expert, who’s a scientist? Is it all of the above?

Well, the remarkable thing is that in these conflicts, you don’t see people arguing about whether science should be authoritative. People on both sides tend to agree that science should be authoritative, but they disagree about what the science says. One of the studies we did examined what people’s beliefs were about scientific consensus, not just on climate change but on a variety of issues. We looked at, for example, people’s opinions about what scientists think about deep geologic isolation of nuclear waste. We looked at what they thought scientific consensus was on whether gun control laws reduce crime or instead increase crime by disarming law-abiding people. And we found that the scientific consensus people perceived was in line with the position most congenial to their values. We picked these issues, moreover, because they were ones on which the National Academy of Sciences had issued scientific consensus reports, and we used those as a kind of benchmark to assess the patterns. No one group is any better than another. The same people who are right, as it were, if you take the National Academy of Sciences consensus report as the benchmark on climate change, are wrong about nuclear waste disposal: they think there is not scientific consensus that disposing of waste in deep geologic isolation is safe, even though there is, and has been for decades, according to the National Academy of Sciences. Now, how to explain a pattern like that? Maybe one of the groups is better at knowing what most scientists think than the National Academy of Sciences. But another hypothesis is that both groups are conforming the information they’re getting about what scientists believe to their values. We did an experiment in this study to test that hypothesis. We found that when people were asked to assess whether a particular featured scientist was an expert on climate change, nuclear power, or gun control, their answer was predicted by the fit between their values, on the one hand, and the position we attributed to the expert, on the other. Think about it: we’re all engaged in our own amateur public opinion polling of experts. What do the experts believe about this? Every time I see one, I mark it down. The problem is, I’m only counting the ones who are saying what I already believe. Even if these people agree that scientific consensus should be controlling on these issues, they end up with skewed understandings of what most scientists understand. That, I think, is what you see on these kinds of issues: not science versus anti-science.

It’s really interesting. 

And this would explain why, on climate, or on evolution with the anti-evolutionists, or whatever, the people who don’t accept the science put out these long lists, you know, “a thousand experts question Darwin” or something like that, and get people to sign onto them. It makes perfect sense, I guess.

Well, I mean, I think scientific consensus is important, but one of the reasons science is as dynamic as it is, is that it actually encourages dissent within itself. Scientists say you don’t take things on authority.

You take things on proof, and you believe things only when they withstand attempts to challenge them. So, internal to science, they reward, and they should, a certain amount of debate. But if that’s the logic of science, we’re never going to run out of dissent, and it’s never going to be the case that nose-counting settles these kinds of things. I mean, somebody who was right, you know, was in the minority at one point. So this is a challenge in that respect.

You know what’s really interesting: this also reminds me of Project Steve. I don’t know if you’ve heard of that, but it’s a way of lampooning this idea of getting a list of scientists who agree with you and putting it out there. After the creationists did their list of scientists who challenge evolution, the National Center for Science Education did a list of scientists who accept it, every one of them named Steve, and they got like a ton of those. And it raises the issue: is part of the problem that we have, in society, in the world, many more PhDs than we had 50 years ago?

You can find one to say anything.

Yeah. Yeah. Well, I mean, that’s probably true.

I don’t think, though, that it’s ever been the case that people have done their own accurate census of what most scientists believe. They’re getting even that information from people they trust. It’s not any different from anything else. And it’s not the case that that somehow becomes ineffective or unstable just because we have lots of PhDs. On most of the issues that are of consequence to us, or many of them anyway, we don’t have this problem. We should figure out why it is that there are some issues on which the possibility of finding this kind of disagreement results in persistent conflict. It’s really not a problem that scientists debate each other. It couldn’t possibly be a problem.

Well, one thing, again, especially for listeners of this show: I brought up evolution a little bit, where the resistance to the science is, I think, to a large extent religiously, or to some extent ethically, impelled. What is the real interaction between religious belief and these four cultural groups that you’re studying?

Well, actually, that’s an extremely interesting question. If you’re more fine-grained about it: among egalitarians, you have both a secular group and a more religious group, and on most environmental issues they see things exactly the same way, so there religion isn’t of tremendous consequence. The hierarchs tend to be more religious, right, so they tend to be more uniform. But it isn’t the case that they have less science literacy. Their attitude towards science tends to be more antagonistic. So it’s hard to know whether religion per se is driving this. What religion implies about risks, how it connects to people’s values, what it even means for people’s orientation towards science: I think that’s more complicated and interacts with values.

Well, I mean, obviously there are going to be other motivators besides just being part of these four groups, because there are other cultural groups you could belong to.

I mean, you could be a man or a woman. I think that would probably change the way you view some issues just in and of itself. 

It’s interesting. I mean, it’s a staple of the risk research literature that white men tend to be less concerned about all manner of risks than either women or minorities. But it turns out that that really is driven by the interrelationship of cultural values and gender, the kinds of things we’re looking at. Risks from commerce and industry, or, say, risks from guns: these are the kinds of activities that are especially important to white males who have hierarchical and individualistic values. They’re so decidedly resistant to the idea that those are risky things that if you don’t take the cultural values into account, it looks like a generic white male effect. In fact, if you exclude them and look at the egalitarian groups, you don’t see the white male effect there; there’s not a difference between men and women.

So, you know, I think gender certainly is important, but I don’t know if I would say men and women per se have different views. I want to know why it is that men and women view an issue the way they do. What is the nature of the issue, and how is that going to make some men and some women see things differently?

Clearly, all of this has implications for how you communicate about science, how you get the information across. And I guess the implication I take away involves this dastardly word “framing,” which is controversial. But I mean, I think that’s where we’re headed. It sounds like you don’t just put facts out. You have to know what the narrative is, you have to know what narrative appeals to what cultural group, and that seems to determine how you try to reach them.

Well, it’s certainly not enough just to bombard people with facts. Facts have meanings, and you want to make sure that the meanings the facts are conveying are not ones that are going to provoke a kind of closed-minded response.

The narrative, I think, is one element of this. But another is just the kind of cue you get from the connection between the cultural identities of advocates and the positions in these debates. In one of our studies of the HPV vaccine, we showed that even though people had a cultural predisposition, they were much more sensitive to the perceived cultural values of the expert scientists to whom we were attributing the positions. If you saw a person who shared your values taking the position normally associated with your cultural view, and you saw a person you might think of as kind of strange advocating the position you were predisposed to reject, that was a compounded effect, and you’d see tremendous polarization. But you could flip that around: if all of a sudden you got the information you were culturally predisposed to reject from somebody you think is more like you, and the argument you were predisposed to accept from somebody else, you’d tend to change your position, or at least think a lot more. Now, we can’t go around orchestrating people’s perceptions of who’s on what side. But one thing we probably can do is avoid creating the misimpression that everybody on one side has the same kind of cultural outlook. In this study, too, we found that when people got the information from advocates whose values were either equally close to theirs or equally remote from theirs, cultural polarization was also significantly reduced. So here’s an idea: work at making the face of communication plural. Don’t fall into the trap, whether by accident or by somebody’s actual design, of having this look like an us-versus-them kind of issue. Essentially, what I believe is a function of whom I believe; I have to trust somebody.
So if you tell me that the thing I believe is wrong, you’re telling me the people I trust are stupid. That’s not a winning strategy, and you want to avoid creating that kind of impression. I mean, climate change, sadly, did become branded in a way that was unnecessary. It can be undone, but we should avoid it with future issues.

I just want to tease out one implication of what you just said, because I’ve read some of your studies. You said change the face of science communication, and if I can be a little bit flip, I think in some of the studies that actually depended on, you know, what the scientists looked like.

If they had a beard, then they looked more like an egalitarian; if they were clean-shaven and in a suit, then they looked more like an individualist. So that, you know, that would be changing the face. I guess scientists could shave before they communicate.

Well, we did have fictional scientists, and based in part on appearance and other things, people were imputing values to them. This is an effect you can observe in the experimental study and, I think, in the real world; we just have an intuitive sense that people come across as having different kinds of backgrounds, different kinds of identities. But I don’t think we should go around using cosmetics on particular spokespeople.

Yeah, I’m just... I think I understand.

But, you know, none of this requires manipulation. It just isn’t the case that all the scientists either have beards or don’t. If you’re conscious about including a diversity of perspectives from the outset, you don’t have to resort to any tricks like that.

Okay. Well, excuse me for going there. Of course, people can't see me right now.

They should just imagine whichever it is, beard or no beard, that would make what I'm saying look convincing.

Right. Exactly. Well, if I can ask you just sort of one concluding question.

What would be a marker for you showing that, communication-wise, we had succeeded in depolarizing one of these issues, at least to some small extent?

Well, obviously, you'd look at the amount of division in society, whether there's polarization. But I don't think the best measure is just whether everybody agrees.

It's whether everybody is engaging in a thoughtful, reflective, open-minded way with the data, whether they're not just dismissing it. You couldn't do a study like the ones we've done and get these kinds of reactions unless people were deciding on the basis of very shallow heuristics. I mean, you mentioned the point about appearance; it's essentially like a team sport. And there are ways to understand whether people are actually thinking about the issue, reflecting on it, trying to make sense of it, or whether they're just looking for the signs, the cues, that tell them what their team thinks about something. I think it's progress if you can show that people are, in fact, engaging these issues in a thoughtful way. The test shouldn't just be, does everybody agree on these things? And I think, too, it's kind of demeaning to democracy to think that people aren't going to critically engage with all the information they get, from whatever source.

So I guess what you would want to see ultimately is a different tone of dialog that shows more deliberation.

Well, I think a different tone, although I'll tell you what I think the most important policy change would be: to anticipate these kinds of things. I mentioned the examples of HPV and H1N1. In some sense, it's obvious after the fact that when you say you're going to have vaccinations of schoolgirls for a sexually transmitted disease, you're going to provoke a certain kind of anxiety on the part of people who think the state is condoning something that's immoral. That could have been foreseen. Why wasn't it, and why wasn't something done to present the vaccine in a format, in a program, that didn't provoke that? Now you have the problem we have with climate change: it's hard to undo. But if we do get through this, and I think we will, hopefully what we've learned is that we should avoid having this happen in the first place. To me, it's a sign of progress if, when you have something like synthetic biology, about which most people know nothing, you're thinking about how you're going to communicate the information, what kinds of resonances it could have, so that you can head those off at the same time that you're doing the most basic development science on it. And I think that's something that a democratic society trying to avail itself of the best information is going to want to be part of its policymaking process. So, you know, that's the thing that would make me the most heartened: to see that we're taking science communication seriously, as seriously as the science that develops information about the benefits and risks of these things.

Well, I think that is a very worthy goal, and I hope that the scientific world, the research world, is turning in that direction.

And I think maybe it is. So with that, Dan Kahan, I just want to thank you for a very thoughtful conversation on Point of Inquiry.

Thank you, Chris. 

I want to thank you for listening to this episode of Point of Inquiry. To get involved in a discussion about Dan Kahan's research, please visit our Web site by going to centerforinquiry dot net slash forums and then clicking on Point of Inquiry. The views expressed on Point of Inquiry aren't necessarily the views of the Center for Inquiry, nor its affiliated organizations. Questions and comments on today's show can be sent to feedback at pointofinquiry dot org.

Point of Inquiry is produced by Adam Isaak in Amherst, New York, and our music is composed by Emmy Award-winning Michael Whalen. Today's show also featured contributions from Debbie Goddard. I'm your host, Chris Mooney.