How biases impact science communication and democracy

How do biases change the way science needs to be communicated? Picture by Piyushgiri Revagar.

Dr. Andreas Kappes is a postdoctoral researcher at the Affective Brain Lab at University College London, which investigates how emotions and motivations influence our decisions, expectations and memory. I talked with Dr. Kappes about the implications of his research for people’s political decision-making and for communicating scientific findings.

Background. Biased assimilation and attitude polarization were first described by Lord, Ross and Lepper (1979). Biased assimilation is the phenomenon that the beliefs we already hold affect the way we process and assimilate new information. That is, we don’t learn and process information rationally and neutrally; rather, our prior beliefs lead us to perceive confirming information as more valuable and true. This, in turn, leads to attitude polarization: disagreement becomes more extreme as the different parties consider evidence on the issue. These processes have implications for how we can communicate science to the general public, how society makes democratic decisions, and how scientists do research.

[] = signals a comment I made

Q: Could you tell us about your work on how we neglect information that disconfirms our beliefs?

I am interested in how motivation shapes information processing. In particular, I am interested in how desires and beliefs that we already have shape the way we integrate new information. In the political arena this is very straightforward. What our lab finds is that, if people have a pre-existing belief, for instance about climate change, it will shape the way they subsequently seek and integrate information. Seeking information means what kind of websites they look for, what kind of magazines they read, what kind of newspapers they seek out. But even when they are confronted with disconfirming information, we find that people are really good at immediately neglecting that information. It doesn’t seem to require a lot of effort or reasoning. People are very good at neglecting whatever does not fit into their worldview.

Q: How do these processes influence scientists who aim to seek truth?

Even as scientists, while we are all after the truth, we go in with a certain hypothesis, whether it is a good one or not, and this shapes what we are looking for, what kind of information we are seeking, the way we set up our experiments and what other information we integrate [see Simmons et al. (2011)]. So there is truth seeking: in realms that don’t touch beliefs we already hold, people are very good at it. But I think if the information somehow contradicts already existing beliefs, people get worse at truth seeking.

Q: How hard is it to communicate alarming versus confirming information?

The short answer is that it is very easy to communicate confirming information, because people are very ready to integrate it. It doesn’t have to come from a trusted source, doesn’t have to be considered valid information; it is just something that I am ready to integrate. Disconfirming information is very difficult to communicate in a way that people actually accept. That, I think, is the major finding of our lab. If you and I have an opinion about the same topic and we agree, I am very ready to integrate the information you provide me; I would think that you are a good, reliable source. A minute later, on the same topic, we disagree, and now I completely dismiss your information. So even though we would expect that disconfirming information from a trusted source should be more easily integrated, we don’t really see that. We more or less see that people are very good at neglecting that information. Two things are important in communicating disconfirming information, which was also found in unpublished research from George Loewenstein’s lab. If you present yourself as not very confident, saying “I am not really sure about this, this is difficult, there is a lot of uncertainty”, people are more willing to integrate disconfirming information. In contrast, if you were a Bayesian updater, then the person who disagrees with you the most, and is really confident in that, should move your beliefs the strongest. But we find that a person who communicates disconfirming information with uncertainty is the person who has the most influence on people’s beliefs.

The other thing is agreement. Let’s say you and I talk about climate change, and let’s assume that you are a climate skeptic, and I kind of agree with you. The more we agree on a higher level, an ideological level, the more susceptible you are to the information that I provide. So now I could move you a little bit in a different direction. But if you disagree and you are confident, I will not listen to you and will not integrate the information you provide.
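[To illustrate the Bayesian point above, here is a minimal numeric sketch of my own, not from Dr. Kappes. It assumes the dissenting source is a noisy reporter whose stated confidence equals its reliability; all numbers are invented for illustration.]

```python
# Minimal sketch of an idealized Bayesian updater hearing a source deny
# a hypothesis H. Assumption: the source's stated confidence equals the
# probability that its report is correct. Numbers are purely illustrative.

def posterior_after_denial(prior_h: float, source_reliability: float) -> float:
    """Belief in H after the source reports 'not H' (simple Bayes' rule)."""
    p_denial_given_h = 1.0 - source_reliability   # source reports wrongly
    p_denial_given_not_h = source_reliability      # source reports correctly
    numerator = p_denial_given_h * prior_h
    denominator = numerator + p_denial_given_not_h * (1.0 - prior_h)
    return numerator / denominator

prior = 0.7  # initial belief that the claim is true

for confidence in (0.6, 0.9):
    post = posterior_after_denial(prior, confidence)
    print(f"dissenter confidence {confidence:.0%}: belief {prior} -> {post:.2f}")

# Output:
#   dissenter confidence 60%: belief 0.7 -> 0.61
#   dissenter confidence 90%: belief 0.7 -> 0.21
# A Bayesian updater should thus be moved most by the confident dissenter;
# the lab's finding described above is the opposite: hedged, uncertain
# dissenters move people's beliefs more.
```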

Q: But some pressing issues that scientists communicate, such as climate change, disconfirm some people’s beliefs.

I think people have developed a sophisticated understanding of the motivations of science, and the downside is that they now understand that science is motivated to tell you one story. Take climate change: there is a recent debate about whether scientists who said that things were certain to happen, downplaying the probabilities, lost more and more trust. Some scientists think we can’t communicate results with uncertainty, because people will not understand the complexity of uncertainty, so they sometimes present results in an inaccurate picture. I think people are wary of that. I actually talked yesterday to a historian, and he said that this is just the end of a very strange period, in which for 50-60 years people believed whatever scientists said. Now this period is over and we are back to normality, like in the 500 years before, where just because a scientist says something, people are still skeptical, and it takes more to convince them. He feels that maybe science had an unprecedented run in the last 50 years, but now we have gambled away our chips. Now it’s back to slowly communicating and convincing people.

Q: How harmful is this psychological process for democratic decisions?

Let me present one idea that recently came up, formulated by two researchers, Sperber and Mercier [in their book “The Enigma of Reason“]. They think that this is not problematic at all, because it reflects that people are good at making decisions in groups, but not at the individual level. Their idea is that we are evolutionarily designed to reject information, because the risk of being taken advantage of by false information is too high. So we just keep on rejecting information. Let’s imagine we are a group of 40-50 hunter-gatherers. I think we should go left; you think we should go right, because you think that is the best way to hunt, and I think the other way, and now we both argue our case. What I have to do is look only for the information that confirms my belief and disconfirms yours. I basically cut the whole information space in half, and you do the same, and now we argue. All the other members of our group listen, and at the end we make a decision. So by having a very strong confirmation bias in our discussion, we generate the best arguments for each side, because I am looking to take you down, and you do the same. You and I are not interested in being convinced. If we have public debates like that, and at the end we finally make a decision that includes the different evidence, then I think it works fine. But the problem for a democratic process is, of course, that if the issues become so complex that not everybody understands the policy, then you just look for ideological flags and disregard all the other information, due to the complexity of the problem. Then I think it becomes problematic for a democratic process, but it is hard to see an alternative.

Can we process all the information that is out there? Maybe our biases help us to cut that space into processable units and enhance group decisions. Picture by James Marvin Phelps.

Q: Recently we have seen the rise of people, like Trump, who question objective facts. How can this be an adaptive process?

There are so many layers to that. The word of the year, “post-truth”, makes explicit a process that is going on in our brains: “Just because you say something, I don’t believe it.” I don’t know if this has just become more explicit, if we just understand it better, or if it has actually become exaggerated. It is very hard to tell; it is not as clear-cut as one would think. It is not that we can’t agree on anything anymore. This is part of human history; we have always had some fundamental disagreements. Some argue that political polarization is on the rise, others have argued it is not. My impression is that it is not as straightforward as one would expect. So I think sometimes we have to be a little careful in pointing fingers at those who do not agree with our ideology and our political beliefs. I am not sure this is so new, or so threatening.

Q: Do you think the information age and a liberal society influence this process?

There is some research on that [e.g. by Del Vicario et al. (2015)]. It is called the echo chamber: basically, Facebook tailors everything to your beliefs so that you only get that information, and this enhances the human tendency. However, there is an article out [by Boxell et al. (2017)] showing that the older you get, the more polarized your opinions are, and that polarization does not rise fastest among those who use the Internet. People in their 70s and 80s, who use the Internet the least, have the most polarized opinions. It’s always easy to blame one thing; I think it’s more complex. There are also really polarized people, who have very strong and, from my point of view, bizarre beliefs without ever using the Internet. Regarding the liberal society, I am not sure. You would predict that because there is more information and diversity of opinion, you should have a more balanced worldview. That is not the case. But I don’t know if a less liberal society would lead to less polarization. That is hard to measure, because there is less expression of different beliefs in authoritarian societies.

Q: So it’s like a circle in which our prior beliefs influence our information seeking, which, in turn, influences our beliefs. My last question: when do people change their opinion, or get convinced of another viewpoint?

In essence, there are more beliefs on which we never change our opinion than beliefs on which we do. I rarely meet a person who says, “I used to be conservative, now I am liberal.” In the political realm, I think it is rare that we change our opinions dramatically. Sometimes there are undeniable facts or realities that you have to adjust to. I think gradual change happens. If you think about the circle: if there is confirming information for your belief, in line with things that are important to you, it is easy to integrate, whereas you integrate bad news less. So if there is a dramatic shift in the news that supports you versus disconfirms you, I think over time people will change.

Can art help to change people’s attitudes? Graffiti in the city center of Valparaiso (Chile).

There are probably also empathetic responses that can change our beliefs. Paul Bloom argues that there is a role for literature and culture, so it’s not so much that you are saying, “I am going to change your mind about something.” Rather, you see the portrayal of, for instance, a gay character in a fictional story, and by empathizing with this character you change your mind. However, it is less likely that a person who is anti-gay reads a story like that.

I think it is a slow and painful process, and at the end of the day, there is room for reason. That is why we have reason. That is why people are receptive to arguments, even when they are disconfirming. I think reasoning is the last step in bringing in new information. So there is a level of understanding: if you have a really good point that is indisputable, you will win people over. I don’t think this is done through emotional responses, but through reason. Reason is there to change our minds, and it is an effortful, slow and painful process. I think there has been tremendous progress in so many areas, such as gay rights, human rights and fighting poverty. We shouldn’t forget that.

Thank you very much!
