Make Science Great Again
Post-truth, the Word of the Year 2016, refers to “circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief” (Oxford Dictionaries, 2017). History is full of shocking examples of the consequences of following leaders who based real political decisions on falsehoods. Dictators such as Hitler and Stalin kept their nations reined in by oppressing dissidents and restricting access to information.
While today’s (Western) society benefits from scientific advancements and freedom of speech, current trends show that we are still prone to follow populist movements that largely base their campaigns on falsehoods – think of the Brexit referendum and Donald Trump’s election (e.g., Kirk, 2017; Politifact, n.d.). Furthermore, the public heatedly debates issues on which expert scientists largely agree (e.g., climate change; Kahan, Jenkins-Smith, & Braman, 2011; Cook et al., 2016). These developments illustrate that science threatens to become “just another opinion” in politics and public debate.
…beliefs vary by our moral outlooks
It is commonly assumed that we fail to consider scientific findings properly because we neither inform ourselves enough nor master the complexity of political topics (Kahan, 2010). However, if the public’s lack of knowledge were the problem, beliefs on political issues would be distributed according to education levels. In reality, though, these beliefs vary by our moral outlooks. For example, the same groups that disagree on cultural affairs, such as same-sex marriage, also tend to disagree on whether climate change is man-made (Kahan, 2010). Hence, we base our opinions on our values rather than on our knowledge (Kahan et al., 2012; Kahan, 2013). Stated differently, our interpretations of scientific information signify the kinds of persons we are – similar to a Rorschach inkblot test (Feldman, 2017).
We cherry-pick information that supports our worldviews in order to protect our identities; after all, changing our defining values can threaten our standing in our tribes (Kahan, 2013). In contrast, we incur no cost when we endorse opinions that deviate from the scientific consensus. For instance, someone whose family and friends value environmental friendliness would risk exclusion if he or she claimed that climate change is a “hoax” (Kahan, 2010). Whether that person is right about global warming, however, makes no practical difference, because his or her individual influence on the climate is negligible anyway. Therefore, it is perfectly rational to align our opinions with our social groups’ viewpoints rather than with the scientific evidence, and smart people are even more likely to show this bias (Kahan et al., 2012).
Although resisting information that threatens one’s values is rational for the individual, it leads to poor collective decision making, such as convergence on policies that do not properly reflect the scientific evidence. The prevalent interventions, which aim to confront the public with evidence that contradicts their views, tend only to strengthen the opinions people already hold (Hameiri, Porat, Bar-Tal, Bieler, & Halperin, 2014; Lord, Ross, & Lepper, 1979). Nevertheless, education can effectively decrease political polarization – under the right conditions.
Our mindsets also determine how biased we are toward political information.
To prevent resistance and ignorance, scientific information has to be presented in a way that affirms our values rather than threatens them (Cohen, Aronson, & Steele, 2000; Hameiri et al., 2014). For example, people with individualistic, pro-industry values tend to neglect evidence that climate change is man-made, because they think that industries would need to be constrained to slow down global warming (Kahan, 2010). Yet if these people are made aware that nuclear power is a possible response to climate change, they become more likely to accept evidence that humans are contributing to global warming (Kahan, 2010). In addition, we are more open to processing disconfirming opinions when experts from our own cultural community endorse them (Kahan, 2010). Thus, scientific information should be delivered by culturally diverse communicators.
Our mindsets also determine how biased we are toward political information. For instance, people who are curious about science are more likely to consider data that run counter to their political predispositions (Kahan, Landrum, Carpenter, Helft, & Jamieson, 2017). Similarly, a cooperative attitude makes us more open-minded (Schwind & Buder, 2012). In contrast, when a debate turns into a competition, the opposing parties insist on their opinions. Therefore, we need to acknowledge that we are all working toward a common goal: the security, economic well-being, and health of our society (Kahan, 2010).
Certainly, we need to find more ways – involving all stakeholders, including the media – to enable an open-minded and unbiased consideration of data. More insights from the social sciences will help us to move “from post-truth to post-lies” (Tsipursky, 2017).
Relevant Publications and References
Cook, J., Oreskes, N., Doran, P. T., Anderegg, W. R. L., Verheggen, B., Maibach, E. W., Carlton, J. S., Lewandowsky, S., Skuce, A. G., & Green, S. A. (2016). Consensus on consensus: A synthesis of consensus estimates on human-caused global warming. Environmental Research Letters, 11, 4. doi: 10.1088/1748-9326/11/4/048002
Cohen, G. L., Aronson, J., & Steele, C. M. (2000). When Beliefs Yield to Evidence: Reducing Biased Evaluation by Affirming the Self. Personality and Social Psychology Bulletin, 26, 1151. doi: 10.1177/01461672002611011
Feldman, D. B. (2017). How to Talk About Politics in a Post-Truth World. Psychology Today. Retrieved 23 Jul. 2017 from: https://www.psychologytoday.com/blog/supersurvivors/201703/how-talk-about-politics-in-post-truth-world
Hameiri, B., Porat, R., Bar-Tal, D., Bieler, A., & Halperin, E. (2014). Paradoxical thinking as a new avenue of intervention to promote peace. PNAS, 111, 30.
Kahan, D. M. (2010). Fixing the communications failure. Nature, 463, 296–297.
Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 8, 4, 407–424.
Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14, 2, 147–174.
Kahan, D. M., Landrum, A., Carpenter, K., Helft, L., & Jamieson, K. H. (2017). Science Curiosity and Political Information Processing. Advances in Political Psychology, 38, 1. doi: 10.1111/pops.12396
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2, 732–735. doi: 10.1038/NCLIMATE1547
Kahan, D. M. (2013). A Risky Science Communication Environment for Vaccines. Science, 342 (6154), 53–54. doi: 10.1126/science.1245724
Kirk, A. (2017). EU referendum: The claims that won it for Brexit, fact checked. The Telegraph. Retrieved 20 May 2017, from http://www.telegraph.co.uk/news/0/eu-referendum-claims-won-brexit-fact-checked/
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence. Journal of Personality and Social Psychology, 37, 11, 2098-2109.
Oxford Dictionaries. (2017). Word of the Year 2016 is… Retrieved 9 Mar. 2017, from: https://en.oxforddictionaries.com/word-of-the-year/word-of-the-year-2016
Ross, L., Lepper, M. R., Strack, F., & Steinmetz, J. (1977). Social explanation and social expectation: Effects of real and hypothetical explanations on subjective likelihood. Journal of Personality and Social Psychology, 35, 817–829.
Schwind, C., & Buder, J. (2012). Reducing confirmation bias and evaluation bias: When are preference-inconsistent recommendations effective – and when not? Computers in Human Behavior, 28, 2280–2290.
Tsipursky, G. (2017). From Post-Truth to Post-Lies. Psychology Today. Retrieved 9 Mar. 2017, from https://www.psychologytoday.com/blog/intentional-insights/201703/post-truth-post-lies
Note: Image by Jim Gill, licensed under CC BY-NC-SA 2.0