Emotional, Political, Science

Updated: Jan 24, 2022

With the onset of the COVID-19 pandemic, questions came to light regarding the effectiveness of mask-wearing, the validity of vaccine efficacy claims, and new scientific breakthroughs. During the lockdown, many Americans began to show signs of distrust toward medical research and scientific discoveries. As politicians continue to share both factual and opinion-based information about science, science has become increasingly politicized. Who is to blame for this growing distrust and skepticism? Is it the fault of political parties that have pitted themselves against each other, for or against the scientists? Or is it the fault of our own personal biases and emotions, which take hold of us and shift our perspective to trust the science less? Quite frankly, it is a little bit of both!

From a political perspective, a Pew Research Center study found that “more Democrats (43%) than Republicans (27%) have ‘a great deal’ of confidence in scientists” [1]. This highlights differences in how the political parties view science. Pew also found that 55% of Republicans justify their distrust by arguing that scientists are just as susceptible to bias as anyone else [1]. This justification raises questions about the role of bias in scientific discovery and its public presentation. When scientists are asked to be involved in policy decisions, can they separate their own biases from the science?

Politicians increasingly depend on scientific evidence to support their arguments, so facts become tied to particular political views. However, there is a difference between the parties here too. A study by Dr. Daron Shaw found that while Republicans were, on average, less willing to defer to scientists on science policy issues, they still deferred to them in 14 out of 16 policy areas studied; the only exceptions were gay marriage and mandatory health insurance [2]. Taken together, these two studies show a difference between political perspectives that helps lay a foundation for understanding distrust in science, but they do not fully explain it. Personal emotions and biases may play a larger role.

Trust is a largely philosophical concept, requiring a certain level of emotional connection: when we trust someone, we are often in a vulnerable place. Dr. Katherine Furman places great emphasis on understanding and addressing this emotional vulnerability in our trust in science: “we should pay closer attention to emotions when trying to understand distrust in science” [3]. Furman argues that people feel vulnerable, and hence emotionally charged, because of power asymmetries (a medical professional is in a position of power and control over your health) and cost asymmetries (your health is at stake, not the medical professional’s). These emotional vulnerabilities lead to biases through a redirection of attention: anger may focus us on blame, while fear may focus us on risk [4]. With that in mind, consider how often medical doctors make mistakes: an estimated 98,000 patients die each year from medical errors. If someone you knew died because of a simple mistake in the hospital, would you still maintain the same level of trust? This exemplifies the asymmetry in cost, where you had a personal relationship with the victim but the doctor did not, and the asymmetry in power, as you and the victim trusted the doctor to save them.

One might ask, “Weren’t we talking about science in general? What about peer-reviewed research?” Though it is true that, on an individual level, we are not necessarily personally affected by scientific research, failures in the peer-review process erode public trust. While only about three out of every 10,000 papers are retracted, some retracted articles have already made their negative impact on the public [5]. One such study, published by Dr. Andrew Wakefield in The Lancet, suggested that autism in children was caused by the vaccine for measles, mumps, and rubella. Published in 1998 and not retracted until 2010, the study resulted in a sharp decline in vaccination rates and gave anti-vaccine groups an opportunity to thrive [6]. Despite the low rate of retractions, those relevant enough to reach the public certainly leave their mark.

Emotional and political influences affect our daily lives, and they can distort our logical decision-making. It is important to recognize them and understand how they shape public opinion toward science. The next time you talk about science, ask yourself: do you get emotional about it? Do you get political about it? Notice it, and see how it affects the way you talk about science. Acknowledge that you, too, might present emotional, political, science once in a while.


  1. Funk, C., Hefferon, M., Kennedy, B., & Johnson, C. (2019, August 2). Trust and Mistrust in Americans’ Views of Scientific Experts. Pew Research Center Science & Society.

  2. Politics, science, and public attitudes: What we’re learning, and why it matters. (2021).

  3. Furman, K. (2020). Emotions and Distrust in Science. International Journal of Philosophical Studies, 28(5), 713–730.

  4. Bardon, A. (2019). The Truth About Denial. Oxford University Press.

  5. Correcting the Scientific Record: Retraction Practices in Chemistry and Materials Science. (2019). Chemistry of Materials.

  6. Retracted Scientific Studies: A Growing List (Published 2015). (2021). The New York Times.
