20 June 2024
Boosting Trust in Science: Bridging the Divide


Our latest findings reveal that during the pandemic a third of UK citizens reported increased trust in science, while 7% said their trust had fallen. Understanding what drives this divide could be key to strengthening public confidence in science.

Why some people don’t trust science—and how to change their minds


During the pandemic, a third of people in the UK reported that their trust in science had increased, we recently discovered. But 7% said that it had decreased. Why is there such a variety of responses?

The deficit of knowledge and trust

For many years, it was thought that the main reason some people rejected science was a simple deficit of knowledge and an accompanying fear of the unknown. Consistent with this, many surveys reported that attitudes to science are more positive among people who know more about textbook science. But if that were indeed the core problem, the remedy would be simple: inform people about the facts. This strategy, which dominated science communication through much of the latter part of the 20th century, has, however, failed at multiple levels.

The problem of confirmation bias

In controlled experiments, giving people scientific information was found not to change attitudes. And in the UK, scientific messaging over genetically modified technologies has even backfired. The failure of the information-led strategy may be down to people discounting or avoiding information if it contradicts their beliefs—also known as confirmation bias. However, a second problem is that some trust neither the message nor the messenger. This means that distrust in science isn’t necessarily just down to a deficit of knowledge, but a deficit of trust.

The role of overconfidence and conspiracy theories

Recent evidence has revealed that people who reject or distrust science are not especially well informed about it, but more importantly, they typically believe that they do understand the science. This result has, over the past five years, been found over and over in studies investigating attitudes to a plethora of scientific issues, including vaccines and GM foods. It also holds, we discovered, even when no specific technology is asked about. These findings may not apply, however, to certain politicized sciences, such as climate change.

Recent work also found that overconfident people who dislike science tend to have a misguided belief that theirs is the common viewpoint and hence that many others agree with them. Other evidence suggests that some of those who reject science also gain psychological satisfaction by framing their alternative explanations in a manner that can’t be disproven. Such is often the nature of conspiracy theories—be it microchips in vaccines or COVID being caused by 5G radiation.

The challenge of engaging with skeptics

But the whole point of science is to examine and test theories that can be proven wrong—theories scientists call falsifiable. Conspiracy theorists, on the other hand, often reject information that doesn’t align with their preferred explanation by, as a last resort, questioning instead the motives of the messenger. When a person who trusts the scientific method debates with someone who doesn’t, they are essentially playing by different rules of engagement. This means it is hard to convince skeptics that they might be wrong.

The importance of the messenger and the consensus position

So what can we do with this new understanding of attitudes to science? The messenger is every bit as important as the message. Our work confirms many prior surveys showing that politicians, for example, aren't trusted to communicate science, whereas university professors are. This is worth bearing in mind when deciding who should deliver scientific messages.

The fact that some people hold negative attitudes reinforced by a misguided belief that many others agree with them suggests a further potential strategy: tell people what the consensus position is. The advertising industry got there first. Statements such as "eight out of ten cat owners say their pet prefers this brand of cat food" are popular for a reason. A recent meta-analysis of 43 studies investigating this strategy (all randomized controlled trials, the gold standard in scientific testing) found that it can shift beliefs about scientific facts. Specifying the consensus position also implicitly clarifies what counts as misinformation or an unsupported idea, which would help address the problem that half of people don't know what is true owing to the circulation of conflicting evidence.

Prebunking and addressing uncertainty

A complementary approach is to prepare people for the possibility of misinformation. Misinformation spreads fast, and, unfortunately, each attempt to debunk it acts to bring the misinformation more into view. Scientists call this the “continued influence effect”. Genies never get put back into bottles. Better is to anticipate objections, or inoculate people against the strategies used to promote misinformation. This is called “prebunking”, as opposed to debunking.

Different strategies may be needed in different contexts: it matters whether the science in question is established by expert consensus, such as climate change, or is cutting-edge research into the unknown, such as the study of a completely new virus. For the latter, explaining what we know, what we don't know, and what we are doing, while emphasizing that results are provisional, is a good way to go. By emphasizing uncertainty in fast-changing fields, we can prebunk the objection that a sender of a message cannot be trusted because they said one thing one day and something else later.

Reaching out to the disengaged

But no strategy is likely to be 100% effective. We found that even with the widely debated PCR tests for COVID, 30% of the public said they hadn't heard of PCR. A common quandary for much science communication may, in fact, be that it appeals to those already engaged with science, which may be why you are reading this. That said, the new science of communication suggests it is certainly worth trying to reach out to those who are disengaged.

SOURCE: Why some people don’t trust science—and how to change their minds



1. Why is there such a variety of responses in people’s trust in science?

The variety of responses in people’s trust in science can be attributed to several factors, including a deficit of knowledge, confirmation bias, distrust in the message and the messenger, overconfidence, and the influence of conspiracy theories.

2. Why does giving people scientific information not change their attitudes?

Providing people with scientific information does not always change their attitudes because they may discount or avoid information that contradicts their beliefs, a cognitive bias known as confirmation bias. Additionally, some individuals may not trust the message or the messenger, leading to a deficit of trust in science.

3. Why do some people reject or distrust science despite having a limited understanding of it?

Studies have found that individuals who reject or distrust science often believe that they understand the science, even if their knowledge is limited. This overconfidence in their understanding may contribute to their negative attitudes towards scientific issues.

4. How can we engage with skeptics and convince them that they might be wrong?

Engaging with skeptics who reject the scientific method can be challenging because they often question the motives of the messenger instead of considering the evidence. It is important to recognize that skeptics and proponents of science may be playing by different rules of engagement. Finding common ground and presenting evidence in a clear and unbiased manner may help in convincing skeptics to reconsider their views.

5. Why is the messenger as important as the message in science communication?

Research suggests that the messenger plays a significant role in science communication. Politicians are often not trusted to communicate science, while university professors are viewed as more trustworthy. When communicating scientific information, it is important to consider the credibility and expertise of the messenger to enhance trust and acceptance of the message.

