Saturday 20 August 2022

Why Scientific Evidence Doesn't Change a Fundamentalist's or Conspiracist's Mind

Fig. 2. Belief networks and development of interdependence over measurements.
The networks are shown for GM food (A) and childhood vaccines (B) and include moral beliefs (orange nodes) and social beliefs (green nodes). The ties represent the partial correlations between two beliefs controlled for all other beliefs. Blue (red) ties represent positive (negative) correlations, and the widths of the ties correspond to the strength of the correlations. The strength of the ties ranged from 0.02 (between the beliefs “Chi” and “Fam”) to 0.30 (between the beliefs “Med” and “Sci”) for GM food and from 0.02 (between the beliefs “Com” and “Jou”) to 0.28 (between the beliefs “OnE” and “OnC”), N = 979.
Study: new model for predicting belief change | Santa Fe Institute

Two researchers at the Santa Fe Institute in Santa Fe, New Mexico, USA, postdoctoral fellows Jonas Dalege and Tamara van der Does, have developed a model to predict whether a person is likely to change their beliefs when presented with evidence-based information.

Anyone who has ever tried debating on social media with Creationists, antivaxxers, QAnon cultists or people who believe Donald Trump won the 2020 presidential election will be aware that people holding these counter-factual beliefs are almost impossible to shift from their positions, no matter how strong the evidence presented to them.

The problem is our old friend, cognitive dissonance. Briefly, cognitive dissonance is the conflict generated when a firmly held belief meets contrary evidence. The result is emotional discomfort, sometimes amounting to a perceived threat, which needs to be resolved one way or another.

Cognitive Dissonance
In the field of psychology, cognitive dissonance is the perception of contradictory information. Relevant items of information include a person's actions, feelings, ideas, beliefs, values, and things in the environment. Cognitive dissonance is typically experienced as psychological stress when persons participate in an action that goes against one or more of those things.[1] According to this theory, when two actions or ideas are not psychologically consistent with each other, people do all in their power to change them until they become consistent.[1][2] The discomfort is triggered by the person's belief clashing with new information perceived, wherein the individual tries to find a way to resolve the contradiction to reduce their discomfort.[1][2][3]

Of course, the intellectually honest thing to do would be to evaluate the evidence and, if it warrants it, change one's mind. However, many fundamentalists, Creationists and conspiracy theorists have invested too much in their beliefs to change their minds that easily; their belief may have become part of their identity. If they had adopted a strategy of intellectual honesty and had the humility to let the evidence lead their beliefs wherever it went, in other words, if their opinions had been evidence-based, they would not be fundamentalists, Creationists, etc. in the first place.

To test the likelihood that such people could be persuaded to change their minds when presented with evidence that contradicts their beliefs, the Santa Fe researchers built on previous efforts by integrating both moral and social beliefs into a statistical physics framework of 20 interacting beliefs.
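The flavour of such a statistical-physics framework can be illustrated with a toy sketch. In Ising-style belief-network models, each belief is a node taking a value (endorse or reject), the ties between beliefs are weights, and "dissonance" corresponds to the network's energy: a positive tie between two beliefs held in opposite directions raises the energy. The beliefs, labels and tie strengths below are invented for illustration and are not the paper's estimates.

```python
import numpy as np

def network_dissonance(beliefs, weights):
    """Ising-style dissonance of a belief network.

    beliefs: vector of belief states in {-1, +1} (reject / endorse).
    weights: symmetric matrix of tie strengths (positive weight means
             the two beliefs 'want' to point the same way).
    Lower energy means less dissonance.
    """
    return -0.5 * beliefs @ weights @ beliefs

# Toy 3-belief network (invented numbers):
# 0: "scientists are trustworthy", 1: "vaccines are safe",
# 2: "my family distrusts vaccines"
W = np.array([[0.00, 0.30,  0.00],
              [0.30, 0.00, -0.25],
              [0.00, -0.25, 0.00]])

consonant = np.array([1, 1, -1])   # every tie satisfied: low dissonance
dissonant = np.array([1, -1, -1])  # trusts science yet rejects vaccines

print(network_dissonance(consonant, W))  # about -0.55
print(network_dissonance(dissonant, W))  # about +0.55
```

The state that trusts scientists but rejects vaccines violates both ties and sits at higher energy, which is the model's formal stand-in for the discomfort described above.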

And what they found was rather surprising, although easy to understand. Those with the most dissonance are the most likely to revise their opinions, but not always in the direction the facts point them. For example, if they have been convinced that scientists are dishonest or part of some vast conspiracy, the contrary evidence can be dismissed as fake or intended to mislead, so their counter-factual opinion can actually be strengthened. After all, why would the scientists be trying so hard to change their minds if there were nothing to the conspiracy? As the Santa Fe Institute news release explains:
[The researchers] then used this cognitive network model to predict how the beliefs of a group of nearly 1,000 people, who were at least somewhat skeptical about the efficacy of genetically modified foods and childhood vaccines, would change as the result of an educational intervention.

For example, if you believe that scientists are inherently trustworthy, but your family and friends tell you that vaccines are unsafe, this is going to create some dissonance in your mind. We found that if you were already kind of anti-GM foods or vaccines to begin with, you would just move more towards that direction when presented with new information even if that wasn’t the intention of the intervention.

Dr. Tamara van der Does, co-author
Santa Fe Institute
Santa Fe, NM, USA.

On the one hand you might want to target people who have some dissonance in their beliefs, but at the same time this also creates some danger that they will reduce their dissonance in a way that you didn’t want them to. Moving forward, we want to expand this research to see if we can learn more about why people take certain paths to reduce their dissonance.

Dr. Jonas Dalege, co-author
Santa Fe Institute
Santa Fe, NM, USA.
Study participants were shown a message about the scientific consensus on genetic modification and vaccines. Those who began the study with a lot of dissonance in their interwoven network of beliefs were more likely to change their beliefs after viewing the messaging, but not necessarily in accordance with the message. On the other hand, people with little dissonance showed little change following the intervention.

While still in an early stage, the research could ultimately have important implications for communicating scientific, evidence-based information to the public.
Copyright: © 2022 The authors.
Published by American Association for the Advancement of Science. Open access. (CC BY 4.0)
The authors give more details in the abstract to their published paper:

Skepticism toward childhood vaccines and genetically modified food has grown despite scientific evidence of their safety. Beliefs about scientific issues are difficult to change because they are entrenched within many interrelated moral concerns and beliefs about what others think. We propose a cognitive network model that estimates network ties between all interrelated beliefs to calculate the overall dissonance and interdependence. Using a probabilistic nationally representative longitudinal study, we test whether our model can be used to predict belief change and find support for our model’s predictions: High network dissonance predicts subsequent belief change, and people are driven toward lower network dissonance. We show the advantages of measuring dissonance using the belief network structure compared to traditional measures. This study is the first to combine a unifying predictive model with an experimental intervention and to shed light on the dynamics of dissonance reduction leading to belief change.
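The abstract's central prediction, that high network dissonance drives subsequent belief change toward lower dissonance, can be sketched as a simple relaxation process. This is a toy illustration of dissonance reduction, not the authors' estimation procedure, and the four-belief chain network is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def dissonance(x, W):
    """Ising-style network dissonance: unsatisfied ties raise the energy."""
    return -0.5 * x @ W @ x

def relax(x, W, steps=100):
    """Flip one belief at a time, keeping a flip only if it lowers dissonance."""
    x = x.copy()
    for _ in range(steps):
        i = rng.integers(len(x))
        flipped = x.copy()
        flipped[i] *= -1
        if dissonance(flipped, W) < dissonance(x, W):
            x = flipped
    return x

# Invented 4-belief network: positive ties along a chain, so the
# beliefs 'want' to line up with their neighbours.
W = np.zeros((4, 4))
for i in range(3):
    W[i, i + 1] = W[i + 1, i] = 0.2

start = np.array([1, -1, 1, -1])  # alternating: every tie violated
end = relax(start, W)
print(dissonance(start, W), dissonance(end, W))
```

The high-dissonance starting state changes readily, and each accepted flip moves the network to lower dissonance; a state that already satisfies its ties would barely change at all, mirroring the finding that low-dissonance participants showed little movement. Note that nothing in the process guarantees the final state agrees with the evidence, only that it is internally consonant.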

It seems the priority for people like Creationists, fundamentalists, antivaxxers, climate-change deniers and conspiracy theorists, who hold counter-factual beliefs, is not so much being right as reducing the cognitive dissonance caused by their beliefs running counter to the evidence. Their main concern is to find a way to ignore the evidence rather than to accept it and allow it to modify beliefs.
When the dogma is sacred, truth must be ignored.


  1. Unfortunately we have, as yet, developed no good strategies for changing the inflexible mind. Perhaps we should try to persuade people to review and improve their original sources of information (convictions) rather than debate the facts? Instead of trying to replace one inflexible conviction with logic or evidence, replace it with a slightly less extreme conviction.


