
Monday, 11 September 2023

Conspiracy Loon News - Why Some People Fall For Wackadoodle Conspiracy Theories


They fall more easily for conspiracy theories - Linköping University

In the last 20 years or so, two trends have featured prominently in western culture, especially in the United States, and recent research has shown how they are linked.
  1. Democratization of opinion: The conflation of the belief that everyone is entitled to their opinion on every subject under the sun, as guaranteed to Americans by their constitution, with the belief that every opinion should therefore carry equal weight in a debate, regardless of the evidence (or lack of it) on which it is based, or the subject expertise of the person voicing it.

    The attractiveness of this belief to the intellectually lazy, and to those who feel alienated by the political and economic forces that shape their lives, is that they can tell themselves they are at least the equal of the experts, and very probably their betters.

    For example, I was recently castigated on social media when I disagreed with the claim that "everyone needs Jesus because without Him life is meaningless". I was informed that "This is America (it was actually Facebook!) where we are entitled to our opinions!", as though a constitutional right in one country obliges the rest of the world to respect dogma and regard it as a statement of irrefutable truth, with the implication that no-one has a right to disagree. In other words, the constitution guarantees my right to my opinion (but not your right to yours).

    Similar false claims are made by creationists daily on social media, accompanied by indignation when challenged. Creationists who couldn't define the terms 'evolution' or 'kind', and who assiduously maintain their scientific ignorance, will confidently inform the world that the millions of highly qualified working biomedical scientists who have no difficulty with the science have it all wrong, and should listen to the creationist who knows best, having completed a 15-minute Google University degree in creationism. And they're all part of a gigantic Satanic conspiracy anyway.
  2. Growth in conspiracy theories: Beliefs that usually boil down to the paranoia that the world is being run by secretive (often Jewish or Satanic, or more recently, paedophile) organizations seeking to "take away our freedoms".

    These played, and are still playing, a major part in American politics. They were behind, firstly, the election of Donald Trump as President (to "save America" from foreigners and the "political elite" who run the "Deep State"), and latterly the belief that Joe Biden and the Democrats, in league with "Deep State Satanic paedophiles [add other hate figures to the list as necessary]", stole the 2020 election, which Trump actually lost heavily. That belief rests on nothing more substantial than Trump's false claim that he "won by a lot", for which not a shred of evidence has ever been produced.

    A particularly pernicious conspiracy theory, promulgated by the Trump Cult, was that the COVID-19 pandemic was a hoax and part of a conspiracy by Bill Gates, et al., to force people to be vaccinated with a vaccine which, according to different conspiracists, injected microchips to take control of you, or altered your DNA to turn you gay or stop you believing in Jesus; or that the pandemic was merely an excuse to close the churches, etc., etc.

    Like creationism, the conspiracy theories swirling around, mostly in the USA, have become impossible to parody, because just when you think they couldn't get more ludicrous, someone proves you wrong. For example, one nutjob recently informed her Facebook followers, in a post since taken down, that the vaccines have Ebola baked into them, programmed to be triggered by three pulses of a certain frequency via the 5G phone network on a specific date in the near future. How on earth you activate an embedded virus with radio-wave pulses remains a mystery.
The research also explains why conspiracy theorists often cling to their beliefs despite objective evidence that falsifies them, like the 19th-century Millerite cult that became convinced Jesus was about to return to a hilltop in the USA (where else?) on a specific date. When this failed to happen, the failure became evidence that the cult was right: Jesus had simply delayed the date to give them more time to convert people. So they became the Seventh-day Adventist cult.

Often, the lack of any objective evidence is regarded as evidence of the truth of the conspiracy because "someone must have hidden it!"

First, how do we (i.e., people interested in truth) know what is true and what isn't?
How do we know what's true and what isn't?

Determining what is true and what isn't can be a complex and multifaceted process. It often depends on the context and the type of claim or statement being evaluated. Here are some common methods and considerations for assessing truth and falsehood:
  1. Empirical Evidence: Empirical evidence is information that can be observed, measured, and tested. Scientific methods rely heavily on empirical evidence to establish the truth of hypotheses and theories. Observations, experiments, and data analysis are used to support or refute claims.
  2. Logical Reasoning: Logical reasoning involves evaluating the coherence and consistency of a statement or argument. Deductive reasoning uses established premises to draw valid conclusions, while inductive reasoning generalizes from specific observations to make probabilistic conclusions.
  3. Expert Consensus: In many fields, experts with relevant knowledge and experience reach a consensus about what is considered true based on their collective expertise. For example, scientific consensus forms the basis for many established scientific facts.
  4. Peer Review: Peer review is a process where experts in a particular field evaluate the validity and quality of research or claims before they are published. This helps ensure that the information is accurate and reliable.
  5. Critical Thinking: Critical thinking involves questioning and analyzing information, especially when it lacks clear evidence or comes from unreliable sources. Evaluating the credibility of sources and examining potential biases is an important aspect of critical thinking.
  6. Reliable Sources: Trusted and credible sources, such as reputable news organizations, academic journals, and government agencies, are generally more reliable for obtaining accurate information compared to less reputable or biased sources.
  7. Fact-Checking: Fact-checking organizations and websites specialize in verifying the accuracy of claims made in the media and public discourse. They use a rigorous process to assess the truthfulness of statements.
  8. Cross-Verification: Cross-referencing information from multiple sources can help confirm its accuracy. If multiple independent sources corroborate a claim, it is more likely to be true.
  9. Skepticism: Maintaining a healthy level of skepticism means not accepting claims at face value and demanding evidence or justification before accepting something as true.
  10. Context: Understanding the context in which a statement is made is crucial. The same statement may be true in one context but false in another.
  11. Personal Experience: Personal experience can sometimes provide insight into what is true, but it should be considered alongside other forms of evidence, as individual experiences can be subjective and biased.
  12. Time and Reevaluation: The truth of a statement may change over time as new evidence emerges or as our understanding of a topic evolves. Continuously reevaluating information is essential.
It's important to note that determining truth is not always straightforward, and some claims may remain uncertain or subject to ongoing debate. Critical thinking, skepticism, and a reliance on credible sources are key tools for navigating the complex landscape of truth and falsehood. Additionally, being open to revising one's beliefs in light of new evidence is a fundamental aspect of the pursuit of truth.


The research which shows the link between the democratization of opinion and the growth in conspiracism was conducted by three researchers at the Department of Behavioural Sciences and Learning, Linköping University, Linköping, Sweden, who were investigating the link between "truth relativism" and the risk of falling victim to incorrect or fraudulent information.

Their research is published, open access, in the Journal of Research in Personality. It is explained in a Linköping University news release by Jonas Roslund:
People who primarily use their own gut feeling to determine what is true and false are more likely to believe conspiracy theories. That is the conclusion of researchers at Linköping University who have investigated the relationship between susceptibility to misleading information and the conviction that the truth is relative.

I think many people who emphasise a more relativistic view of what truth is mean well. They believe that it’s important that everyone should be able to make their voice heard. But these results show that such a view can actually be quite dangerous.

Julia Aspernäs, first author
Department of Behavioural Sciences and Learning
Linköping University, Linköping, Sweden.
In two studies reported in an article in the Journal of Research in Personality, she and two colleagues have investigated the relationship between so-called truth relativism and the risk of falling victim to incorrect or fraudulent information.

Two types of truth relativism

The first study involves approximately one thousand Swedes. In an online survey, participants were asked to answer questions about their views on what truth is. They then had to take a position on various conspiracy theories and also assess the content of a number of nonsense sentences.

The researchers also collected information on factors previously found to be related to belief in misleading information, such as the ability to reason analytically, political orientation, age, gender and educational level.

In the second study, more than 400 people from the UK participated. Here the number of questions was expanded and the participants’ degree of dogmatism and willingness to adapt their perceptions when faced with new facts were also measured.

From the material, the researchers unearthed two types of truth relativism. One that comprises those who are convinced that what you personally feel to be true is true, that is to say, that truth is subjective. And one including those who believe that truth depends on which culture or group you belong to, so-called cultural relativism.

Fact-resistant and dogmatic

I got the idea when listening to debates about whether students should learn factual knowledge or be encouraged to themselves seek out what they think is true. It sounded like the debaters had completely opposite assumptions about what truth is and argued that their own approach was the best way to help students become critical thinkers. Although our study did not investigate causality, we see that truth relativism seems to be linked to a greater belief in misleading information. It may be important to keep that in mind.

Julia Aspernäs
The results clearly show that those who believe that the truth is subjective are more likely to believe conspiracy theories and to hold on to their beliefs even when faced with facts that contradict them. They also have a greater tendency to find profound messages in nonsense sentences. Even when the researchers investigated other possible explanations, such as the ability for analytical thinking or political orientation, subjectivism remained as an independent, explanatory factor.

The connections were not as clear for those who believe that truth is culture-bound and the results there point partly in different directions.

To the researchers’ surprise, the data collection from the UK also showed a link between subjectivism and dogmatism. Thus, someone who claims that the truth is personal can, paradoxically, often at the same time reject other people’s right to their own truth.

Reflected in the political debate

Julia Aspernäs thinks that the results are useful when listening to political debates, such as those concerning schooling. People may have different opinions on matters of fact, but behind this may lie a fundamental disagreement about how the world works and what even exists.
The results of the two studies are summarised in the abstract to the paper:
Highlights

  • We find two forms of truth relativism: subjectivism and cultural relativism.
  • Subjectivism yields higher receptivity to misinformation than cultural relativism.
  • Subjectivism predicts receptivity to misinformation over and above other predictors.
  • Cultural relativism is positively related to bullshit receptivity.

Abstract

This research investigated whether belief in truth relativism yields higher receptivity to misinformation. Two studies with representative samples from Sweden (Study 1, N = 1005) and the UK (Study 2, N = 417) disentangled two forms of truth relativism: subjectivism (truth is relative to subjective intuitions) and cultural relativism (truth is relative to cultural context). In Study 1, subjectivism was more strongly associated with receptivity to pseudo-profound bullshit and conspiracy theories than cultural relativism was. In Study 2 (preregistered), subjectivism predicted higher receptivity to both forms of misinformation over and above effects of analytical and actively open-minded thinking, profoundness receptivity, ideology, and demographics; the unique effects of cultural relativism were in the opposite direction (Study 1) or non-significant (Study 2).
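As a purely illustrative aside, the phrase "over and above effects of" in the abstract refers to a standard multiple-regression check: the relativism measures are entered alongside the control variables, so each coefficient reflects that predictor's unique contribution. The sketch below is not the authors' analysis; the variable names, simulated data and effect sizes are all invented simply to show the general technique.

```python
# Hypothetical sketch of the kind of multiple-regression check described in
# the abstract: does subjectivism predict conspiracy belief "over and above"
# other known predictors? All variables and data here are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 417  # roughly the size of Study 2's UK sample

# Simulated questionnaire scores (the real study used survey scales)
df = pd.DataFrame({
    "subjectivism":        rng.normal(0, 1, n),
    "cultural_relativism": rng.normal(0, 1, n),
    "analytical_thinking": rng.normal(0, 1, n),
    "open_minded":         rng.normal(0, 1, n),
    "ideology":            rng.normal(0, 1, n),
    "age":                 rng.integers(18, 75, n),
})
# Fake outcome: conspiracy belief partly driven by subjectivism
df["conspiracy_belief"] = (
    0.4 * df["subjectivism"] - 0.2 * df["analytical_thinking"]
    + rng.normal(0, 1, n)
)

# Because the covariates are entered in the same model, the coefficient on
# subjectivism reflects its unique ("over and above") association.
model = smf.ols(
    "conspiracy_belief ~ subjectivism + cultural_relativism + "
    "analytical_thinking + open_minded + ideology + age",
    data=df,
).fit()
print(model.summary())
```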
What strongly emerges from this study is that people who have a subjective view of truth can easily fall for conspiracy theories and bullshit 'deepities' which on examination are not even shallow, such as those espoused by people like Deepak Chopra and other merchants of vacuous woo. They will also fall for claims that 'feel' right or that appeal to ideology and prejudice.

Once fooled, they become difficult to dissuade, regarding contradictory evidence as irrelevant or even as part of the conspiracy; indeed, contradictory evidence can reinforce their convictions. Even more dangerously for society, they can come to regard people with different opinions as traitors and a danger to be controlled.

So we end up with an attempted coup d’état in the USA by nutjobs who have been fooled by conspiracy theorists playing on their sense of being left behind and insufficiently privileged. A coup d’état based not on objective evidence but on the conviction that, if they believe it, it must be true, and that if democracy didn't deliver what they think it should have delivered, then it must be dismantled and replaced by a compliant system accountable only to them.


