Rosa Rubicondior: Fake News - Why People Believe False Stories and Disinformation

Saturday 25 June 2022

Fake News - Why People Believe False Stories and Disinformation

Why We Fall for Disinformation | Psychology Today

In a report published recently by the US Center for Naval Analyses (CNA), a team of psychologists analysed the reasons why so many people are falling for disinformation. The problem is due to the way we have evolved to deal with information, which has ill-prepared us for the vast amount of information now being directed at us by modern technology.

In fact, some of these time-tested tools make us dangerously vulnerable to disinformation, especially disinformation designed to mislead and garner support for extremist groups for whom the truth would be toxic. We see this today in the form of disinformation about, for example, COVID-19, the measures to reduce its spread and the vaccines designed to protect us from it. We also see it in relation to politics, political movements and parties, international affairs, religious fundamentalism and anti-science propaganda, such as denial of climate change and evolution, and especially in conspiracy theories, such as those promulgated by QAnon and supporters of former President Donald Trump, intended to radicalise, undermine confidence in institutions, and garner support for extreme solutions to non-existent problems.

In other words, disinformation campaigns are designed to benefit those whom the report calls 'malign actors', for whom the truth would be dangerous and who know they need their target marks to believe falsehoods and mistrust the evidence.

In the abstract to their report, the psychologists, Heather Wolters, Kasey Stricklin, Neil Carey, and Megan K. McBride, say:

Abstract

Disinformation is a growing national security concern with a primary impact occurring in the mind. This study explores the underlying psychological principles facilitating the absorption and spread of disinformation and outlines options for countering disinformation grounded in cognitive and social psychological literature. The report serves as an introduction to the topic for policy- and defense decision-makers with no prior background in psychology or disinformation. It first examines what disinformation is and why it is important before turning to the four key psychological principles relevant to this topic. A key finding is that the principles themselves are neutral and normal cognitive processes that malign actors can exploit to achieve their own objectives.
They go on to say:
Today, messages of persuasion are not just on billboards and commercials, but in a host of non-traditional places like in the memes, images and content shared online by friends and family. When viewing an Oreo commercial, we can feel relatively confident that it wants to persuade us of the cookie’s excellence and that the creator is likely Nabisco. The goals of today’s disinformation campaigns are more difficult to discern, and the content creators harder to identify. Few viewers will have any idea of the goal or identity of the creator of a shared meme about COVID-19 vaccines. And since this content appears in less traditional locations, we are less alert to its persuasive elements.
They identify four key mechanisms that make people vulnerable to disinformation:
  • Initial information processing: Our mental processing capacity is limited; we simply cannot deeply attend to all new information we encounter. To manage this problem, our brains take mental shortcuts to incorporate new information. For example, an Iranian-orchestrated disinformation campaign known as Endless Mayfly took advantage of this mental shortcut by creating a series of websites designed to impersonate legitimate and familiar news organizations like The Guardian and Bloomberg News. These look-alike sites were subject to less scrutiny by individual users who saw the familiar logo and assumed that the content was reliable and accurate.
  • Cognitive dissonance: We feel uncomfortable when confronted with two competing ideas, experiencing what psychologists call cognitive dissonance. We are motivated to reduce the dissonance by changing our attitude, ignoring or discounting the contradictory information, or increasing the importance of compatible information. Disinformation spread by the Chinese government following the 2019 protests in Hong Kong took advantage of the human desire to avoid cognitive dissonance by offering citizens a clear and consistent narrative casting the Chinese government in a positive light and depicting Hong Kong’s protestors as terrorists. This narrative, shared via official and unofficial media, protected viewers from feeling the dissonance that might result from trying to reconcile the tensions between the Chinese government’s position and that of the Hong Kong protestors.
  • Influence of group membership, beliefs, and novelty (the GBN model): Not all information is equally valuable to individuals. We are more likely to share information from and with people we consider members of our group, when we believe that it is true, and when the information is novel or urgent. For example, the #CoronaJihad hashtag campaign leveraged the emergence of a brand new disease — one that resulted in global fear and apprehension — to circulate disinformation blaming Indian Muslims for the its [sic] origins and spread.
  • Emotion and arousal: Not all information affects us the same way. Research demonstrates that we pay more attention to information that creates intense emotions or arouses us to act. That means we are more likely to share information if we feel awe, amusement or anxiety than if we feel less-arousing emotions like sadness or contentment. Operation Secondary Infektion, coordinated by the Russians, tried to create discord in Russian adversaries like the U.K. by planting fake news, forged documents and divisive content on topics likely to create intense emotional responses, such as terrorist threats and inflammatory political issues.
The authors go on to recommend certain actions to safeguard against falling for disinformation, most of which will be readily familiar to sceptics and to anyone acquainted with the scientific method, which is designed to catch and filter out false information and false conclusions by looking dispassionately at the evidence and allowing opinions to flow from that evidence.

The actions they recommend, details of which can be read in the report (pages 30-34) are:
  • Bolstering your resistance to disinformation
  • Bolstering resistance to disinformation in others

Most people who have spent any time debating online with creationists and conspiracy theorists will be familiar with the role of cognitive dissonance in explaining why people often believe things that are demonstrably untrue whilst rejecting opposing views that are based on sound, demonstrable evidence. The authors describe the role of cognitive dissonance in the spread and acceptance of fake news and disinformation, with:
Cognitive dissonance theory

Cognitive dissonance happens when a person is confronted with two competing thoughts. For example, a person might simultaneously think the following: Exercise is good for my body; when I exercise, it hurts. It is uncomfortable to hold two competing ideas/beliefs at one time. Therefore, people are motivated to reduce the conflict or remove the dissonance. Dissonance theory describes how people are influenced to either accept or reject beliefs, as well as the information/arguments that accompany those beliefs. The theory includes both cognitive and emotional components. It posits that people feel uncomfortable when they have to reconcile conflicting information. Conflicting information is dissonant, whereas nonconflicting information is consonant, consistent, or compatible…

When information is incompatible with our beliefs, we react in one of four ways: (1) adding new, consonant cognitions, (2) removing the inconsistent information, (3) reducing the importance of opposing information, or (4) increasing the importance of compatible cognitions.80 Festinger’s classic example was of smokers encountering information indicating that smoking was bad for their health. In this case, the smoker has four options:
  1. Change behavior or adopt new attitude (e.g., stop smoking) (adding new, consonant cognitions).
  2. Continue to believe that smoking is not bad for health (remove the incompatible information).
  3. Compare risk from smoking to risk from something worse, such as auto accidents (reducing the importance of opposing information).
  4. Think about the enjoyment of smoking and its good effects, such as possibly assisting with weight control (increasing the importance of compatible information).
The reason conspiracy theorists, creationists, flat earthers, climate change deniers and people who think Donald Trump was a good president expend so much effort building cults and recruiting new cult members can be found in the way group membership aids the spread and acceptance of fake news and disinformation. The authors explain this with:
The Group, Belief, Novelty (GBN) model

Not all information is equally valuable to an individual. Some information (or disinformation) resonates with some people more than others. The Group, Belief, Novelty (GBN) model helps explain the likelihood that someone will pass information (rumors specifically) on to others.91 The theory posits that we accept and share information more readily when it comes from people we know, when it appeals to what “our group” believes, and when we think it is new.

Because the theory was built primarily around rumors (a subset of information), we briefly describe rumors and how they apply to this study on disinformation. Rumors are “unverified and instrumentally relevant information statements in circulation that arise in context of ambiguity, danger, or potential threat and that function to help people make sense of and manage risk.”92 While this can include conspiracy theories, they are outside the scope of this paper. For our purposes, all conspiracy theories are rumors but not all rumors are conspiracy theories. In this section, we focus solely on rumors that are not conspiracy theories. Researchers who developed the model cite empirical evidence that rumors are important to study because they alter purchase behaviors,93 “spark” riots in conflict situations,94 and influence stock market buying and selling.95 In addition, they reference research findings that rumors affect attitudes.96

The GBN model is a “two step agent-based mathematical model of negative rumor spread in the context of conflicting groups”97 that uses concepts and findings from psychology and sociology research as the basis for its equations. According to the first step, the probability of rumor transmission between two people is based on the following three factors:
  1. Group memberships (G) of the receiver and transmitter
  2. Strength of their belief (B) in the rumor
  3. The perceived novelty (N) of the rumor
The second step models how belief (B) levels and the perceived rumor novelty (N) of participants change over time, using findings from the literature on attitude change. The first factor (G) encompasses several findings regarding how group membership affects the sharing of rumors. In the case of negative (derogatory) rumors, people share them with their in-group (people with whom they identify, whether by age, race, gender, occupation, or political leaning). They rarely share rumors with the out-group (people with whom they feel little affiliation). It is not surprising that the rumor target (whether it attacks individuals in the in-group or the out-group) and valence (whether it praises or derogates) affect whether a rumor is shared and spreads.

Rumors that derogate the out-group are called “wedge-driving” rumors because they attempt to drive a wedge between groups. They can also be used for self-enhancing motives (boosting one’s self-esteem and those of the in-group)98 or the desire to increase liking between the spreader and the hearer. Conversely, people rarely share derogatory rumors about the outgroup with people from that out-group.

[…]

Implications of the GBN Model for disinformation

The GBN model posits that group identity, degree of belief in the information, and information’s novelty are powerful factors in whether information gets shared, with whom we share it, and when we share it. Once disinformation gets shared, it can be amplified and further distorted by those who find the information compelling (as in the old “telephone game” sometimes played at group gatherings, where the original story almost invariably gets distorted in the retelling). These factors coincide with characteristics of the current information environment, where people can stay connected with their smart phones 24/7, allowing people to feel that (1) they are never alone, (2) their voice will always be heard, and (3) they can put their attention anywhere they want to put it.104 In a sense, we live in a 24/7 virtual telephone game. Although these characteristics of the information environment can speed up rumor spread, the GBN model provides areas where the telephone game can be interrupted, slowed, or corrected.
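The report describes the GBN model as a two-step agent-based mathematical model, but the quoted passages do not reproduce its actual equations. The toy simulation below is therefore purely illustrative: the multiplicative combination of the three factors, the in-group/out-group weights, the novelty decay rule and all parameter values are assumptions, not the report's formulas. It captures only the qualitative claim that rumors spread readily within an in-group and rarely cross to the out-group, and that perceived novelty fades over time.

```python
import random

# Illustrative sketch only. The transmission probability simply multiplies
# the three GBN factors; the weights 0.8 and 0.05 are invented for the demo.

def share_probability(same_group, belief, novelty,
                      in_group_boost=0.8, out_group_base=0.05):
    """Step 1: probability that a transmitter passes a rumor to a receiver.

    same_group: True if receiver shares the transmitter's group (G)
    belief:     transmitter's belief in the rumor, 0..1 (B)
    novelty:    perceived novelty of the rumor, 0..1 (N)
    """
    group_factor = in_group_boost if same_group else out_group_base
    return group_factor * belief * novelty

def step(agents, rumor_novelty, decay=0.9):
    """One round: each believer may share with each non-believer.

    Step 2 of the model (attitude change over time) is crudely approximated
    by decaying the rumor's novelty each round.
    """
    converted = set()
    for t in agents:
        if not t["believes"]:
            continue
        for r in agents:
            if r is t or r["believes"]:
                continue
            p = share_probability(t["group"] == r["group"],
                                  t["belief"], rumor_novelty)
            if random.random() < p:
                converted.add(id(r))
    for a in agents:
        if id(a) in converted:
            a["believes"] = True
            a["belief"] = 0.7  # assumed belief level on first acceptance
    return rumor_novelty * decay

random.seed(1)
agents = [{"group": g, "believes": False, "belief": 0.0}
          for g in ("A",) * 10 + ("B",) * 10]
agents[0].update(believes=True, belief=0.9)  # seed the rumor in group A

novelty = 1.0
for _ in range(5):
    novelty = step(agents, novelty)

count_A = sum(a["believes"] for a in agents if a["group"] == "A")
count_B = sum(a["believes"] for a in agents if a["group"] == "B")
```

Under these assumed parameters the rumor typically saturates the seeded group A within a few rounds while only occasionally leaking into group B, which loosely mirrors the in-group sharing and "wedge-driving" dynamics the report describes.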
False information, then, can be accepted through a combination of cognitive dissonance and the fact that belief and acceptance require far less intellectual effort than scepticism and fact-checking, especially if the false information is consistent with prior beliefs (the rabbit-hole effect, where a person’s view of reality is conditioned by the deliberate exclusion of unwanted information). It is then reinforced by group affiliation; in other words, by formal or informal membership of a cult of like-minded people. The fact that other members of the group also believe the same false information makes it more likely to be believed and passed on, especially if the group is centred on believing that false information in the first place. We see this with QAnon, Trumpanzee and creationist cults, for example, where disbelief, or even expressed doubt, can be met with ostracism, online abuse and threats, and where kudos is gained by making the retelling more lurid and sensational.

And we see this in religious fundamentalism, where plainly irrational and evidence-free beliefs are tenaciously held onto and defended for no better reason than that they are consistent with previously held beliefs inherited from the prevailing culture, because friends and family also believe them and because doubt, dissent and disbelief might well have dire social consequences if not actual physical ones.


