The psychology of misinformation (the mental shortcuts, confusions, and illusions that encourage us to believe things that aren’t true) can tell us a lot about how to prevent its harmful effects. Our psychology affects whether corrections work, what we should teach in media literacy courses, and why we’re vulnerable to misinformation in the first place. It also offers a fascinating insight into the human brain.
Though psychological concepts originate in academia, many have found their way into everyday language. Cognitive dissonance, first described in 1957, is one; confirmation bias is another. And this is part of the problem. Just as we have armchair epidemiologists, we can easily become armchair cognitive scientists, and mischaracterization of these concepts can create new forms of misinformation.
If reporters, fact checkers, researchers, technologists, and influencers working with misinformation (which, let’s face it, is almost all of them) don’t understand these distinctions, it isn’t simply a case of mistaking an obscure academic term. It risks becoming part of the problem.
Motivated reasoning
Convincing yourself. When people use their reasoning skills to believe what they want to believe, rather than to determine the truth. The crucial point is that people’s rational faculties, rather than lazy or irrational thinking, can cause misinformed belief.
A competing view, associated with psychologists Gordon Pennycook and David Rand, holds that people simply aren’t being analytical enough when they encounter information:
“One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.”
On this view, lazy thinking, not motivated reasoning, is the key factor in our psychological vulnerability to misinformation.
Pluralistic ignorance
A lack of understanding about what others in society think and believe. It can make people incorrectly assume that a political view, for example, is held by a majority when it is in fact held by very few people. This can be made worse by rebuttals of misinformation (e.g., conspiracy theories), as they can make those views seem more popular than they really are.
A variant of this is the false consensus effect: when people overestimate how many other people share their views.
Bullshit receptivity
How receptive you are to information created with little concern for the truth; a meaningless cliché, for example. Bullshit is different from a lie, which intentionally contradicts the truth.
Pennycook and Rand used a bullshit receptivity test to examine susceptibility to false news headlines. They found that the more likely we are to accept a pseudo-profound sentence (i.e., bullshit) such as “Hidden meaning transforms unparalleled abstract beauty,” the more susceptible we are to false headlines.
This provides evidence for Pennycook and Rand’s broader theory that susceptibility to false news comes from insufficient analytical thinking, rather than motivated reasoning. In other words, we’re too stuck in automatic System 1 thinking, and not enough in analytic System 2 thinking.
Third-person effect
People tend to assume misinformation affects other people more than themselves.
It was recently found that there is a significant third-person effect in people’s perceived ability to spot misinformation: people rate themselves as better at identifying misinformation than others. This means people can underestimate their own vulnerability and may not take appropriate precautions.
Dual process theory
We have two basic ways of thinking: System 1, an automatic process that requires little effort; and System 2, an analytical process that requires more effort. Because we are cognitive misers, we generally will use System 1 thinking (the easy one) when we think we can get away with it.
Automatic processing creates the risk of misinformation for two reasons. First, the easier something is to process, the more likely we are to think it’s true, so quick, easy judgments often feel right even when they aren’t. Second, its efficiency can miss details — sometimes crucial ones. For example, you might recall something you read on the internet, but forget that it was debunked.
Confirmation bias
The tendency to believe information that confirms your existing beliefs, and to reject information that contradicts them. Disinformation actors can exploit this tendency to amplify existing beliefs.
Fluency
Easy information processing. People are more likely to believe something to be true if they can process it fluently — it feels right, and so seems true.
This is why repetition is so powerful: if you’ve heard something before, you process it more easily, and are therefore more likely to believe it. Repeat it multiple times and you increase the effect. So even if you’ve only heard a claim as part of a debunk, the sheer repetition of the original claim can make it more familiar, fluent, and believable.
It also means that easy-to-understand information is more believable, because it’s processed more fluently. For example, the same statement is more likely to be judged as true when it is printed in high- rather than low-color contrast … presented in a rhyming rather than non-rhyming form … or delivered in a familiar rather than unfamiliar accent … Moreover, misleading questions are less likely to be recognized as such when printed in an easy-to-read font.
Heuristics
Indicators we use to make quick judgments. We use heuristics because it’s easier than conducting complex analysis, especially on the internet where there’s a lot of information.
The problem with heuristics is that they often lead to incorrect conclusions. For example, you might rely on a ‘social endorsement heuristic’ — that someone you trust has endorsed (e.g., retweeted) a post on social media — to judge how trustworthy it is. But however much you trust that person, it’s not a completely reliable indicator and could lead you to believe something that isn’t true.
As our co-founder and US director Claire Wardle explains in our Essential Guide to Understanding Information Disorder, “On social media, the heuristics (the mental shortcuts we use to make sense of the world) are missing. Unlike in a newspaper where you understand what section of the paper you are looking at and see visual cues which show you’re in the opinion section or the cartoon section, this isn’t the case online.”
Cognitive miserliness
The psychological feature that makes us most vulnerable to misinformation is that we are ‘cognitive misers’: we prefer simpler, easier ways of solving problems over ones requiring more thought and effort. We’ve evolved to use as little mental effort as possible.
This is part of what makes our brains so efficient: You don’t want to be thinking really hard about every single thing. But it also means we don’t put enough thought into things when we need to — for example, when thinking about whether something we see online is true.
This piece draws for the most part on First Draft News and its author Tommy Shane.
CAN YOU SENSE NUTRIENTS?
White-crowned sparrows and deer can. These animals can sense whether or not their food contains the nutrients their bodies need, and they typically crave foods containing amino acids, usually the ones their bodies can’t produce naturally.
CAN YOU SENSE YOUR WAY BACK HOME?
You know you have worked really hard when you can’t find your way back home; at least worker honeybees would agree. These bees find their way home thanks to the paramagnetic iron oxide in their abdomens, which shrink or swell in response to changes in the earth’s magnetic field.
CAN YOU DETECT INFRARED RADIATION?
Jewel beetles can. With the earth more susceptible to forest fires than ever, we may not be living in the safest of times. Taking precautions is one thing, but imagine being able to actually sense the onset of a fire; jewel beetles can. These tiny beings are capable of sensing a fire from as far as 50 miles away. That’s not all: they also put this sense to good use, seeking out recently scorched areas for mating.