Why do our brains respond to misinformation?
In the age of social media, we are no longer passive recipients of content delivered through one-directional channels such as television, radio or newspapers, but active audiences who co-create the message and generate the buzz. As self-styled messengers and citizen journalists, we have an unprecedented ability to accrue knowledge and influence our communities. And yet, despite clearly articulated and widely available scientific evidence, our political and civic debates and deliberations are oftentimes based not on empirically investigated facts but on our beliefs, biases and identities. Against overwhelming evidence to the contrary, and at great harm to society, ordinary citizens persist in believing and spreading falsehoods about climate change, vaccination, gun violence, genetically modified foods and more.
Existential crises such as pandemics provide fertile ground for false narratives, as ordinary citizens try to make sense of what to believe and disbelieve. Rumors and conspiracy theories spread rapidly from the dark web into social media feeds and infect mainstream discourse with misinformation about public authorities' actions, unscientific treatments or medical advice, and suspicions against governments, powerful people or organizations. Those who believe a false narrative use the persuasive power of their messages and the ubiquity and speed of social media to promote their beliefs and further infect the community. As misinformation spreads and gathers momentum, other individuals are motivated to spread debunking messages in an attempt to inoculate their communities and build resistance against the lure of false information.
A complex mix of environmental, social and individual factors determines how receivers evaluate these persuasive messages. This cognitive process often produces contradictory results. For example, mounting evidence suggests that efforts to debunk misinformation can have the contrary effect of strengthening misperceptions rather than weakening them, depending on the receiver's perspective and context [1,3,6]; sending corrective information to a die-hard anti-vaxxer often reinforces their beliefs instead of overriding them.
This blog post explains how our brains respond to misinformation and why our response is so difficult to predict and control.
When faced with a persuasive message, a dual-system thinking process is triggered [2,9], in which the individual exercises rapid-response heuristic processing and/or deep systematic processing under the influence of a multitude of factors. This complex cognitive process determines the individual's reaction and response to messages from promoters and debunkers, as shown in Figure 1.
Heuristic processing requires very little scrutiny of message content: receivers take advantage of factors embedded within or surrounding a message (called heuristic cues) and use mental shortcuts, or cue-led reasoning, to make rapid-response assessments. Systematic processing requires considerable analysis of the message content, as receivers generate their own cognitive response using argument-led reasoning [2,5,8]. The degree to which individuals use systematic or heuristic processing is determined by their motivation and ability to process the message. Systematic processing requires a high level of motivation and ability to engage in an effortful assessment of the message, while heuristic processing requires little of either and can become automatic as people rapidly access cognitive schemas based on past experiences or observations to make judgments. See Figure 1.
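As a rough illustration only (not a model from the research cited here), the motivation-and-ability logic above can be sketched in a few lines of Python. The function name, scores and threshold are hypothetical, chosen purely to make the idea concrete: heuristic shortcuts are always available, while systematic processing is engaged only when both motivation and ability are high.

```python
def processing_modes(motivation: float, ability: float, threshold: float = 0.5):
    """Illustrative sketch of the heuristic-systematic choice.

    motivation, ability: hypothetical 0..1 scores.
    Systematic (argument-led) processing demands both high motivation
    and high ability; heuristic (cue-led) processing is the
    low-effort default that is always in play.
    """
    modes = ["heuristic"]  # mental shortcuts are always available
    if motivation >= threshold and ability >= threshold:
        modes.append("systematic")  # effortful analysis of the arguments
    return modes

# A motivated, able receiver engages both modes:
print(processing_modes(0.8, 0.9))  # ['heuristic', 'systematic']
# A distracted or unmotivated receiver falls back on shortcuts alone:
print(processing_modes(0.2, 0.9))  # ['heuristic']
```

The sketch deliberately returns heuristic processing in every case, reflecting the point above that cue-led judgments can run automatically even when systematic processing is also engaged.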
The precise mix of heuristic and systematic processing is influenced by multiple factors, sometimes working against each other, resulting in outcomes that may appear contradictory but seem reasonable when considering the individual's internal schemas.
Early research on cognitive responses to misinformation tested questions like: "Are expert sources more persuasive than non-expert sources? Is it better to present people with logical arguments or with emotional appeals? Is inducing fear more effective than inducing hope?" As studies have matured, the focus has shifted to the underlying thought processes and mental architectures by which humans make decisions, and to a recognition that the same variable can have multiple effects, caused by different mental processes, depending on the situation. For example, a good feeling upon reading an article can strongly influence judgment formation regardless of argument quality, and individuals with high confidence in their abilities can form different judgments than individuals with low confidence, given the same argument quality. People also have "thoughts about their thoughts", called metacognitions, that influence judgment formation. Two people might arrive at the same conclusion based on an assessment of a persuasive message, but the person with greater confidence in their thoughts will demonstrate more certainty and strength in belief formation. Local context, such as recent events, time to process and distractions, can also influence both heuristic and systematic thinking.
Understanding why people respond to persuasive messages from promoters or debunkers is a complex undertaking.
It is not sufficient to state that education level, political ideology, age or any other single attribute fosters higher susceptibility to misinformation. The human brain is far too complicated to allow for easy explanations. While only a small fraction of ‘bad actors’ intentionally spread misinformation, the large majority respond to and act on it because of their internal cognitive schemas and how they align with their motivations and abilities.
Developing effective strategies to combat misinformation requires us to understand this complex cognitive process and embed it in our research and solutions.
A version of this paper was co-authored with Ioanna Constantiou (email@example.com) and published at the European Conference on Information Systems 2020. It is available as Human Agency in the Propagation of False Information.
Shama Patel is a PhD student at the Copenhagen Business School, researching individual cognition and collective agency in the belief and spread of misinformation and strategies to combat it. She can be reached at firstname.lastname@example.org or find her on LinkedIn https://www.linkedin.com/in/shamadriveschange/.
1. Boussalis, C. and Coan, T.G. Elite Polarization and Correcting Misinformation in the “Post-Truth Era.” Journal of Applied Research in Memory and Cognition, 6, (2017), 405–408.
2. Chaiken, S. Heuristic versus systematic information processing and the use of source versus message cues in persuasion. Journal of Personality and Social Psychology, 39, 5 (1980), 752–756.
3. Chan, M.S., Jones, C.R., Hall Jamieson, K., and Albarracín, D. Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychological Science, 28, 11 (2017), 1531–1546.
4. Huang, J.Y. and Bargh, J.A. The Selfish Goal: Autonomously operating motivational structures as the proximate cause of human judgment and behavior. Behavioral and Brain Sciences, 37, (2014), 121–175.
5. Kelman, H.C. and Hovland, C.I. “Reinstatement” of the communicator in delayed measurement of opinion change. Journal of Abnormal and Social Psychology, 48, 3 (1953), 327–335.
6. Lewandowsky, S., Ecker, U.K.H., and Cook, J. Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition, 6, 4 (2017), 353–369.
7. Luo, X., Zhang, W., Burd, S., and Seazzu, A. Investigating phishing victimization with the Heuristic-Systematic model: A theoretical framework and an exploration. Computers and Security, 38, (2013), 28–38.
8. Petty, R.E. and Briñol, P. Persuasion: From Single to Multiple to Metacognitive Processes. Perspectives on Psychological Science, 3, 2 (2008), 137–147.
9. Petty, R.E. and Cacioppo, J.T. The Elaboration Likelihood Model of Persuasion. Advances in Experimental Social Psychology, 19, (1986), 123–205.
10. Stiff, J.B. and Mongeau, P.A. Persuasive communication. New York: Guilford Press, 2003.