What drives people to prefer health-related misinformation? The viewpoint of motivated reasoning.
Introduction. This paper examines the potential of the motivated reasoning approach as a framework explaining why people prefer and use health-related misinformation.
Method. A conceptual analysis was conducted of a sample of 41 studies drawing on the motivated reasoning approach to examine the selection and use of information and misinformation.
Results. Preferring and using health-related misinformation is most likely to occur when people are primarily driven by directional goals. Such goals tend to give rise to confirmation bias, which favours adherence to existing beliefs about the relevance of information sources of certain types, for example, websites advocating anti-vaccination ideas. Moreover, disconfirmation bias results in the rejection of information that challenges existing beliefs about an issue. Directional goals seldom appear in a pure form because motivated reasoning is also driven by accuracy goals motivating people to select and use information that enables them to support, justify and defend their beliefs against critique.
Conclusion. Motivated reasoning offers a relatively robust psychological approach to the study of the reasons why people prefer and use misinformation in order to confirm their existing beliefs and to protect their identities. There is a need to explore further the potential and limitations of the motivated reasoning approach by conducting empirical research focusing on controversial and politicized issues such as climate change and the COVID-19 pandemic.
During the COVID-19 pandemic, the problems originating from the use of health-related misinformation have become more striking than before. For example, it is estimated that widespread misinformation about disinfecting the body and killing the coronavirus by drinking highly concentrated alcohol resulted in 800 deaths and 5,800 hospitalisations worldwide (Saiful Islam et al., 2020). Cases such as these offer concrete evidence that people have exposed themselves to misleading information before making an ill-fated decision.
Since the 1980s, there has been a growing literature examining the nature of misinformation and disinformation (Fox, 1983; Søe, 2021). Traditionally, researchers have departed from the assumption that truth value is the key criterion by which mis/disinformation can be distinguished from real or genuine information. By this criterion, mis/disinformation is something that is necessarily false and misleading (Fallis, 2015; Floridi, 2011). More recently, researchers have emphasized the importance of misleadingness as a key feature of mis/disinformation. According to Søe (2018, p. 321), misinformation refers to ‘unintended misleading representational content’, while the content of disinformation is ‘intentionally (non-accidentally) misleading’. Thinking of the health context, the definition of misinformation proposed by Swire‐Thompson and Lazer (2020, p. 434) is particularly relevant: misinformation is information that is ‘contrary to the epistemic consensus of the scientific community regarding a phenomenon’. As to organizations representing epistemic consensus about the nature of the COVID-19 pandemic, for example, the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) in the USA are perhaps the best-known authorities. They have published lists of false claims incorporating mis/disinformation about the pandemic and made attempts to rebut them (World Health Organization, 2021).
Thus far, the studies on health-related misinformation have mainly concentrated on the nature of information of this type, as well as the ways in which people distribute it in social media forums (e.g., Apuke and Omar, 2021; Su, 2021). However, there is a paucity of investigations examining the reasons why people prefer and use misinformation. Interestingly, the models for human information behaviour developed so far have seldom devoted attention to this issue. Notable exceptions include the Social diffusion model of information, misinformation, and disinformation developed by Karlova and Fisher (2013) and the Social information perception model proposed by Ruokolainen and Widén (2020). Both models identify important elements of misinformation use but devote less attention to factors motivating people to prefer information that authorities such as WHO and CDC classify as misleading and deceptive.
To examine the above issue in greater detail, the present investigation focuses on motivated reasoning, a major psychological approach to explaining the biased nature of human cognition. Motivated reasoning has become a central theoretical concept in academic discourse particularly because it elaborates the picture of selective exposure to information presented in the studies of cognitive dissonance (Carpenter, 2019). Motivated reasoning has been developed more systematically since the 1990s and has since that time been widely applied in diverse fields such as psychology, political science and communication research (Carpenter, 2019; Druckman and McGrath, 2019; Lodge and Taber, 2000). Motivated reasoning departs from the assumption that biases in perception, memory, judgment, belief, and choice are pervasive elements of human cognition. Such biases may manifest themselves in selective attention, distorted evaluation of things, wishful thinking and the use of heuristic shortcuts, for example. Often, researchers have characterized these biases negatively because they are indicative of deviation from rational thinking (Kunda, 1990). As a theoretical approach, motivated reasoning explains such deviations by proposing that people are inherently selective in both their choice and processing of information because they want to protect their deeply ingrained beliefs and avoid views that are inconsistent with them (Druckman and McGrath, 2019; Kunda, 1990). Therefore, motivation of this kind may cause people to make self-serving attributions and permit them to believe what they want to believe simply because they want to believe it. In the context of the COVID-19 pandemic, adherence to motivated reasoning offers a fertile ground for the use of misinformation if it is felt to be consistent with one's beliefs, for example, the belief that vaccines contain chemicals that are harmful to the body.
The present investigation contributes to information behaviour research by elaborating the picture of motivational factors driving the use of health-related misinformation. As the focus is placed on the reasons why people prefer and use misinformation, the analytic distinction between mis- and disinformation discussed above is not critically important. It is assumed that misinformation is misleading information that is created and spread, regardless of whether there is intent to deceive, while disinformation refers to that part of misleading information that is created and spread with intent to deceive (Treen et al., 2020). These simple definitions suggest that disinformation is a subset of misinformation; therefore, studying misinformation by default includes disinformation. To examine the potential of motivated reasoning, a conceptual analysis was made by examining a sample of forty-one studies focusing on the ways in which people prefer sources of misinformation in health contexts in particular. This topic is important because, despite the availability of accurate health information offered by authorities such as WHO and CDC, an individual may refute such information and base his or her health-related decisions on false rumours distributed in social media forums. Motivated reasoning may make such decisions understandable from the viewpoint of the individual if mistrust in health authorities is an integral element of his or her worldview.
The rest of the article is structured as follows. First, to create background for the conceptual analysis, the nature of health-related misinformation, reasons for preferring such information and the main assumptions of the motivated reasoning approach are reviewed, followed by the specification of the research questions and methodology. The rest of the sections report the research findings and discuss their significance.
Features of health-related misinformation
There is a growing body of studies examining the features of health-related misinformation in diverse contexts such as medication, food, treatment of cancer and epidemics like HIV/AIDS (Krishna and Thompson, 2021). Previous studies also indicate that vaccination is one of the most popular topics of health-related misinformation (Kata, 2010; Wang et al., 2019). More recently, misinformation distributed in social media forums has often dealt with the COVID-19 pandemic. For example, Li et al. (2020) found that in a randomly selected sample of 145 YouTube videos, no less than 64% contained some misinformation about the COVID-19 vaccines. A number of COVID-19-related rumours, conspiracy theories and items of fake news have recently been shown to be erroneous and thus carriers of misinformation (Pickles et al., 2021; WHO, 2021). Such misinformation beliefs include, for example, the claims that data on the effectiveness of COVID-19 vaccines are made up, 5G networks are spreading the coronavirus, injecting or ingesting bleach is a safe way to kill the coronavirus, and the flu shot provides immunity to COVID-19 (Pickles et al., 2021). In social media debates, speculation about the side-effects of the COVID-19 vaccines is one of the hot topics offering a fertile ground for the creation and spreading of misinformation. This is mainly because the coronavirus vaccinations were started quite recently; in particular, there is not enough objective knowledge about the long-term side-effects of the vaccines.
Therefore, it is important to define health-related misinformation relative to its temporal context in cases in which pandemics like COVID-19 evolve rapidly (Freiling et al., in press). What is true about the treatment for COVID-19 disease today may turn out to be false tomorrow, as exemplified by the debate about hydroxychloroquine as a promising cure. In March 2020, President Donald Trump declared hydroxychloroquine a 'game changer' in the effort to develop a coronavirus treatment and said that the drug had been approved (Ebbs, 2020). In fact, hydroxychloroquine is approved to treat other ailments, like malaria and rheumatoid arthritis, but it has not been specifically approved to treat COVID-19. Later on, major media outlets and fact-checkers attempted to clarify or even debunk Trump's claims. Moreover, WHO suspended hydroxychloroquine from its global drug trials for COVID-19 treatments in May 2020, due to safety concerns (Burke and Edwards, 2020).
Cases such as these raise a difficult question: what level of certainty or agreement must experts express before we can define what is misinformation and what is not? (Vraga and Bode, 2020, p. 138) Even deciding what ultimately counts as misinformation about COVID-19 is a complicated matter, as insights into the causes of and treatments for the virus develop over time. This question is related to the assumption that although claims such as that UV rays kill the coronavirus may be false from the perspective of WHO and thus an example of harmful misinformation, some people may perceive such claims as meaningful and informative. For example, Søe (2018) asserts that the concepts of information, misinformation and disinformation are not neutral constructs because they incorporate normative elements. Information is the ‘true part that shall be preserved, guarded, enhanced, and spread. Mis/disinformation represent the false part that shall be avoided, combated, suppressed, and stopped’ (Søe, 2018, p. 321). Because health-related misinformation deviates from the dominant or generally accepted norms and attitudes in a society, it is dis-normative in nature (Ruokolainen and Widén, 2020). Information of this kind may draw on an individual's experiential knowledge about the side-effects of a vaccine or the claims presented in online debates. Particularly in times of pandemics, this raises a moral question about whether deviating (dis-normative) health behaviour, for example vaccine refusal drawing on a conspiracy theory, should be respected, tolerated, or condemned. Nevertheless, independent of the answer to this question, researchers may make attempts to shed further light on the reasons why people prefer misinformation over authoritative medical knowledge.
Reasons for preferring misinformation
Since the 1960s, researchers have examined factors motivating people to prefer misinformation and use it in everyday decision making. An early example is the rumour transmission theory proposed by Buckner (1965). It suggests that one of the reasons why people trust rumours is the perceived lack of relevant knowledge about an issue; a rumour offers at least some explanation for it. Later studies on the psychology of misinformation have revealed that people have a strong drive to accept inaccurate information if it is consistent with their existing attitudes and beliefs (Kuklinski et al., 2000). Another explanation for people’s susceptibility to misinformation is the belief in conspiracy theories (Soveri et al., 2021). For example, individuals refusing to take the coronavirus vaccine may draw on conspiracy beliefs asserting that the harmful effects of vaccines are being deliberately concealed from the public. Studies on selective exposure to online news have shown, for example, that citizens with populist attitudes evaluate the news media more negatively, and there is also suggestive evidence that they rely less on established news sources like the legacy press (Stier et al., 2020). As Guess, Nyhan and Reifler (2018) demonstrate, selective exposure to misinformation available from fake news is particularly common in the context of political elections and presidential campaigns. Narayan and Preljevic (2017) examined personal blog writings whose authors believed in conspiracy theories advocating anti-vaccination ideas. The findings suggest that the preference for conspiracy theories of this kind can be explained by selective exposure to health-related information and avoidance of information that is inconsistent with the authors' existing choices.
Many of the studies explaining why people become susceptible to misinformation draw on the assumption that people believe misleading information when they fail to sufficiently engage in deliberative reasoning processes (Ross et al., 2021). Furthermore, the reason why misleading information content is believed relates to its intuitive appeal. Highly emotional content that provokes moral outrage tends to draw people’s attention more strongly than the neutral description of facts. Compared with neutral descriptions, negative information tends to be more salient and is recognized more rapidly (Walter and Tukachinsky, 2020, pp. 161-162). For example, the claim that the measles, mumps, and rubella vaccine causes autism can be more memorable and compelling than factual information about the vaccine presented by WHO.
Misinformation may also be preferred over factual information because an individual wants to protect his or her personal identity as a proud nonconformist who cannot be told what to think. On this basis, some people intentionally place themselves outside the scientific consensus about the benefits of COVID-19 vaccines, for example (Hornsey, 2020). Finally, social identity can play a role, as an individual wants to support the views presented in anti-vaccination online communities (Hornsey, 2020). People living within online filter bubbles tend to follow certain social media sites that agree with their worldview and prejudices. Therefore, social media forums may turn into echo chambers reinforcing people's existing views and preventing them from critical thought and analysis (Agarwal and Alsaeedi, 2021, p. 643). In extreme cases, they accept only information that is consistent with their opinions and reject evidence that is contrary to their beliefs.
Motivated reasoning as an approach to information selection and processing
Motivated reasoning has become a popular theoretical approach in misinformation research, particularly for issues with a strong partisan divide, for example, climate change and gun control (Berinsky, 2017; Bode and Vraga, 2015; Schaffner and Roche, 2017). Motivated reasoning is not a coherent theory; rather, it is a framework that has made use of ideas obtained from many sources, most notably cognitive dissonance theory (Festinger, 1957). The latter assumes that people strive to be consistent in their attitudes and behaviours. To do so, people seek evidence that corroborates their existing beliefs and evaluate belief-confirming evidence more positively while rejecting conflicting evidence. Motivated reasoning also draws on cognitive studies on the need for closure, which refers to ‘the desire for a definite answer on some topic, any answer as opposed to confusion and ambiguity’ (Kruglanski, 1989, p. 14). People with a high need for closure are more likely to recall stereotype-consistent information and to selectively expose themselves to information that supports their previous decisions.
The key ideas of motivated reasoning were first synthesized by Kunda (1990). Since that time, the assumptions of motivated reasoning have been used in diverse domains such as communication research, psychology, and political science. Motivated reasoning departs from the assumption that goals affect reasoning through reliance on a biased set of cognitive processes, that is, strategies for accessing, constructing, and evaluating beliefs (Kunda, 1990). It is further assumed that the ways in which people identify, select and process information are driven by goals of two types. First, directional goals lead to the use of beliefs and strategies that are considered most likely to yield a preselected and desired conclusion. To illustrate the nature of directional goals, we may take a fictional example of how an anti-vaxxer (a person who is opposed to vaccination) attempts to maintain a desirable conclusion that COVID-19 vaccines are ineffective. Driven by this goal, he or she seeks information that supports the above conclusion and refutes information that challenges it in some way. Second, accuracy goals lead to the use of those beliefs and strategies that are considered most appropriate in situations in which individuals are motivated to provide accurate responses to justify their preferences to others. For example, an anti-vaxxer may draw on official statistical data to support his or her sceptical view on COVID-19 vaccines. The data may be used to demonstrate that despite taking two shots of coronavirus vaccine, people can still be infected and hospitalised.
The motivated reasoning approach assumes that the intensity by which accuracy and directional goals drive human behaviour varies situationally, depending on the issue at hand. When people are motivated to be accurate, they expend more cognitive effort on issue-related reasoning, attend to relevant information more carefully, and process it more deeply (Kunda, 1990, p. 481). People also consider more alternatives and draw more connections among diverse characteristics, instead of drawing on hasty heuristic thinking. In contrast, when reasoning is driven by directional goals, people search memory for those beliefs and rules that could support their desired conclusion (Kunda, 1990, pp. 482-483). To achieve this, they may creatively combine accessed knowledge to construct new beliefs that could logically support the desired conclusion. However, as Kunda (1990, p. 490) reminds us, people are not at liberty to believe anything they like; they are constrained by their prior beliefs about the acceptability of various procedures such as laboratory tests indicative of objective evidence of an event.
The above setting is rendered more complicated because accuracy and directional goals may not appear in pure and ideal typical forms in real-life situations. We may think that in most situations people are motivated, in some measure, to be accurate, but they are unable to completely ignore their prior beliefs and affects toward an object (Lodge and Taber, 2000, p. 187). This gives rise to a tension between the drives for optimal accuracy and belief perseverance. On the other hand, directional and accuracy goals may be conceptualized as ends of a continuum; people may pursue both goals so that they form a mixture (Brenes Peralta et al., 2021). Thus, the strength of each goal determines whether an individual in a certain context is situated at one or the other end of the motivation continuum. However, if both motivations are strong, individuals must compromise between their wish to reach a desired conclusion and the plausibility of that conclusion being true (Kunda, 1990). This raises a difficult question about the acceptability of motivated reasoning in cases in which an individual's health-related decisions are primarily driven by directional goals. Motivated reasoning of this kind is often evaluated negatively because it deviates from rational thinking and can result in dangerous illusions. This is particularly evident in cases in which reasoning primarily driven by accuracy goals could help avoid unnecessary health risks. For example, people drawing on strong directional goals can play down the seriousness of the COVID-19 disease; in extreme cases, however, they may literally pay with their lives for their motivated reasoning.
Research questions and method
The review of earlier studies suggests that the question of why people prefer misinformation and actually use it in decision making is a multifaceted one. As discussed above, motivated reasoning is one of the promising approaches to deepening our understanding of the drivers of health-related information behaviour in cases in which it deviates from the normative views advocated by health authorities. To examine the potential of the motivated reasoning approach in greater detail, the present study seeks answers to the following questions.
RQ1. In which ways does the motivated reasoning approach conceptualize the factors motivating people to prefer health-related misinformation?
RQ2. What are the strengths and limitations of the motivated reasoning approach in the study of the above issues?
To identify relevant research material, eleven major databases were searched: ACM Digital Library, Academic Search Ultimate (Ebsco), Communication & Mass Media Complete (Ebsco), Google Scholar, LISA, Sage Journals Online, Science Direct, Scopus, Springer Link, Taylor & Francis Online, and Wiley Online Library. The searches were directed to the abstracts of peer-reviewed studies using six key search terms: motivated reasoning, misinformation, disinformation, health-related information, information seeking, and information use. The searches identified 127 potentially relevant investigations published between 1990 and 2021. A more detailed reading of the material indicated that of the original sample, forty-one items are relevant for the research questions specified above. These investigations were chosen for analysis using three selection criteria, so that each study included in the final sample of forty-one investigations meets at least one of the following criteria. First, a study explicitly characterizes the features of motivated reasoning as an approach to human motivation (e.g., Kunda, 1990; Druckman and McGrath, 2019). Second, an investigation depicts how motivated reasoning manifests itself in human behaviour, for example, political reasoning (e.g., Lodge and Taber, 2000) or health-related behaviour (e.g., Sylvester, 2021). Third, a study examines how people prefer misinformation about an issue, for example, vaccination (e.g., Kata, 2010) or politics (Berinsky, 2017). The forty-one investigations comprising the research material are marked with an asterisk (*) in the list of references.
As the above criteria indicate, not all of the forty-one studies focus on health-related information behaviour or health-related misinformation. For example, Druckman and McGrath (2019) discuss how motivated reasoning explains people’s approach to climate change, while Richey (2012) conceptualizes motivated reasoning in political information processing. However, investigations examining issues other than health are relevant for the present study because they deepen our understanding of why people prefer health-related misinformation. This is because the biased nature of human cognition is common to diverse phenomena such as climate denialism, partisan political thinking and vaccine refusal. Thus, the ways in which motivated reasoning drives climate denialists to prefer misinformation can also make it understandable why vaccine sceptics prefer rumours denying the effectiveness of vaccines (Druckman and McGrath, 2019; Kata, 2010). The findings section offers further examples of how conceptual approaches to motivated reasoning developed in fields such as political science and psychology can be employed to shed additional light on the nature of misinformation preferences appearing in health-related behaviour. In other words, conceptualizations and models of motivated reasoning developed in other fields can be translated to the study of health-related misinformation seeking and use.
Since the study focuses on the potential of the motivated reasoning approach as a theoretical framework, conceptual analysis was deemed most appropriate for the present investigation. This is because conceptual analysis allows us to distinguish between the defining attributes of a concept and to examine their relationships. Drawing on Furner (2004), this method can be defined as an approach that treats concepts like motivated reasoning, or sub-concepts such as directional goal, as classes of objects, events, properties, or relationships. More specifically, conceptual analysis involves defining the meaning of a given concept by identifying and specifying the contexts in which any entity or phenomenon is classified under the concept in question. In the present study, mirroring the content of research questions 1 and 2, the sample of forty-one studies was analysed by devoting attention to how researchers have characterized motivated reasoning as an approach explaining how people select and prefer sources of information and misinformation, and to the strengths and limitations of the motivated reasoning approach.
To achieve this, the research material, for example, Druckman and McGrath (2019), Pennycook and Rand (2019) and Sylvester (2021), was first read several times in order to identify the individual characterizations of the main concept, that is, motivated reasoning, as well as text portions (paragraphs and sentences) describing how motivated reasoning explains why people prefer information of a certain type, including misinformation. The text portions thus identified were then subjected to open coding to identify the sub-categories describing the features of motivated reasoning, for example, accuracy goal, directional goal, confirmation bias and disconfirmation bias. On this basis, a coding scheme was developed to finalise the coding in a systematic way; the coding scheme is presented in the Appendix. The coding was continued by identifying cases in which the above categories occur together with the concepts of misinformation, disinformation, information seeking, information selection, and information use. Many studies of this kind focus on the seeking and use of political information (e.g., Redlawsk, 2002; Richey, 2012; Taber and Lodge, 2006) and climate change (e.g., Druckman and McGrath, 2019). In the domain of health-related information behaviour, the ideas of motivated reasoning have mainly been used in studies focusing on the COVID-19 pandemic (e.g., Freiling et al., in press; Pennycook et al., 2020; Sylvester, 2021). The coding was finalized by identifying text portions in which researchers characterize the strengths and limitations of the motivated reasoning approach.
The conceptual analysis of the coded material was based on the constant comparative approach (Lincoln and Guba, 1985, pp. 339-344). To achieve this, similarities and differences in the above studies characterizing the attributes of the motivated reasoning approach were identified and scrutinized. To this end, it was analysed how researchers have characterized, for example, directional goals as drivers for seeking support from rumours, conspiracy theories and misinformation of other types. Moreover, similarities and differences were analysed regarding how researchers have evaluated the strengths and limitations of the motivated reasoning approach as a framework explaining why people prefer information of a certain type. In this analysis, the main attention was directed to the latter aspect simply because there is more research material critically reflecting on the applicability of the motivated reasoning approach (e.g., Carpenter, 2019; Druckman and McGrath, 2019; Pennycook and Rand, 2019).
Motivated reasoning as an explanation for the preference of health-related misinformation
Overall, the analysis of the research material indicated that motivated reasoning is particularly evident in situations in which an individual has to consider his or her view on a controversial issue, for example, climate change, gun control or vaccination. In such situations, motivated reasoning may affect the ways in which he or she selects and prefers sources of information, as well as how the information content is evaluated. According to Avnur (2020, p. 580), motivated reasoning deals primarily with the processing and evaluation of information obtained from a source, while the selection of information sources deals mainly with information exposure. In real-life situations, however, the processes of selecting, preferring and using information tend to be intertwined. Therefore, it is assumed that motivated reasoning can affect the whole process, ranging from the selection of information sources to the utilisation of the (mis)information content.
Theoretically, the predisposition to motivated reasoning rests on the origin of the cognitive and affective values attached to the judgments of new information encountered by an individual (Richey, 2012, p. 515). It is assumed that ultimately, all social concepts are affectively laden and that all social information is affectively charged (Lodge and Taber, 2000, p. 183). Among anti-vaxxers, for example, the COVID-19 vaccination campaigns may elicit negative emotions such as irritation and anger. This is because the initial judgment of information offered by such campaigns is oriented by the ‘how-do-I-feel?’ heuristic (Lodge and Taber, 2000, p. 184). It captures the affective tally attached to the concept and moves it into working memory, thereby signalling the affective colouration of the object, telling the individual at the moment of recognition how much she or he likes or dislikes it, for example, the recommendation that all adults should take the COVID-19 vaccine. While considering controversial issues in particular, motivated reasoning stems from the interaction of cognition and affect when both positive and negative feelings come to mind automatically and bias subsequent information processing, resulting in the hot cognition nexus with memory (Richey, 2012, p. 515). When a judgment is recalled to update it cognitively, based on new information, that recall automatically activates an affective marker attached to the initial judgment stored in long-term memory. In this context, motivated reasoners make an immediate evaluation (like/dislike) of a piece of information they encounter, maintaining a tally which summarizes the current affect toward the object, for example, vaccination (Redlawsk, 2002, p. 1023). Thus, the memory node for vaccination contains not only cognitive information but also the affective tally, which is updated immediately upon the acquisition of new information.
If the above process is predominantly driven by a directional goal, it leads an individual to seek out evidence that aligns with his or her pre-existing views, thus resulting in a confirmation bias. As a result, new information is likely to reinforce the pre-existing belief. Confirmation bias appears, for example, when an anti-vaxxer ignores sources such as the WHO’s website debunking false claims about COVID-19 vaccines and instead frequents online discussion groups spreading rumours about the harmful side-effects of vaccines. In addition, a disconfirmation bias occurs when people place greater scrutiny on, or actively generate counterarguments against, information that undermines a desired conclusion. In this case, an individual attempts to dismiss new information that contradicts his or her ingrained beliefs and evaluates arguments that align with his or her views as stronger and more accurate than opposing arguments (Taber and Lodge, 2006). Moreover, rather than learning from exposure to new information, individuals who encounter information that contradicts their prior perspective often become even more favourable to their prior beliefs. This is because information that is more distant from the individual’s prior belief is perceived as weaker and thus receives little weight in the process of updating one’s knowledge about an issue (Druckman and McGrath, 2019, p. 113). In contrast, information closer to the individual’s prior belief is perceived as stronger and thus receives greater weight in the updating process.
All in all, a major characteristic of judgments driven by directional goals is belief-protective reasoning (Druckman and McGrath, 2019, p. 116). Directional reasoning may also take another form: it can involve an identity-protective goal, rather than the maintenance of a particular belief as the desired outcome. In this case, new information is evaluated as either threatening or non-threatening to one’s identity as an anti-vaxxer or climate denialist, for example. Studies on the use of political misinformation have demonstrated that the adherence to a directional goal can result in a strongly biased interpretation of the state of affairs. For example, when asked their views in surveys, many conservatives may understand that Barack Obama was, in fact, born in the United States but choose to support the widespread rumour that he is not a US citizen as a way of bolstering their opposition to him (Schaffner and Roche, 2017, p. 88). This phenomenon is known as partisan cheerleading; people supporting a political ideology know the correct answer to knowledge questions, but they intentionally prefer an incorrect, party-congenial response to support their side (Peterson and Iyengar, 2021). Similarly, an anti-vaxxer may be well aware of the medical evidence demonstrating that coronavirus vaccines can prevent the disease or at least alleviate its worst symptoms. Nevertheless, to protect his or her identity as a proud non-conformist in the eyes of his or her like-minded friends, he or she refuses to admit this fact publicly.
Interestingly, as the above example illustrates, the preference for a directional goal contradicts an accuracy-motivated process where one evaluates information in an objective manner, independent of its relationship to the belief in question. This contradiction manifests itself during the process of updating beliefs in the light of new information, not in the individual’s overall prior or posterior beliefs (Druckman and McGrath, 2019, p. 113). For example, suppose a directionally motivated anti-vaxxer receives two pieces of information: a scientific report on the side-effects of COVID-19 vaccines published by a health authority and a fake news article reporting a horror story of the risks involved in vaccination. Owing to a prior attitude effect, the individual assesses the scientific report as weak evidence and the fake news article as strong evidence. This is because his or her goal is to evaluate evidence in a way that confirms his or her anti-vaccination beliefs. The result is a posterior belief that remains sceptical about the benefits of vaccination.
Conversely, an individual who is accuracy-motivated may reject the scientific report due to low trust in medical science and accept the fake news due to trust in the news source (Druckman and McGrath, 2019, p. 113). Thereby, an accuracy-motivated individual arrives at the same posterior belief as the directionally motivated individual, not from motivation to confirm a prior belief, but from an appraisal of what source is credible. In fact, an individual may not only discredit the evidence offered by a scientific report through a prior attitude effect but also think of contrary evidence, leading to a posterior belief of even greater scepticism.
The relationship between accuracy and directional motivation can be elaborated further by reviewing the typology proposed by Lodge and Taber (2000). They suggest that a relative mix of accuracy and directional goals manifests itself in different information-processing strategies and decision steps. Crossing the two goals by their intensity (directional: weak, strong x accuracy: weak, strong) produces four ideal types of reasoning. Lodge and Taber (2000, p. 187) labelled them as partisan reasoner (strong directional, weak accuracy goals), intuitive scientist (strong directional, strong accuracy goals), classical rationality (weak directional, strong accuracy goals), and low motivation (weak directional, weak accuracy goals). Nir (2011, p. 507) slightly modified the above typology by renaming the type of classical rationality as classical rationalist and the type of low motivation as apathetic. More specifically, a partisan reasoner seeks to justify a preferred conclusion and exhibits strong confirmation or disconfirmation biases in information processing. In contrast, an apathetic has low motivation, draws on heuristic processing, or possibly does not process information in greater detail. To compare, the intuitive scientist seeks an accurate conclusion within subjective limits and actively adjusts for bias. Finally, the classical rationalist (a kind of “Enlightenment man”) makes use of reasoning as dispassionate calculation. This type represents an ideal for an informed and deliberative citizen whose decision making is relatively free from biases (Nir, 2011, pp. 506-509). It is assumed that, driven by a strong accuracy goal, the classical rationalist gathers relevant information, ‘regardless of the directional push of the evidence’ (Lodge and Taber, 2000, p. 206). Ideally, the individual makes an even-handed evaluation of the evidence and postpones his or her judgment until the evidence is deemed to be sufficient.
The above typology can be used to characterise the diverse ways in which people approach vaccine-related misinformation. It is evident that the features of the partisan reasoner may best describe the preferences of a “hard-core” anti-vaxxer characterised by a kind of partisan cheerleading. He or she is likely to be particularly susceptible to health-related misinformation because the accuracy motivation is secondary in reasoning. An individual characterized as an intuitive scientist may feel deep antipathy towards vaccination but at the same time be interested in acquiring further information, both medical facts and misinformation, to bolster his or her critical view. On the other hand, it is possible that people choosing not to get vaccinated can be classified in the subgroup of the apathetic in that they are largely indifferent to issues of personal health care, thinking that vaccination is simply not an issue worth particular attention. Finally, the classical rationalist deviates most strongly from partisan thinking because he or she prefers factual information over deeply ingrained beliefs or prejudices about vaccines. It is also evident that an individual driven by strong accuracy motivation and weak directional goals is less willing to make use of misinformation because it may not meet the criterion of objectivity.
COVID-19 has become a particularly interesting case for the study of motivated reasoning as a driver of misinformation use because the pandemic has become a partisan issue, particularly in the United States (Sylvester, 2021). This means that people often view COVID-19 issues through a political lens, which makes it more likely that directional goals predominate when people select and use health-related information. To examine the relationship between COVID-19 knowledge, policy preferences, and health behaviour intentions, Sylvester analysed survey data gathered from over 7,000 U.S. adults in July 2020. It appeared that motivated reasoning provides a valuable framework for understanding why individuals would dismiss basic coronavirus facts. The findings reveal that conservatives are more likely to accept false beliefs about the COVID-19 pandemic. Nevertheless, while conservatives were more likely to accept misinformation of this kind, they still answered most coronavirus questions correctly. This suggests that they are not driven merely by partisan directional goals; accuracy motivation also directs their views. This reflects the tension between directional and accuracy goals discussed above.
The findings obtained from the use of political information offer an interesting parallel to the above issue. In an experimental study, Redlawsk, Civettini, and Emmerson (2010) demonstrated that individuals exposed to political information contradicting their prior beliefs may reach an affective tipping point where they become more willing to reconsider their views, thereby mitigating the potential effects of motivated reasoning. At some point people recognize that they are possibly wrong and begin making adjustments, particularly if they become anxious about the consequences of their future action, for example, voting for a previously favoured candidate who is currently rumoured to have evaded taxes. In this case, people are no longer driven by a partisan goal alone because increasing attention is devoted to accuracy motives prompting them to judge the credibility of such rumours. Even though voting for a candidate suspected of malpractice may be less consequential for an individual than vaccine refusal, for example, the features of the affective tipping point can be mirrored against Sylvester’s (2021) study discussed above. We may think that, similarly, despite the growing dissemination of COVID-19 vaccine misinformation, people who are sceptical about the effectiveness of coronavirus vaccines are also continually exposed to compelling (medical) evidence presented in the media. The evidence may demonstrate that these vaccines can prevent the disease or alleviate its symptoms so that patients can avoid hospitalisation. This evidence, combined with growing worry about the severity of the disease, can give rise to an affective tipping point where people reconsider their sceptical views. At that point they stop reinforcing their ingrained beliefs, abandon partisan reasoning, and possibly begin rational, fact-based thinking about the pros and cons of vaccination.
The strengths and limitations of the motivated reasoning approach
One of the strengths of the motivated reasoning approach is that it is based on over three decades of conceptual and empirical work conducted in diverse domains such as psychology, political science, and communication research (Carpenter, 2019; Druckman and McGrath, 2019). The motivated reasoning approach has elaborated the picture of inherent biases of human cognition by demonstrating that goals pursued by an individual have a critically important effect on how he or she selects and uses information. In particular, the specification of the nature of directional and accuracy goals has brought a new dimension to studies of human cognition (Kunda, 1990). Moreover, the analysis of the relationships between the above goals has deepened our understanding about why confirmation and disconfirmation biases function as in-built mechanisms of human cognition. So far, the above strengths can mainly be identified in the domain of political science, where the ideas of motivated reasoning have been utilized since the 1990s in the study of partisan thinking. More recently, these ideas have been found useful in explaining people’s preference for misinformation and the difficulties faced in the attempts to debunk false rumours and fake news (Freiling et al., in press; Lewandowsky et al., 2012).
Nevertheless, the motivated reasoning approach has its limitations. For example, it cannot offer an all-embracing explanation for why people prefer misinformation about the COVID-19 pandemic. First, how people approach sources of health-related misinformation is not solely an individual-level phenomenon. The preference or rejection of misinformation may be affected by social factors, for example, opinions presented by like-minded people advocating vaccine refusal in online debates. In this case, social-consensus information is a particularly powerful motivator when it pertains to one’s reference group, for example, anti-vaxxers (Lewandowsky et al., 2012). Second, biased selection and use of information may also be explained by other factors such as the lack of analytical thinking. For example, while seeking and using COVID-19 related information, a greater level of specialized knowledge and more sophisticated skills of analytical thinking are required to correctly judge its accuracy compared to the accuracy of political information needed for a voting decision (Pennycook et al., 2020, p. 771). Thus, people may be unable to discern truth from falsehood in the context of COVID-19, even when they are driven by accuracy goals. On the other hand, much of the COVID-19 misinformation circulating online is not primarily political in nature and thus not subject to strong partisan beliefs, for example, the claim that COVID-19 can be cured by Vitamin C. Thus, partisanship manifesting itself in the directional goal would not be the main factor when people decide whether to make themselves susceptible to COVID-19-related misinformation. On the contrary, one might reasonably expect that the life-and-death context of COVID-19 would make people devote their primary attention to the accuracy of information.
However, for ordinary people this is not an easy task. A survey of more than 1,700 U.S. adults conducted by Pennycook et al. (2020) revealed that people share false claims about COVID-19 partly because they simply fail to think sufficiently about whether or not the information content is accurate. This suggests that lay people are largely incapable of discerning whether and to what extent a health-related rumour contains true and false elements. Contrary to the assumption of the motivated reasoning approach that people fail to discern true information from misleading information because their reasoning abilities are hijacked by political motivations, the explanation is rather that people do not stop to reflect sufficiently on their prior knowledge, or have insufficient prior knowledge about an issue (Pennycook and Rand, 2021, p. 391). This critical finding suggests that motivated reasoning is not the only driver for preferring and using health-related misinformation because these phenomena can also be explained by related factors such as the lack of analytical reasoning.
The present study elaborated the picture of reasons by which people prefer health-related misinformation. To this end, the potential of the motivated reasoning approach was examined by focusing on a sample of forty-one studies. Overall, the findings suggest that as a theoretical approach to the study of cognitive biases, motivated reasoning offers a credible explanation for why people make themselves susceptible to misinformation and use it in health-related decision making. It is suggested that ultimately, this preference can be traced to people’s propensity to uphold their value-based beliefs about an issue, as well as protecting their personal and social identity. From this perspective, it is evident that motivated reasoning incorporates a conservative element which makes it difficult for an individual to abandon his or her deeply ingrained preconceptions.
The first research question dealt with the ways in which the motivated reasoning approach conceptualises the motivating factors by which people prefer health-related misinformation. The findings highlight that the biased selection of information, exposure to misinformation and its use are more likely to occur when people are primarily driven by directional goals. In this case, the role of two major cognitive mechanisms is emphasised. First, confirmation bias favours the adherence to existing beliefs about the relevance of information sources of certain types, for example, websites advocating anti-vaccination ideas. Second, disconfirmation bias results in the rejection of information that contradicts or challenges the existing beliefs.
However, directional goals seldom appear in a pure form because motivated reasoning is also driven by accuracy goals. They motivate people to select and use information that enables them to support, justify and defend their beliefs against critique. For example, driven by accuracy goals, an individual may prefer information that highlights the severe side-effects of COVID-19 vaccines. As proposed by the typology developed by Lodge and Taber (2000), the relative strength of the directional and accuracy goals may vary. Partisan reasoning represents an extreme form of motivated reasoning because it is dominated by directional goals, directly opposite to the type of the classical rationalist, which prefers accuracy over partisan views.
The second research question examined the strengths and limitations of the motivated reasoning approach. The findings demonstrate that since the 1990s, the motivated reasoning approach has established its place as a theoretical framework within political science, psychology and communication research in particular. The constructs of accuracy goals, directional goals, confirmation bias and disconfirmation bias have been operationalized and used in numerous questionnaire surveys and experimental studies. Overall, the major strength of the motivated reasoning approach is that it elaborates the picture of the biases of human cognition by explaining how they depend on motivational factors, that is, the features of goals adopted by the individual. From this perspective, the motivated reasoning approach has been able to refine and complement our understanding of the nature of cognitive consonance depicted in Festinger’s (1957) cognitive dissonance theory in particular.
The findings also demonstrate that as a framework explaining why people behave as they do, motivated reasoning has its limitations. Studies of the use of political misinformation have shown that its acceptance may originate from the lack of analytical thinking, rather than the tendency to adhere to politically concordant information content supporting an individual’s partisan beliefs (Pennycook and Rand, 2019). This critical view suggests that ultimately, people fall for misinformation because they fail to think, not because they think in a motivated way to protect their identity. Thus, even among people who are inclined to be driven by directional goals, analytic thinking may support overriding prior (biased) beliefs rather than exacerbating them.
Another criticism of the motivated reasoning approach is that it is often difficult to specify the effect originating from directional goals because they may be confounded with other relevant variables. Most importantly, as Pennycook and Rand (2021, p. 391) have argued, partisans differ in what they believe about the world, even when it comes to factual beliefs that relate to empirical evidence about global warming or the effectiveness of COVID-19 vaccines. It is possible that more cognitively sophisticated individuals, particularly those of the type of the classical rationalist, may place more weight on their prior factual beliefs when evaluating new evidence dealing with the risks of a COVID-19 vaccine, for example, instead of placing more weight on concordance with their political identities. In any case, more empirical research is needed to specify how the effect of pursuing directional and accuracy goals is related to other factors driving people to behave as they do when they prefer misinformation over sources of factual information.
The potential of the motivated reasoning approach can be reflected further by making a few comparisons. First, as a framework explaining why people prefer misinformation, the approach of cognitive arrest offers an interesting parallel (Kim and Grunig, 2021). In general, cognitive arrest refers to an individual’s uniform cognitive and communicative motion in the same retrospective direction (Kim and Grunig, 2021, pp. 232-233). Therefore, cognitive arrest is indicative of a machine-like cognitive action which moves from a pre-set conclusion to the optimization of evidence through information forefending. It is a major characteristic of cognitive arrest that in a decision-making situation a person internally activates or externally acquires a new pet hypothesis, for example, that COVID-19 vaccines include substances harmful to the body, and then confers some degree of plausibility on it. A pet hypothesis is similar to a partisan belief: something that is upheld and protected by adhering to a directional goal. Later on, further support is sought for the pet hypothesis by preferring information that fits with the hypothesis. As the cognitive arrest continues, it encourages a growing sense of confidence in the preferred epistemic conclusion, for example, that COVID-19 vaccines are not good for the human body.
Similar to motivated reasoning, cognitive arrest often draws on conspiratorial thinking, which is a recurrent process of believing and warranting (Kim and Grunig, 2021, pp. 234-235). This is because almost all social conflicts and pandemics tend to beget new conspiracy hypotheses, as perplexed lay publics seek explanations fitting their predispositions. The more people are puzzled, the more they may engage in conspiratorial thinking to find new hypotheses that are useful to account for their cognitive puzzles. At the same time, attempts are made to filter out disputable facts and incompatible evidence. This backward inference from pre-set ideas to information by means of cognitive retrogression is directional, remains selective in terms of information seeking and use, and often becomes a slippery slope to deeper conspiratorial thinking and greater conviction. Consequently, the person becomes inert in modifying or improving problem-solving approaches to a problematic situation. As a result, the individual is likely to behave like a programmed cognitive machine and become resistant to (new) correcting information.
Common to individuals locked in cognitive arrest and those driven by strong partisan reasoning is that they exhibit the features of lacuna individuals (Krishna, 2017). The word lacuna means a gap, or a missing part, and the missing part among lacuna individuals is their knowledge. Knowledge deficiency, in the meaning of the absence of scientifically legitimized knowledge, is key to the conceptualization of lacuna individuals. More specifically, a lacuna individual is ‘one who is very high in his/her problem-specific motivation and activeness levels about an issue, but displays a knowledge deficiency about that issue, and high levels of negative attitudes about it’ (Krishna, 2017, p. 180). The above characterization is particularly descriptive of antivaccine activists who misinterpret and reject biomedical and scientific data, and present scientifically inaccurate data as facts (Kata, 2010). Given the strong conviction of many hard-core anti-vaxxers, it is evident that they are not persuadable, no matter the amount of accurate vaccine information provided (Kata, 2010, p. 1715). Similarly, Krishna (2017, p. 180) noted that knowledge deficiency is the key to understanding why individuals display high levels of negative attitudes about vaccines in the United States. In extreme cases, lacuna individuals form groups of ‘a hard core of chronic know nothings, a segment of the population who intentionally shut themselves off from information distributed by health authorities’ (Krishna, 2017, p. 180).
Although the construct of cognitive arrest primarily conceptualizes information selection and use in the context of decision making, the above framework offers complementary ideas explaining why people prefer biased information obtained from conspiracy theories in particular. It is also evident that lacuna individuals locked in a cognitive arrest are particularly characteristic of hard-core partisan reasoners depicted by Lodge and Taber (2000). Overall, the construct of cognitive arrest sheds additional light upon the features of motivated reasoners whose information selection and use are based on strong confirmation and disconfirmation biases.
The potential of the motivated reasoning approach may also be mirrored against the assumptions of the cognitive dissonance theory (Festinger, 1957). It proposes that if a person holds two cognitions that are inconsistent with one another, he or she will experience a state of dissonance. For example, when an anti-vaxxer is exposed to positive information about the benefits of COVID-19 vaccination, he or she experiences cognitive dissonance because such information contradicts his or her pre-existing beliefs on vaccination risks. The state of dissonance then triggers negative emotions; one of the ways to reduce them is to acquire information that confirms his or her existing beliefs about the above issue. Therefore, similar to the motivated reasoning approach, the cognitive dissonance theory assumes that people strive to be consistent in their attitudes and behaviour. To do so, people naturally seek evidence that corroborates their existing beliefs and evaluate belief-confirming evidence more positively due to the confirmation bias, while rejecting conflicting evidence on the basis of disconfirmation bias. From the perspective of cognitive dissonance theory, this preference stems from the refusal to admit that the information previously accepted by the individual would be false (Sui and Zhang, 2021). Thus, corrective information that conflicts with existing knowledge is less likely to succeed; conversely, compatibility with other knowledge increases the likelihood that misleading information will be accepted and decreases the likelihood that it will be successfully corrected (Lewandowsky et al., 2012, p. 112). More generally, attitudes and beliefs that are linked to overarching values or lifestyles, that is, attitudes for which there is high ego-involvement, are particularly strong and difficult to change (Blankenship and Wegener, 2008).
The above notions suggest that the cognitive dissonance theory and the motivated reasoning approach have much in common in that both frameworks emphasize the importance of confirmation and disconfirmation bias. The major difference is that the cognitive dissonance theory focuses on the cognitive operations by which an individual attempts to maintain cognitive consonance and eliminate dissonance, while the motivated reasoning approach emphasizes the importance of driving goals (motives) by which people try to make their choices and preferences understandable and justifiable. The motivated reasoning approach also offers a more sophisticated picture of the interaction of cognitive and affective factors in terms of hot cognition. This approach also highlights the importance of affective factors because motivated reasoning is oriented by the how-do-I-feel heuristic, particularly when an individual encounters affectively incongruent information about an issue (Redlawsk, 2002, pp. 1023-1024). Viewed as a whole, however, the cognitive dissonance theory and the motivated reasoning approach are complementary rather than alternative frameworks.
Motivated reasoning offers a relatively robust psychological approach to the study of reasons by which people prefer and use misinformation in order to confirm their existing beliefs and to protect their identities. Therefore, the ideas of motivated reasoning explaining people’s partisan views on diverse issues such as climate denial and political elections can be translated to health-related phenomena like vaccine refusal. This is because in domains such as these, polarization and the availability of contradictory information play a significant role. The findings of the present study suggest that the motivated reasoning approach can deepen our understanding of why people expose themselves to misleading claims about the coronavirus vaccines, for example, and how they use misinformation while considering whether or not to take the vaccination.
There is a need to explore further the potential and limitations of the motivated reasoning approach by conducting empirical research focusing on controversial issues such as the COVID-19 pandemic. So far, many of the empirical studies drawing on the motivated reasoning approach are based on experimental settings (e.g., Pennycook and Rand, 2019). However, the potential of the above approach should also be examined in real-world surveys focusing on everyday situations in which people seek and use information, driven by the mix of directional and accuracy goals. Investigations such as these would also shed additional light upon the criteria by which people prefer dis-normative information such as rumours distributed in online debates and refute normative or officially accepted information offered by authorities.
About the author
Reijo Savolainen is Professor Emeritus at the Faculty of Information Technology and Communication Sciences, Tampere University, Kanslerinrinne 1, FIN-33014 Tampere, Finland. He received his PhD from University of Tampere in 1989. His main research interests are in theoretical and empirical issues of everyday information practices. He can be contacted at Reijo.Savolainen@tuni.fi
Note: Items included in the sample of 41 investigations comprising the research material are marked with an asterisk (*). A link from the title, or from (Internet Archive) is to an open access document. A link from the DOI is to the publisher's page for the document.
- * Agarwal, N.K. & Alsaeedi, F. (2021). Creation, dissemination and mitigation: toward a disinformation behavior framework and model. Aslib Journal of Information Management, 73(5), 639-658. https://doi.org/10.1108/AJIM-01-2021-0034
- * Apuke, O.D. & Omar, B. (2021). Fake news and COVID-19: modelling the predictors of fake news sharing among social media users. Telematics and Informatics, 56, paper 101475. https://bit.ly/3K7Hf8l https://doi.org/10.1016/j.tele.2020.101475
- * Avnur, Y. (2020). What’s wrong with the online echo chamber: a motivated reasoning account. Journal of Applied Philosophy, 37(4), 578-593. https://doi.org/10.1111/japp.12426
- * Berinsky, A.J. (2017). Rumors and health care reform: experiments in political misinformation. British Journal of Political Science, 47(2), 241-262. https://doi.org/10.1017/S0007123415000186
- * Blankenship, K.L. & Wegener, D.T. (2008). Opening the mind to close it: considering a message in light of important values increases message processing and later resistance to change. Journal of Personality and Social Psychology, 94(2), 196-213. https://doi.org/10.1037/0022-3514.94.2.196
- * Bode, L. & Vraga, E.K. (2015). In related news, that was wrong: the correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619-638. https://doi.org/10.1111/jcom.12166
- * Brenes Peralta, C., Wojcieszak, M. & Lelkes, Y. (2021). Can I stick to my guns? Motivated reasoning and biased processing of balanced political information. Communication and Society, 34(2), 49-66. https://revistas.unav.edu/index.php/communication-and-society/article/view/37876/35079 https://doi.org/10.15581/003.34.2.49-66
- Buckner, H.T. (1965). A theory of rumor transmission. Public Opinion Quarterly, 29(1), 54-70. https://doi.org/10.1086/267297
- Burke, K. & Edwards, E. (2020, June 5). Lancet pulls paper after ‘major flaws’ found in coronavirus hydroxychloroquine research. 7 News. https://7news.com.au/lifestyle/health-wellbeing/lancet-pulls-paper-after-major-flaws-found-in-coronavirus-hydroxychloroquine-research-c-1081149. (Internet Archive)
- * Carpenter, C.J. (2019). Cognitive dissonance, ego-involvement, and motivated reasoning. Annals of the International Communication Association, 43(1), 1-23. https://doi.org/10.1080/23808985.2018.1564881
- * Druckman, J.N. & McGrath, M.C. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change, 9(2), 111-119. https://doi.org/10.1038/s41558-018-0360-1
- Ebbs, S. (2020, March 19). Trump announces potential 'game changer' on drugs to treat novel coronavirus, but FDA says more study is needed. ABC News. https://abcnews.go.com/Politics/trump-announces-potential-game-changer-drugs-treat-covid19/story?id=69693560 (Internet Archive)
- Fallis, D. (2015). What is disinformation? Library Trends, 63(3), 401-426. https://doi.org/10.1353/lib.2015.0014
- Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.
- Floridi, L. (2011). The philosophy of information. Oxford University Press.
- Fox, C.J. (1983). Information and misinformation: an investigation of the notions of information, misinformation, informing, and misinforming. Greenwood Press.
- * Freiling, I., Krause, N.M., Scheufele, D.A. & Brossard, D. (in press). Believing and sharing misinformation, fact-checks, and accurate information on social media: the role of anxiety during COVID-19. New Media & Society. https://doi.org/10.1177/14614448211011451
- Furner, J. (2004). Conceptual analysis: a method for understanding information as evidence, and evidence as information. Archival Science, 4(3-4), 233-265. https://doi.org/10.1007/s10502-005-2594-8
- Guess, A., Nyhan, B. & Reifler, J. (2018). Selective exposure to misinformation: evidence from the consumption of fake news during the 2016 U.S. presidential campaign. European Research Council. https://about.fb.com/wp-content/uploads/2018/01/fake-news-2016.pdf. (Internet Archive)
- * Hornsey, M.J. (2020). Why facts are not enough: understanding and managing the motivated rejection of science. Current Directions in Psychological Science, 29(6), 583-591. https://doi.org/10.1177/0963721420969364
- * Kahan, D., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735. https://doi.org/10.1038/nclimate1547
- Karlova, N.A. & Fisher, K.E. (2013). A social diffusion model of misinformation and disinformation for understanding human information behaviour. Information Research, 18(1), paper 573. http://informationr.net/ir/18-1/paper573.html. (Internet Archive)
- * Kata, A. (2010). A postmodern Pandora's box: anti-vaccination misinformation on the Internet. Vaccine, 28(7), 1709-1716. https://doi.org/10.1016/j.vaccine.2009.12.022
- * Kim, J-N. & Grunig, J.E. (2021). Lost in informational paradise: cognitive arrest to epistemic inertia in problem solving. American Behavioral Scientist, 65(2), 213-242. https://doi.org/10.1177/0002764219878237
- * Krishna, A. (2017). Motivation with misinformation: conceptualizing lacuna individuals and publics as knowledge-deficient, issue-negative activists. Journal of Public Relations Research, 29(4), 176-193. https://doi.org/10.1080/1062726X.2017.1363047
- * Krishna, A. & Thompson, T.L. (2021). Misinformation about health: a review of health communication and misinformation scholarship. American Behavioral Scientist, 65(2), 316-332. https://doi.org/10.1177/0002764219878223
- Kruglanski, A.W. (1989). Lay epistemics and human knowledge: cognitive and motivational bases. Plenum Press.
- Kuklinski, J.H., Quirk, P.J., Jerit, J., Schwieder, D. & Rich, R.F. (2000). Misinformation and the currency of democratic citizenship. The Journal of Politics, 62(3), 790-816. https://doi.org/10.1111/0022-3816.00033
- * Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480-498. https://doi.org/10.1037/0033-2909.108.3.480
- * Lewandowsky, S., Ecker, U.K.H., Seifert, C. M., Schwarz, N. & Cook, J. (2012). Misinformation and its correction: continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. https://doi.org/10.1177/1529100612451018
- Li, H.O-Y., Bailey, A., Huynh, D. & Chan, J. (2020). YouTube as a source of information on COVID-19: a pandemic of misinformation? BMJ Global Health, 5(5), paper e002604. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7228483/ https://doi.org/10.1136/bmjgh-2020-002604
- Lincoln, Y.S., & Guba, E.G. (1985). Naturalistic inquiry. Sage Publications.
- * Lodge, M. & Taber, C. (2000). Three steps toward a theory of motivated political reasoning. In A. Lupia Jr., M.D. McCubbins & S.L. Popkin (Eds.), Elements of reason (pp. 183-213). Cambridge University Press.
- Narayan, B. & Preljevic, M. (2017). An information behaviour approach to conspiracy theories: listening in on voices from within the vaccination debate. Information Research, 22(1), paper colis1616 http://InformationR.net/ir/22-1/colis/colis1616.html (Internet Archive)
- * Nir, L. (2011). Motivated reasoning and public opinion perception. Public Opinion Quarterly, 75(3), 504-532. https://doi.org/10.1093/poq/nfq076
- * Pennycook, G. & Rand, D.G. (2019). Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39-50. https://doi.org/10.1016/j.cognition.2018.06.011
- * Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388-402. https://doi.org/10.1016/j.tics.2021.02.007
- * Pennycook, G., McPhetres, J., Zhang, Y., Lu, J.G. & Rand, D.G. (2020). Fighting COVID-19 misinformation on social media: experimental evidence for a scalable accuracy-nudge intervention. Psychological Science, 31(7), 770-780. https://doi.org/10.1177/0956797620939054
- * Peterson, E. & Iyengar, S. (2021). Partisan gaps in political information and information‐seeking behavior: motivated reasoning or cheerleading? American Journal of Political Science, 65(1), 133-147. https://doi.org/10.1111/ajps.12535
- * Pickles, K., Cvejic, E., Nickel, B., Copp, T., Bonner, C., Leask, J., Ayre, J., Batcup, C., Cornell, S., Dakin, T., Dodd, R.H., Isautier, J.M.J. & McCaffery, K.J. (2021). COVID-19 misinformation trends in Australia: prospective longitudinal national survey. Journal of Medical Internet Research, 23(1), paper e23805. https://www.jmir.org/2021/1/e23805/ https://doi.org/10.2196/23805
- * Pierre, J.M. (2019). The psychology of guns: risk, fear, and motivated reasoning. Palgrave Communications, 5, article 159. https://www.nature.com/articles/s41599-019-0373-z.pdf. https://doi.org/10.1057/s41599-019-0373-z
- * Redlawsk, D.P. (2002). Hot cognition or cool consideration? Testing the effects of motivated reasoning on political decision making. Journal of Politics, 64(4), 1021-1044. https://doi.org/10.1111/1468-2508.00161
- * Redlawsk, D.P., Civettini, A.J.W. & Emmerson, K.E. (2010). The affective tipping point: do motivated reasoners ever “get it”? Political Psychology, 31(4), 563-593. https://doi.org/10.1111/j.1467-9221.2010.00772.x
- * Richey, M. (2012). Motivated reasoning in political information processing: the death knell of deliberative democracy? Philosophy of the Social Sciences, 42(4), 511-542. https://doi.org/10.1177/0048393111430761
- * Ross, R.M., Rand, D.G. & Pennycook, G. (2021). Beyond "fake news": analytic thinking and the detection of false and hyperpartisan news headlines. Judgment & Decision Making, 16(2), 484-504. https://doi.org/10.31234/osf.io/cgsx6
- Ruokolainen, H. & Widén, G. (2020). Conceptualising misinformation in the context of asylum seekers. Information Processing & Management, 57(3). https://doi.org/10.1016/j.ipm.2019.102127
- Saiful Islam, M., Sarkar, T., Khan, S.H., Kamal, A.H.M., Hasan, S.M., Kabir, A., Yeasmin, D., Ariful Islam, M., Chowdhury, K.I.A., Anwar, K.S., Chughtai, A.A. & Seale, H. (2020). COVID-19-related infodemic and its impact on public health: a global social media analysis. The American Journal of Tropical Medicine and Hygiene, 103(4), 1621-1629. https://www.ajtmh.org/view/journals/tpmd/103/4/article-p1621.xml https://doi.org/10.4269/ajtmh.20-0812
- * Schaffner, B.F. & Roche, C. (2017). Misinformation and motivated reasoning: responses to economic news in a politicized environment. Public Opinion Quarterly, 81(1), 86-110. https://doi.org/10.1093/poq/nfw043
- Søe, S.O. (2018). Algorithmic detection of misinformation and disinformation: Gricean perspectives. Journal of Documentation, 74(2), 309-332. https://doi.org/10.1108/JD-05-2017-0075
- Søe, S.O. (2021). A unified account of information, misinformation, and disinformation. Synthese, 198, 5929-5949. https://doi.org/10.1007/s11229-019-02444-x
- * Soveri, A., Karlsson, L.C., Antfolk, J., Lindfelt, M. & Lewandowsky, S. (2021). Unwillingness to engage in behaviors that protect against COVID-19: the role of conspiracy beliefs, trust, and endorsement of complementary and alternative medicine. BMC Public Health, 21, article no. 684. https://bmcpublichealth.biomedcentral.com/articles/10.1186/s12889-021-10643-w https://doi.org/10.1186/s12889-021-10643-w
- Stier, S., Kirkizh, N., Froio, C. & Schroeder, R. (2020). Populist attitudes and selective exposure to online news: a cross-country analysis combining Web tracking and surveys. The International Journal of Press/Politics, 25(3), 426-446. https://doi.org/10.1177/1940161220907018
- * Su, Y. (2021). It doesn't take a village to fall for misinformation: social media use, discussion heterogeneity preference, worry of the virus, faith in scientists, and COVID-19-related misinformation beliefs. Telematics and Informatics, 58, paper 101547. https://doi.org/10.1016/j.tele.2020.101547
- * Sui, Y. & Zhang, B. (2021). Determinants of the perceived credibility of rebuttals concerning health misinformation. International Journal of Environmental Research and Public Health, 18(3), 1-17. https://doi.org/10.3390/ijerph18031345
- * Swire‐Thompson, B. & Lazer, D. (2020). Public health and online misinformation: challenges and recommendations. Annual Review of Public Health, 41(1), 433-451. https://doi.org/10.1146/annurev-publhealth-040119-094127
- * Sylvester, S.M. (2021). COVID‐19 and motivated reasoning: the influence of knowledge on COVID‐related policy and health behavior. Social Science Quarterly, 102(5), 2341-2359. https://doi.org/10.1111/ssqu.12989
- * Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769. https://doi.org/10.1111/j.1540-5907.2006.00214.x
- * Treen, K.M., Williams, H.T.P. & O'Neill, S.J. (2020). Online misinformation about climate change. Wiley Interdisciplinary Reviews: Climate Change, 11(5), paper e665. https://onlinelibrary.wiley.com/doi/full/10.1002/wcc.665 https://doi.org/10.1002/wcc.665
- * Vraga, E.K. & Bode, L. (2020). Defining misinformation and understanding its bounded nature: using expertise and evidence for describing misinformation. Political Communication, 37(1), 136-144. https://doi.org/10.1080/10584609.2020.1716500
- * Walter, N. & Tukachinsky, R. (2020). A meta-analytic examination of the continued influence of misinformation in the face of correction: how powerful is it, why does it happen, and how to stop it? Communication Research, 47(2), 155-177. https://doi.org/10.1177/0093650219854600
- * Wang, Y., McKee, M., Torbica, A. & Stuckler, D. (2019). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240, paper 112552. https://doi.org/10.1016/j.socscimed.2019.112552
- World Health Organization. (2021). Coronavirus disease (COVID-19) advice for the public: mythbusters. World Health Organization. https://www.who.int/emergencies/diseases/novel-coronavirus-2019/advice-for-public/myth-busters. (Internet Archive)
Appendix: The coding scheme
- The nature and constituents of the motivated reasoning approach
- general level characterizations and typologies
- directional goal
- accuracy goal
- cognitive factors
- affective factors
- confirmation bias
- disconfirmation bias
- Source of misinformation (or disinformation)
- fake news
- conspiracy theory
- Domain of activity/interest
- health behaviour (e.g., vaccination)
- politics (e.g., voting)
- Type of activity
- exposure to information
- information selection
- information seeking
- information use/information processing
- information sharing
- Evaluation of motivated reasoning as a research approach
- strengths of the approach
- limitations of the approach