
Information Research

Vol. 29 No. 2 2024

Reasons to fight: preliminary results on motivations to combat fake news

Wenting Yu and Qing Yan

DOI: https://doi.org/10.47989/ir292596

Abstract

Introduction. To encourage the public to combat online fake news and revalue truth, it is important to explore the factors that affect individual intention to combat fake news.

Method. This study provides answers using survey data from a representative sample collected in the U.S. (N = 804).

Analysis. We examined the impacts of planned-behaviour-theory components and prior experience of being deceived by fake news on the intentions of news verification, fake news refutation, and fact-checks sharing, with demographic characteristics, media use, and media credibility under control. The study also examined prior experience as a moderator in the models.

Results. Results showed that subjective norms and prior experience of being deceived by fake news were positively correlated with intentions of all three behaviours that help to combat fake news. Prior experience moderated the effect of subjective norms on fake news refutation, and the effect of perceived control on fact-checks sharing.

Conclusion. The findings of this study help scholars and industry practitioners to understand audiences’ interaction with online information and what drives audiences to combat information fakeness. Prior experience of being deceived by fake news is a significant driver.

Introduction

Reading news online has become a major trend. A poll showed that more than eight-in-ten (86%) Americans get news from digital devices (Shearer, 2021). However, the professionalism of online news is not guaranteed, resulting in the dissemination of fake news, which refers to false information in a news format (Allcott and Gentzkow, 2017; Tandoc et al., 2018). Online news’ pursuit of timeliness and popularity facilitates the production and spread of fake news (McManus, 2009). In addition, some websites and social bots specialise in misleading people and amplifying conflict to increase revenue (Mourão and Robertson, 2019).

Fake news can cause harmful perceptions and behaviours in society. The dissemination of fake news reflects an indifference to truth, which erodes social trust and undermines democracy in the long run (MacKenzie and Bhatt, 2020). Having well-informed decision-makers is essential to democratic societies, while misleading information damages the information system (Kuklinski et al., 2000). Empirical studies have confirmed that the spread of falsehood is highly correlated with homophily, polarization, and decreasing social trust and political participation (Bessi et al., 2015; Einstein and Glick, 2015; Lewandowsky et al., 2017; Van Prooijen and Douglas, 2017).

Given the negative influence of fake news and the difficulty of wiping it out, more and more experts have started to study individual efforts in combating fake news (Kim et al., 2019; Tang et al., 2022; Zhao et al., 2016), that is, the behaviours or strategies that help to stop the spread of fake news and reduce its impact. There is an expectation that if audiences can recognise, resist, and react to fake news, its viral spread can be restrained (Demartini, 2019; Fowler, 2020; Robertson, 2019). However, few studies have systematically investigated and compared audiences’ behaviours in combating fake news. To fill this gap, this study investigated the motivations of three behaviours that help to stop the dissemination of fake news: news verification, fake news refutation, and fact-checks sharing, and compared their differences.

To understand the psychological mechanism behind individual choices in combating fake news, we adopted the well-examined framework of the theory of planned behaviour (TPB). The TPB suggests that behavioural intention is driven by three factors regarding that behaviour: (a) attitudes, (b) subjective norms, and (c) perceived behavioural control (Ajzen, 1985, 1991). This study further added prior experience of being deceived by fake news to the model as a moderator. By studying individual factors that may affect the intention to combat fake news, this study provides empirical evidence for future interventions that encourage individuals to combat fake news.

In summary, this study first conceptualised three behaviours that help to stop the dissemination of fake news (i.e., news verification, fake news refutation, and fact-checks sharing), and then examined how the intentions of these three behaviours are influenced by attitudes, subjective norms, perceived behavioural control, and personal experience with fake news, with survey data collected in the United States. The implications of the findings are discussed.

Literature review

The individual combat against fake news

In this changing media environment, audiences have become important news guardians and fake news stoppers (Pearson and Kosicki, 2017; Thorson and Wells, 2015). Audiences are the gatekeepers of online news information. If they can identify fake news and act on it, the viral dissemination of fake news can be slowed down.

Fortunately, the interactive online environment not only helps the dissemination of fake news but also provides opportunities for audiences to oppose it. An increasing number of audiences actively disseminate truth against fake news online, taking the role of news guardians (Vo and Lee, 2018). To promote the value of truth in online news dissemination, it is important to understand how audiences interact with news and fake news. This study reviewed more than 50 interdisciplinary studies on individual engagement in combating fake news (e.g., Amazeen et al., 2019; Schwarzenegger, 2020; Shin and Thorson, 2017; Tandoc et al., 2018, 2020; Vraga et al., 2020), and identified three types of individual behaviours that can help combat fake news: news verification (Edgerly et al., 2020), fake news refutation (Tandoc et al., 2020), and fact-checks sharing (Yu et al., 2022).

News verification

News verification refers to an individual’s behaviour of determining whether a news story exists or is true. Society has realised the vital role of news verification at the individual level in combating fake news. For example, journalists have posted online tutorials about detecting fake news (Ferregl, 2020; Wendling, 2017). The application WhatsApp developed a chatbot for users to check message authenticity (Dhawan, 2020). Schools and organisations are offering news literacy education to teach students to verify fake news in the internet era (Tugend, 2020).

An increasing number of academic studies have researched the intention of news verification, focusing on the impact of media-related factors. For example, an experiment found that participants exhibited a higher intention to verify a news headline when they believed it was true, a belief predicted by perceived congruency with preexisting ideological leanings (Edgerly et al., 2020). Respondents who had more exposure to fake news tended to verify news more (Müller and Schulz, 2019; Yu, 2021). Vraga and colleagues (2020) suggest that news verification is a behaviour driven by news literacy, which is ‘knowledge of the personal and social processes by which news is produced, distributed, and consumed, and skills that allow users some control over these processes’ (p. 4). Ideally, audiences who have news literacy skills will verify news when they consume it and identify the incorrect information (Vraga et al., 2020).

Fake news refutation

Audiences show various responses to fake news, such as sharing it, reporting it, and correcting it. This study investigates fake news refutation (e.g., reporting fakeness to online platforms), a response that helps to combat fake news. Websites such as Google, Facebook, and Twitter welcome users to report fake news since it helps them spot problematic content (Murray, 2016). Correcting posts containing fake news by commenting is also a positive response that can help to combat fake news. According to social cognitive theory, the feedback we receive shapes our outcome expectations and thus influences behaviour (Bandura, 1989). Previous studies found that online feedback (e.g., comments and private conversations) affects receivers’ attitude change (Ayeh et al., 2013; Baber et al., 2016; Hsueh et al., 2015; Lee et al., 2008; Sassenberg and Jonas, 2007). Peers’ responses with fact-check sources succeeded in correcting misperceptions about the causes of the Zika virus on both Facebook and Twitter (Vraga and Bode, 2018).

Fake news refutation can help correct other topically related misbeliefs that are not specifically contained in the fake news (Bode et al., 2020). However, a survey in the U.K. showed that the vast majority (78.8%) of British social media users had not corrected other social media users for sharing problematic news (Chadwick and Vaccari, 2019). The predictors of refutation are understudied. To encourage fake news correction among ordinary people, understanding the motivations behind fake news refutation is urgent.

Fact-checks sharing

Although fake news has a remarkable effect on changing people’s perceptions and attitudes (Chan et al., 2017; Crozier and Strange, 2019; Swire et al., 2017; Wintersieck, 2017), many studies have found that fact-checks can successfully correct misbeliefs (Bode and Vraga, 2018; Hameleers and van der Meer, 2020). A meta-analysis confirmed the positive influence of fact-checking messages (Walter and Murphy, 2018). Moreover, individuals’ fact-checks sharing increases other users’ chances of ‘pre-bunking’ exposure, that is, being exposed to fact-checks before fake news (Bode and Vraga, 2021, p. 2). Pre-bunking has been found to be effective in inoculating people against misbeliefs (Cook et al., 2017; van der Linden et al., 2017).

There is an overlap between fake news refutation and fact-checks sharing: some audiences attach a fact-checking message when refuting a fake-news post online. However, this study still examines the two behaviours separately. Refuting fake news refers to the context where an individual sees fake news and reacts to it, while sharing fact-checks does not require an individual to know about the fake news beforehand. In the ‘pro-truth pledge’ suggested by Tsipursky and colleagues (2018), sharing the truth and encouraging others to retract inaccurate information are two different actions.

In contrast with the rapid spread of fake news, audiences show much less interest in sharing fact-checks, although they perceive support for fact-checks sharing from friends and online communities (Pal et al., 2019). By analysing rumour cascades on Twitter from 2006 to 2017, one study found that fake news reaches more audiences and spreads faster than true news (Vosoughi et al., 2018). Previous studies mostly examined the impacts of information attributes (e.g., the rating of the message, source credibility) on fact-checks sharing (Yu et al., 2022). Only one study has examined an element of the TPB model in encouraging fact-checks sharing: Pal et al. (2019) found that a fact-check message that increases the perceived norms of fact-checks sharing also increases the intention to share fact-checks.

Using an extended model of TPB to explain the audience's intention of combating fake news

Scholars suggest that when and why audiences interact with news or fake news is even more important than whether they interact at all (Edgerly et al., 2020). To encourage positive acts of combating fake news, understanding the motives is necessary. This study adopted the framework of the extended TPB to explain the motivations behind fake-news combating behaviours.

The TPB is one of the most widely examined theories for predicting human behaviour (Ajzen, 1985, 1991, 2011). It is a psychological theory that explains how people’s beliefs influence their intentions and actions, assuming that people act rationally and plan their behaviours. Specifically, the TPB suggests that behaviour is driven by behavioural intention, and that the intention is influenced by three factors regarding that behaviour: (a) attitudes, (b) subjective norms, and (c) perceived behavioural control. Behavioural intentions then predict actual behaviour. The model has received positive support from several meta-analyses (Hausenblas et al., 1997; Tyson et al., 2014; Weigel et al., 2014). Given that the TPB model includes people’s basic beliefs about the behaviours and has strong predictive power, some studies have adopted the TPB to explain media engagement, such as using online media for campaigning (Marcinkowski and Metag, 2014), co-creating services for commercial industries on social media (Cheung and To, 2016), and social networking behaviour (Pelling and White, 2009).

As combating fake news is an intentional and planned action that depends on the beliefs and perceptions of the individual, the studied behaviours may be related to the components of TPB. In media research, TPB has been used to explain media technology adoption and social networking behaviours, which are actions that can generate positive social impacts (Marcinkowski and Metag, 2014; Pelling and White, 2009). Although fake news refutation and fact-checks sharing are positive actions from the perspective of improving the information environment and guarding the truth, these behaviours can also have negative impacts on individuals’ social relationships (e.g., by hurting the recipient's self-esteem) (Jun et al., 2017; Tandoc et al., 2020). This study aims to examine whether TPB can be used to predict social behaviours that may exert negative influences on social relationships.

Components of TPB

Attitude means an individual’s evaluation of the behaviour, such as the benefits, costs, or effects of performing it. People who hold a positive attitude toward a behaviour are more likely to take action (Ajzen, 1991). Subjective norms refer to an individual’s perception of whether important others think they should engage in the behaviour, which usually has a positive relation with behavioural intention (Ajzen, 1991). Perceived behavioural control describes an individual's perception of the resources, ability, and sense of control they have in successfully performing the behaviour. Perceived behavioural control has demonstrated a consistently positive effect on intentions to perform the investigated behaviour (Ajzen, 1991).

However, the relationships between the components of TPB and behavioural intentions are not always supported. For example, perceived behavioural control was not significantly related to social networking behaviour (Pelling and White, 2009). Some have criticised the TPB model for considering only individual beliefs and not accounting for contextual factors (Yan et al., 2021), as there may be external factors or constraints (e.g., environment, policy, social norms) that limit a person’s ability to perform the behaviour, regardless of their intention or beliefs.

This study compares how audiences’ general cognitions of combating fake news predict the intentions of news verification, fake news refutation, and fact-checks sharing. Although scholars have different opinions on the explanatory power of TPB, we hypothesised the following based on a large amount of empirical evidence (Hausenblas et al., 1997; Tyson et al., 2014; Weigel et al., 2014):

H1: A positive attitude toward combating fake news will be positively related to the intentions of news verification (H1a), fake news refutation (H1b), and fact-checks sharing (H1c).

H2: Subjective norms regarding combating fake news will be positively related to the intentions of news verification (H2a), fake news refutation (H2b), and fact-checks sharing (H2c).

H3: Perceived behavioural control over combating fake news will be positively related to the intentions of news verification (H3a), fake news refutation (H3b), and fact-checks sharing (H3c).

Adding prior experience to the TPB model

The influence of prior experience on behavioural choice has been found in studies from different fields (Amador et al., 2013; Wang and Huang, 2021; Xu et al., 2017). It has been argued that the impact of prior experience on current behaviour even outperforms the impact of cognition (Sutton, 1994). Therefore, prior experience has been adopted to extend the TPB in many studies (Ajzen, 2020; Bagozzi and Kimmel, 1995; Lee, 2009; Yao and Linz, 2008), where the association between prior experience and current behaviour has been mostly found to be positive. There are two explanations for the impact of prior experience. One explanation is that people need less information about behaviour when they have previously executed the behaviour, suggesting prior experience works as a source of information (Ajzen, 2002b; Verplanken et al., 1997). Another explanation is that individuals may simply have the intention of following what they have done previously, which means prior experience serves as an automatic motive (Bargh, 1996; Sommer, 2011).

This study is interested in the association between people’s previous experience with fake news and the intention to combat fake news. Prior experience of technology usage was found to be correlated with the adoption of new technology, such as personal computers (Ling et al., 2010; Teo and Lim, 1996). Also, past interaction with social media can predict news sharing on social media (Lee and Ma, 2012). Similarly, previous experience with fake news may motivate further engagement in combating it, which is supported by the effect of direct experience with the environment on actions in social learning theory (Bandura and Walters, 1977). We posited that prior experience of being deceived by fake news would activate the intention to prevent oneself or others from being deceived.

H4: Prior experience with fake news will be positively related to the intentions of news verifying (H4a), fake news refutation (H4b), and fact-checks sharing (H4c).

Verplanken and his colleagues (1997) suggest that prior experience can also moderate the effects of TPB components on goal-directed behaviours. Although many audiences can see the harms of fake news and perceive the subjective norms and ability of combating fake news, they might not react to it until they have been victims themselves. Therefore, we posited the following:

H5: Audiences who have a more positive attitude toward combating fake news show more intentions of verifying news (H5a), fake news refutation (H5b), and sharing fact-checks (H5c) if they have more experience of encountering fake news.

H6: Audiences who perceive more subjective norms regarding combating fake news show more intentions of verifying news (H6a), fake news refutation (H6b), and sharing fact-checks (H6c) if they have more experience of encountering fake news.

H7: Audiences who perceive more behavioural control over combating fake news show more intentions of verifying news (H7a), fake news refutation (H7b), and sharing fact-checks (H7c) if they have more experience of encountering fake news.

In summary, we conceptualised three behaviours of combating fake news: news verification, fake news refutation, and fact-checks sharing. As these behaviours are mostly intentional and planned, we adapted the framework of TPB to examine the motives of these behaviours.

Method

In this section, we report the sampling method, data collection procedure, survey flow, sample profile, and measurements of this study. We report the mean value, standard deviation, and scale reliability of each measurement.

Sample

The data was collected in the U.S. through an online survey. Before the formal data collection, a pilot test with 20 American respondents was conducted to test the questionnaire’s clarity and instructions on May 1, 2019. To make the questions more understandable, we modified some wordings based on the feedback.

The formal survey was conducted from 6 to 15 May 2019. Data collection was outsourced to a survey company, CloudResearch, which maintains an international panel of respondents from diverse backgrounds. The recruitment targeted American respondents aged 18–65. Quota sampling was executed based on the age distribution of American internet users from the U.S. Census Bureau (U.S. Census Bureau, 2019). The response rate was 82%. The demographic data of the respondents are shown in Table 1.

To avoid confusion, we explained the definition of news to the respondents at the beginning of the survey: ‘News in this survey refers to information about current news events’. The definition of news followed the questionnaire of the General Social Survey (GSS) (Smith et al., 2018). The GSS is a project of the independent research organisation NORC at the University of Chicago, with principal funding from the National Science Foundation, and GSS data are widely used in social science studies. A filter question, ‘Have you consumed news from the internet in the past 12 months?’, was asked before the survey; those who answered ‘no’ were not allowed to enter the survey.

In total, 826 respondents completed the survey. We excluded 22 responses that finished the survey in less than 5 minutes, which generated a final sample of 804 respondents. The final sample included 26.9% aged between 18 and 29 (n = 216), 43.7% aged between 30 and 49 (n = 352), and the remaining 29.4% aged between 50 and 65 (n = 236). The average age was 40.02 (SD = 13.28). More than half of the respondents were female (n = 463, 57.6%).
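
For transparency, the screening step can be expressed in a few lines of code. The following is a minimal sketch in Python, assuming a raw export with one row per respondent and a hypothetical duration_seconds column (both the file name and the column name are ours, not the original survey export):

    import pandas as pd

    # Hypothetical raw export of the 826 completed responses.
    raw = pd.read_csv('survey_responses.csv')

    # Drop responses completed in under 5 minutes (300 seconds),
    # mirroring the exclusion rule described above.
    sample = raw[raw['duration_seconds'] >= 300].copy()
    print(len(sample))  # 804 with the original data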

Gender
Male: 337 (41.9%)
Female: 463 (57.6%)
Other: 4 (0.5%)

Age
18–29: 216 (26.9%)
30–49: 352 (43.7%)
50–65: 236 (29.4%)

Education
Less than high school: 4 (0.5%)
High school/GED: 206 (25.6%)
Two-year college degree: 115 (14.3%)
Four-year college degree: 337 (41.9%)
Master’s degree or above: 142 (17.7%)

Ethnicity
White/Caucasian: 614 (76.4%)
Black or African-American: 82 (10.2%)
American Indian or Alaska Native: 14 (1.7%)
Asian/Pacific Islander: 62 (7.7%)
Multi-racial: 19 (2.4%)
Other: 13 (1.6%)

Table 1. Demographic profile of respondents (N = 804)

Measurements

The intention of news verification. We followed Edgerly et al. (2020) to measure the intention of news verification. The items are shown in Table 2. Since this study focused on online fake news, participants were asked how likely they would perform the listed behaviours when they read online information about current news events using a 7-point Likert scale, where 1 signified ‘extremely unlikely’ and 7 signified ‘extremely likely’. The average scores of the items formed the index of news verification (M = 4.66, SD = 1.26, α = .73).

M SD α
News verification 4.66 1.26 .73
Check other major news outlets 5.35 1.67
Ask friends/family members 4.28 1.86
Use a search engine 5.40 1.60
Check social media (e.g., Facebook, Twitter) 3.84 2.08
Consult some other sources 4.43 1.85
Fake news refutation 3.56 1.64 .85
Tell the poster by sending private messages 3.46 1.96
Tell the poster by leaving comments 3.91 1.98
Report it to the platform 3.80 1.99
Forward it and announce it as inaccurate information 3.97 1.99
Fact-checks sharing 4.05 1.74 .73
Share it on social media (e.g., Twitter) 3.52 2.03
Tell people I know, like my friends and family by private contact 4.59 1.88

Table 2. Measuring the intentions of news verification, fake news refutation, and fact-checks sharing (N = 804)

The intention of fake news refutation. The measurement of fake news refutation was adapted from Tandoc et al.’s (2020) study, and the items are listed in Table 2. Respondents were asked how likely they would do the listed behaviours when they read inaccurate online information about current news events. A 7-point Likert scale was used for rating, where 1 signified ‘extremely unlikely’ and 7 signified ‘extremely likely’. The average scores of the items formed the index of fake news refutation (M = 3.56, SD = 1.64, α = .85).

The intention of fact-checks sharing. For the measurement of fact-checks sharing, respondents indicated how likely they were to ‘Share it on online platforms (e.g., Facebook, Twitter)’ and ‘Tell people I know, like my friends and families’ when they see fact-checking messages on the internet. A 7-point Likert scale was used for rating, where 1 signified ‘extremely unlikely’ and 7 signified ‘extremely likely’. The average scores of the items formed the index of fact-checks sharing (M = 4.05, SD = 1.74, α = .73).

Components of the TPB. The measures of subjective norms and perceived behavioural control were adapted from Ajzen’s (Ajzen, 1991, 2002a) and Taylor and Todd's (1995) studies. The average scores of the items formed the index of each scale. The items measuring subjective norms (M = 5.17, SD = 1.33, α = .85) were ‘People who influence my behaviour would think that I should combat inaccurate online information about current news events’ and ‘People who are important to me would think that I should combat inaccurate online information about current news events when I read them’. The items measuring perceived behavioural control (M = 5.71, SD = 1.06, α = .91) were ‘I would be able to combat fake news’, ‘I have the resources and the ability to combat inaccurate online information about current news events’, and ‘Combating inaccurate online information about current news events is entirely within my control’. For the measures of subjective norms and perceived behavioural control, participants were asked to rate the statements on a 7-point Likert scale, where 1 meant ‘strongly disagree’ and 7 meant ‘strongly agree’. For the attitude toward combating fake news, respondents were asked to what extent they thought combating inaccurate online information about current news events was ‘foolish/wise’, ‘not rewarding/rewarding’, and ‘not valuable/valuable’, on bipolar adjective 7-point scales. Using bipolar adjective scales to measure attitude can capture both the direction and intensity of the attitude, as well as the balance between two opposing qualities (Fishman et al., 2021). The higher scores indicated a more positive attitude toward combating fake news (M = 5.95, SD = 1.09, α = .89).

Prior experience of being deceived by fake news. Adapted from the scale of prior exposure (Southwell and Torres, 2006), respondents were asked to rate two items on a 5-point Likert scale, from 1 ‘never’ to 5 ‘all the time’. The items used to measure the prior experience of being deceived by fake news were (1) ‘Believed a message about current news events, which turned out to be untrue later’ and (2) ‘Posted/forwarded a message about current news events, which turned out to be untrue later’. The average scores of these two items formed the index of the scale (M = 2.33, SD = 0.85, α = .98).
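
All indices in this section were built the same way: item scores are averaged per respondent, and Cronbach’s alpha summarises scale reliability. The original analysis was run in SPSS 24 (see Results); purely as an illustration, a minimal Python sketch of both computations is shown below (the item values and column names are invented for the example):

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Invented responses to the two prior-experience items (5-point scale).
    pe_items = pd.DataFrame({
        'believed_untrue':  [2, 3, 1, 4, 2, 3],
        'forwarded_untrue': [2, 4, 1, 3, 2, 3],
    })
    prior_experience = pe_items.mean(axis=1)  # mean-score index per respondent
    print(round(cronbach_alpha(pe_items), 2))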

Control variables. Age, gender, education, race, party identification, online news consumption, trust in mass media, trust in social media, and online interaction were under control. Respondents were asked to report their age with an exact number (M = 40.02, SD = 13.28). Respondents indicated their gender, race, and highest education level from the given options; the frequencies are listed in Table 1. For party identification, we followed previous studies (Chatard et al., 2010; Kappe and Schuster, 2022) and asked respondents: ‘We hear a lot of talk these days about liberals and conservatives. Where would you place yourself on the conservative-liberal scale?’ Respondents rated their identification from 0 ‘extremely conservative’ to 10 ‘extremely liberal’ (M = 6.19, SD = 2.68). Individual behaviours of combating fake news involve online news use and interaction behaviours. For example, audiences who do not leave online comments at all will not leave a comment saying that a post is fake news; therefore, we also measured online news consumption, trust in mass media, trust in social media, and online interaction behaviours as control variables. For online news consumption, we followed the GSS survey in 2018 (Smith et al., 2018) and asked ‘How often did you get information about current news events from the internet in the past 12 months?’ Respondents rated the item on a 7-point Likert scale, from 1 ‘never’ to 7 ‘all the time’ (M = 4.66, SD = 0.68). The measurement of media trust followed the American National Election Studies (ANES, 2018), asking respondents to rate their general trust in mass media and social media to report the news fully, accurately, and fairly, from 1 ‘none’ to 5 ‘a great deal’. In general, respondents had more trust in news on mass media (M = 3.03, SD = 0.68) than on social media (M = 2.34, SD = 1.08). For online interaction, we asked how often respondents did the following: share information on the internet (e.g., on online forums, Instagram, Facebook, Twitter), comment on a post or news article on the internet (e.g., on news websites, online forums, Instagram, Facebook, Twitter), and chat with others on the internet (e.g., on WhatsApp, Facebook). The average scores formed the index of online interaction (M = 2.86, SD = 1.13, α = .69).

Results

The mean scores of the measured behavioural intentions are listed in Table 2. In general, respondents were more likely to verify news than to refute fake news or share fact-checks. Among the measured news verification behaviours, checking other major news outlets and using a search engine were the two most popular methods. As for fake news refutation, respondents preferred to forward the fake news and announce it as inaccurate information, followed by telling the poster by leaving comments. Compared to sharing fact-checks on an online platform, respondents were more likely to share fact-checks privately with people they know.

OLS regression models were constructed in SPSS 24 to analyse the data. Standardised coefficients and R-square values of each hierarchical model are reported to indicate effect size. The effects of the control variables on the three dependent variables are shown in Table 3. Males were less likely to share fact-checks than females (b = -.08, p < .01). Younger adults were more likely to verify news than older adults (b = -.09, p < .01). Respondents who consumed more online news tended to verify news more (b = .14, p < .001). Trust in social media was positively correlated with the intentions of news verification (b = .16, p < .001), fake news refutation (b = .15, p < .001), and fact-checks sharing (b = .12, p < .001). Online interaction was also positively correlated with the intentions of news verification (b = .27, p < .001), fake news refutation (b = .45, p < .001), and fact-checks sharing (b = .46, p < .001). The intentions of news verification, fake news refutation, and fact-checks sharing did not vary by education, race, party identification, or trust in mass media.

News verification  Fake news refutation Fact-checks sharing
Gender -.04 -.06 -.08**
Age -.09** -.01 .01
Education -.03 -.09 .02
Black or African-American -.01 .01 .02
American Indian or Alaska Native -.03 -.03 -.03
Asian/Pacific Islander -.00 -.03 .02
Multi-racial -.01 .01 .00
Party .02 -.01 -.01
Online news consumption .14*** -.05 .01
Trust in mass media .05 .05 -.04
Trust in social media .16*** .15*** .12**
Online interaction .27*** .45*** .46***
R2 21.8%*** 31.3%*** 30.0%***

Note: * p < .05, ** p < .01, *** p < .001.
The table reports standardised coefficient beta.
The listed race groups were compared to ‘White/Caucasian’.
Table 3. Predicting the intentions of news verification, fake news refutation, and fact-checks sharing with control variables (N = 804)

The effects of the TPB components on the intentions of news verification, fake news refutation, and fact-checks sharing were examined with hierarchical regression models of the extended TPB, with the control variables held constant. The results are listed in Table 4. For parsimony, the effects of the variables in Table 3 are not repeated here. H1 posited that a positive attitude toward the combat against fake news would be positively related to the intentions of news verification (H1a), fake news refutation (H1b), and fact-checks sharing (H1c). Only H1a was supported by the data (b = .17, p < .001).

                                News verification    Fake news refutation    Fact-checks sharing
Attitude                        .17*** (28.0%***)    -.01 (31.8%*)           .05 (30.8%**)
Subjective norms                .24*** (32.2%***)    .19** (34.8%***)        .12*** (32.2%***)
Perceived behavioural control   -.01 (32.6%)         .05 (33.2%)             .06 (32.3%*)
Prior experience                .08* (32.6%***)      .22*** (38.0%***)       .21*** (35.0%***)
PE*ADa                          -.02 (32.7%)         -.02 (38.0%)            .04 (35.1%)
PE*SNb                          .01 (32.6%)          .07* (38.4%*)           .05 (35.3%)
PE*PBCc                         .04 (32.6%)          .02 (38.0%)             .10** (35.8%**)

Note: * p < .05, ** p < .01, *** p < .001.
Each interaction term entered the model separately. The table reports standardised coefficient betas, with model R-squares in parentheses.
The interaction terms have been centred.
a ‘PE*AD’ refers to the interaction term of prior experience and attitude, b ‘PE*SN’ refers to the interaction term of prior experience and subjective norms, c ‘PE*PBC’ refers to the interaction term of prior experience and perceived behavioural control.
Table 4. Predicting the intentions of news verification, fake news refutation, and fact-checks sharing with the extended TPB model (N = 804)
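
The original models were estimated in SPSS 24. As a rough illustration of the modelling logic, that is, entering the TPB components and prior experience first and then one centred interaction term at a time, the following Python sketch fits comparable OLS models with statsmodels on simulated stand-in data (effect sizes loosely echo Table 4; the variable names are ours, and the control variables are omitted for brevity):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 804

    # Simulated stand-in for the z-scored survey indices.
    df = pd.DataFrame(rng.standard_normal((n, 4)),
                      columns=['attitude', 'subj_norms', 'pbc', 'prior_exp'])
    df['refute'] = (0.19 * df['subj_norms'] + 0.22 * df['prior_exp']
                    + 0.07 * df['prior_exp'] * df['subj_norms']
                    + rng.standard_normal(n))

    # Step 1: TPB components plus prior experience.
    m1 = smf.ols('refute ~ attitude + subj_norms + pbc + prior_exp', data=df).fit()
    # Step 2: add one interaction term at a time; the predictors here are
    # already centred because they were drawn as z-scores.
    m2 = smf.ols('refute ~ attitude + subj_norms + pbc + prior_exp'
                 ' + prior_exp:subj_norms', data=df).fit()
    print(m1.rsquared, m2.rsquared, m2.params['prior_exp:subj_norms'])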

The data supported H2a-c: subjective norms regarding combating fake news were positively related to the intentions of news verification (b = .24, p < .001), fake news refutation (b = .19, p < .01), and fact-checks sharing (b = .12, p < .001).

H3a-c posited that perceived behavioural control would be positively correlated with the intentions of combating fake news; these hypotheses were not supported. The level of perceived behavioural control was not significantly related to the intentions of news verification, fake news refutation, or fact-checks sharing.

H4a-c suggested that prior experience of being deceived by fake news would be positively correlated with the intentions of combating fake news, which was supported by the data. People who reported more prior experience of being deceived by fake news were more willing to verify news (b = .08, p < .05), refute fake news (b = .22, p < .001), and share fact-checks (b = .21, p < .001).

We then proceeded to examine H5–H7, which hypothesised that prior experience moderates the effects of the TPB components on news verification, fake news refutation, and fact-checks sharing. H6b was supported: prior experience of being deceived by fake news moderated the effect of subjective norms on the intention of fake news refutation (b = .07, p < .05). The moderation effect is shown in Figure 1. Compared to audiences who perceived a lower level of subjective norms regarding combating fake news, audiences who perceived a higher level of subjective norms showed more intention to refute fake news as they encountered more fake news. In addition, H7c was supported: prior experience of being deceived by fake news moderated the effect of perceived behavioural control on the intention of fact-checks sharing (b = .10, p < .01). Figure 2 shows that audiences who perceived a lower level of control over combating fake news were less willing to share fact-checking messages as they encountered more fake news, while those who perceived a higher level of control had more intention to share fact-checking messages as their prior experience of encountering fake news increased.

Figure 1. The interaction effect of prior experience and subjective norms on the intention of fake news refutation

Figure 2. The interaction effect of prior experience and perceived behavioural control on the intention of fact-checks sharing
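
Since the figures are not reproduced here, a simple-slopes plot in the spirit of Figure 1 can be sketched from the standardised coefficients reported in Table 4 (b = .22 for prior experience, b = .19 for subjective norms, b = .07 for their interaction; the intercept is fixed at zero purely for display):

    import numpy as np
    import matplotlib.pyplot as plt

    # Standardised coefficients from Table 4 for fake news refutation.
    b_pe, b_sn, b_int = 0.22, 0.19, 0.07

    pe = np.linspace(-2, 2, 50)  # prior experience, in z-scores
    for sn, label in [(-1, 'Subjective norms: -1 SD'), (1, 'Subjective norms: +1 SD')]:
        yhat = b_pe * pe + b_sn * sn + b_int * pe * sn
        plt.plot(pe, yhat, label=label)

    plt.xlabel('Prior experience of being deceived (z-score)')
    plt.ylabel('Predicted intention of fake news refutation (z-score)')
    plt.legend()
    plt.show()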

In general, the components of the extended TPB showed consistent patterns in their relationships with the intentions of news verification, fake news refutation, and fact-checks sharing. Subjective norms and prior experience of being deceived by fake news were positively correlated with all three behaviours. Perceived behavioural control was not significantly correlated with any of the three behaviours. Attitude was only significantly correlated with news verification.

Discussion and conclusion

Using a representative U.S. sample, this study investigated and compared the predictors of three behaviours combating fake news: news verification, fake news refutation, and fact-checks sharing. The study helps scholars and industry practitioners to understand audiences’ interaction with fake news, and what drives audiences to pursue truth rather than fakeness in the hybrid media environment.

First, it should be noted that audiences were not very enthusiastic about combating fake news, which is similar to the results of investigations in other regions of the world, such as Singapore (Tandoc et al., 2020) and the U.K. (Chadwick and Vaccari, 2019). Meanwhile, only 6.3% of respondents (n = 51) reported never having believed fake news previously, and only 37.1% (n = 298) said they had never posted or forwarded fake news previously. Therefore, understanding the motivations for combating fake news is needed.

As suggested by the model of TPB (Ajzen, 1985), subjective norms were significantly and positively related to the intentions of all three fake-news combating behaviours. The finding in this survey echoes what Pal et al. (2019) found in their experiment: fact-checks that increase the perceived norms of fact-checks sharing motivate sharing behaviour. Therefore, to cultivate news guardians, one thing that governments, NGOs, and schools can do is to promote news literacy education and cultivate a consensus on combating fake news. Fortunately, some of these groups have already taken action in the area of news literacy education (Tugend, 2020).

Attitude towards combating fake news was positively correlated with news verification, but not with the other two behaviours, fake news refutation and fact-checks sharing. The results may reflect differences in the nature of the three behaviours. Fake news refutation and fact-checks sharing mean that individuals need to inform others that they made a mistake, which may have negative impacts on one’s social relationships (Jun et al., 2017; Tandoc et al., 2020). Compared to fake news refutation and fact-checks sharing, news verification is less likely to cause negative feelings and reactions from others. In other words, when a behaviour may cause negative influences, individual attitudes towards the behaviour have less power in explaining behavioural intention.

Perceived behavioural control was not significantly related to the three combating behaviours. One explanation is that, in a country with a high internet penetration rate such as the U.S., most of the respondents have the ability to conduct the three investigated behaviours; thus, the reason for the low frequency of fake-news combating among individuals does not lie in ability. The finding implies that in countries where individuals have high internet skills, policy makers do not need to focus on teaching individuals how to combat fake news, but rather on providing incentives for the behaviours, for example, recognition by local or online communities.

Furthermore, individuals who had prior experience of being deceived by fake news were more likely to verify news, refute fake news, and share fact-checks. The findings further confirm that prior experience plays an important role in predicting behavioural intentions alongside TPB components (Ajzen, 2020). The findings have implications for the internet industry and information managers. Internet companies should consider allowing users to tag or review browsed messages in an easier way, which can help users better store and compare online information. In addition, social media platforms and fact-checkers should promote fact-checks to groups that encounter or are influenced by fake news more frequently, such as those who use alternative media sources and those who have strong preexisting opinions (Scherer and Pennycook, 2020; Wang and Yu, 2022).

Prior experience moderated the effect of subjective norms on fake news refutation. Those who perceived a higher level of subjective norms were more likely to refute fake news as their prior experience of being deceived by fake news increased, compared to those who perceived a lower level of subjective norms. Prior experience also moderated the effect of perceived control on fact-checks sharing. Those who had more prior experience of being deceived by fake news and perceived a higher level of control over combating fake news were more likely to share fact-checks, while those who perceived a lower level of control were less likely to share fact-checks when they had more prior experience of being deceived by fake news. Therefore, social norms of combating fake news and interventions that advance skills of combating fake news should be promoted, especially among those who are more likely to encounter or fall for fake news.

An interesting finding is that online interaction was positively correlated with the intentions of news verification, fake news refutation, and fact-checks sharing. In addition, similar to previous studies on credibility and intentions of combating fake news (Edgerly et al., 2020; Kim et al., 2019), respondents who had more trust in social media were more likely to combat online fake news. In other words, actions of combating fake news do not necessarily relate to distrust and disuse. Internet companies should not worry about a loss of reputation when users combat online fake news; instead, to further foster trust and user loyalty, they should provide more channels and assistance to active and loyal users in combating fake news.

It is important to point out the limitations of this study and the implications for future studies. First, to investigate and compare people’s intentions to combat fake news in daily news reading, this study used self-reported data related to ‘inaccurate news’, while it is hard to tell whether respondents could correctly identify inaccurate news. Second, as this study only used cross-sectional data, it could not establish causal relationships between variables. Future studies can further examine the motivations of actual behaviours (i.e., news verification, fake news refutation, and fact-checks sharing) through experiments.

In conclusion, not all the components of the extended TPB had a strong relationship with the behavioural intentions of combating fake news. The findings showed that subjective norms and prior experience with fake news were two important factors that drive the behaviours of combating fake news. Perceived behavioural control, however, was not related to the intention of any of the three combating behaviours. One possible reason is that adopting the three combating behaviours is not very challenging for internet users. A positive attitude towards combating fake news was only positively related to news verification, a behaviour that does not necessarily involve social interactions or cause negative impacts on social relationships. Therefore, we suggest that when studying interactive behaviours, especially behaviours that may bring negative feelings to others, such as fake news refutation and fact-checks sharing, researchers should take social relationships and the interactive environment into consideration.

Acknowledgements

The project is supported by the Fundamental Research Funds for the Central Universities (23NJYH10).

About the authors

Wenting Yu (PhD) is an Assistant Professor at the Department of Chinese and Bilingual Studies, The Hong Kong Polytechnic University. She is interested in political communication, health communication, and social media communication. wenting.yu@polyu.edu.hk

Qing Yan (PhD) is a Professor in the School of Journalism and Communication at Jinan University. His research focuses on entertainment and media culture especially in the context of new media. yanqing2008@163.com

References

Ajzen, I. (1985). From intentions to actions: a theory of planned behavior. In J. Kuhl & J. Beckmann (Eds.), Action Control: From Cognition to Behavior (pp. 11-39). Springer. https://doi.org/10.1007/978-3-642-69746-3_2

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179-211. https://doi.org/10.1016/0749-5978(91)90020-T

Ajzen, I. (2002a). Perceived behavioral control, self‐efficacy, locus of control, and the theory of planned behavior. Journal of Applied Social Psychology, 32(4), 665-683. https://doi.org/10.1111/j.1559-1816.2002.tb00236.x

Ajzen, I. (2002b). Residual effects of past on later behavior: habituation and reasoned action perspectives. Personality and Social Psychology Review, 6(2), 107-122. https://doi.org/10.1207/S15327957PSPR0602_02

Ajzen, I. (2011). The theory of planned behaviour: reactions and reflections. Psychology & Health, 26, 1113-1127. https://doi.org/10.1080/08870446.2011.613995

Ajzen, I. (2020). The theory of planned behavior: Frequently asked questions. Human Behavior and Emerging Technologies, 2(4), 314-324. https://doi.org/10.1002/hbe2.195

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236. https://doi.org/10.1257/jep.31.2.211

Amador, F. J., González, R. M., & Ramos-Real, F. J. (2013). Supplier choice and WTP for electricity attributes in an emerging market: the role of perceived past experience, environmental concern and energy saving behavior. Energy Economics, 40, 953-966. https://doi.org/10.1016/j.eneco.2013.06.007

Amazeen, M. A., Vargo, C. J., & Hopp, T. (2019). Reinforcing attitudes in a gatewatching news era: Individual-level antecedents to sharing fact-checks on social media. Communication Monographs, 86(1), 112-132. https://doi.org/10.1080/03637751.2018.1521984

ANES. (2018). The American National Election Studies. https://electionstudies.org/ (Internet Archive)

Ayeh, J. K., Au, N., & Law, R. (2013). “Do we believe in TripAdvisor?” Examining credibility perceptions and online travelers’ attitude toward using user-generated content. Journal of Travel Research, 52(4), 437-452. https://doi.org/10.1177/0047287512475217

Baber, A., Thurasamy, R., Malik, M. I., Sadiq, B., Islam, S., & Sajjad, M. (2016). Online word-of-mouth antecedents, attitude and intention-to-purchase electronic products in Pakistan. Telematics and Informatics, 33(2), 388-400. https://doi.org/10.1016/j.tele.2015.09.004

Bagozzi, R. P., & Kimmel, S. K. (1995). A comparison of leading theories for the prediction of goal‐directed behaviours. British Journal of Social Psychology, 34(4), 437-461. https://doi.org/10.1111/j.2044-8309.1995.tb01076.x

Bandura, A., & Walters, R. H. (1977). Social learning theory (Vol. 1). Englewood Cliffs, NJ: Prentice Hall.

Bargh, J. A. (1996). Automaticity in action: the unconscious as repository of chronic goals and motives. In P. Gollwitzer & J. A. Bargh (Eds.), The Psychology of Action: Linking Cognition and Motivation to Behavior (p.457-481). The Guilford Press.

Bessi, A., Coletto, M., Davidescu, G. A., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Science vs conspiracy: collective narratives in the age of misinformation. PLoS One, 10(2), e0118093. https://doi.org/10.1371/journal.pone.0118093

Bode, L., & Vraga, E. K. (2018). See something, say something: Correction of global health misinformation on social media. Health Communication, 33(9), 1131-1140. https://doi.org/10.1080/10410236.2017.1331312

Bode, L., & Vraga, E. K. (2021). Correction Experiences on Social Media During COVID-19. Social Media + Society, 7(2), 20563051211008829. https://doi.org/10.1177/20563051211008829

Bode, L., Vraga, E. K., & Tully, M. (2020). Do the right thing: tone may not affect correction of misinformation on social media. Harvard Kennedy School Misinformation Review, 1(4), 1-12. https://doi.org/10.37016/mr-2020-026

Chadwick, A., & Vaccari, C. (2019). News sharing on UK social media: misinformation, disinformation, and correction. Survey Report. https://repository.lboro.ac.uk/articles/News_sharing_on_UK_social_media_misinformation_disinformation_and_correction/9471269

Chan, M.-p. S., Jones, C. R., Hall Jamieson, K., & Albarracín, D. (2017). Debunking: a meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 28(11), 1531-1546. https://doi.org/10.1177/0956797617714579

Chatard, A., Arndt, J., & Pyszczynski, T. (2010). Loss shapes political views? terror management, political ideology, and the death of close others. Basic and Applied Social Psychology, 32(1), 2-7. https://doi.org/10.1080/01973530903539713

Cheung, M. F., & To, W.-M. (2016). Service co-creation in social media: an extension of the theory of planned behavior. Computers in Human Behavior, 65, 260-266. https://doi.org/10.1016/j.chb.2016.08.031

Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing misinformation through inoculation: exposing misleading argumentation techniques reduces their influence. PLoS One, 12(5), e0175799. https://doi.org/10.1371/journal.pone.0175799

Crozier, W. E., & Strange, D. (2019). Correcting the misinformation effect. Applied Cognitive Psychology, 33(4), 585-595. https://doi.org/10.1002/acp.3499

Demartini, G. (2019, September 23). Users (and their bias) are key to fighting fake news on Facebook – AI isn’t smart enough yet. The Conversation. https://theconversation.com/users-and-their-bias-are-key-to-fighting-fake-news-on-facebook-ai-isnt-smart-enough-yet-123767

Dhawan, B. (2020, 5 May). WhatsApp forwards can now be checked for misinformation using this chatbot! Financial Express. https://www.financialexpress.com/industry/technology/whatsapp-forwards-can-now-be-checked-for-misinformation-using-this-chatbot/1948540/

Edgerly, S., Mourão, R. R., Thorson, E., & Tham, S. M. (2020). When do audiences verify? How perceptions about message and source influence audience verification of news headlines. Journalism & Mass Communication Quarterly, 97(1), 52-71. https://doi.org/10.1177/1077699019864680

Einstein, K. L., & Glick, D. M. (2015). Do I think BLS data are BS? The consequences of conspiracy theories. Political Behavior, 37(3), 679-701. https://doi.org/10.1007/s11109-014-9287-z

Ferregl, E. (2020, 5 August). Navigating fake news: How Americans should deal with misinformation online. Duke Today. https://today.duke.edu/2020/08/navigating-fake-news-how-americans-should-deal-misinformation-online

Fishman, J., Yang, C., & Mandell, D. (2021). Attitude theory and measurement in implementation science: a secondary review of empirical studies and opportunities for advancement. Implementation Science, 16, Article 87. https://doi.org/10.1186/s13012-021-01153-9

Fowler, G. A. (2020, 5 June). You are probably spreading misinformation. Here’s how to stop. The Washington Post. https://www.washingtonpost.com/technology/2020/06/05/stop-spreading-misinformation/

Geiger, A. W. (2019, September 11). Key findings about the online news landscape in America. https://www.pewresearch.org/fact-tank/2019/09/11/key-findings-about-the-online-news-landscape-in-america/

Hameleers, M., & van der Meer, T. G. (2020). Misinformation and polarization in a high-choice media environment: how effective are political fact-checkers? Communication Research, 47(2), 227-250. https://doi.org/10.1177/0093650218819671

Hausenblas, H. A., Carron, A. V., & Mack, D. E. (1997). Application of the theories of reasoned action and planned behavior to exercise behavior: a meta-analysis. Journal of Sport and Exercise Psychology, 19(1), 36-51. https://doi.org/10.1123/jsep.19.1.36

Hoffman, J. (2019). How anti-vaccine sentiment took hold in the United States. The New York Times. https://www.nytimes.com/2019/09/23/health/anti-vaccination-movement-us.html

Hsueh, M., Yogeeswaran, K., & Malinen, S. (2015). “Leave your comment below”: can biased online comments influence our own prejudicial attitudes and behaviors? Human Communication Research, 41(4), 557-576. https://doi.org/10.1111/hcre.12059

Jun, Y., Meng, R., & Johar, G. V. (2017). Perceived social presence reduces fact-checking. Proceedings of the National Academy of Sciences, 114(23), 5976-5981. https://doi.org/10.1073/pnas.1700175114

Kappe, R., & Schuster, C. (2022). Agents of past principals: the lasting effects of incumbents on the political ideology of bureaucrats. European Journal of Political Research, 61(3), 807-828. https://doi.org/10.1111/1475-6765.12473

Kim, A., Moravec, P. L., & Dennis, A. R. (2019). Combating fake news on social media with source ratings: The effects of user and expert reputation ratings. Journal of Management Information Systems, 36(3), 931-968. https://doi.org/10.1080/07421222.2019.1628921

Kuklinski, J. H., Quirk, P. J., Jerit, J., Schwieder, D., & Rich, R. F. (2000). Misinformation and the currency of democratic citizenship. Journal of Politics, 62(3), 790-816.

Lee, C. S., & Ma, L. (2012). News sharing in social media: the effect of gratifications and prior experience. Computers in Human Behavior, 28(2), 331-339. https://doi.org/10.1016/j.chb.2011.10.002

Lee, J., Park, D.-H., & Han, I. (2008). The effect of negative online consumer reviews on product attitude: an information processing view. Electronic Commerce Research and Applications, 7(3), 341-352. https://doi.org/10.1016/j.elerap.2007.05.004

Lee, M. C. (2009). Understanding the behavioural intention to play online games. Online Information Review, 33(5), 849-872. https://doi.org/10.1108/14684520911001873

Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008

Ling, K. C., Chai, L. T., & Piew, T. H. (2010). The effects of shopping orientations, online trust and prior online purchase experience toward customers' online purchase intention. International Business Research, 3(3), 63. https://doi.org/10.5539/ibr.v3n3p63

MacKenzie, A., & Bhatt, I. (2020). Opposing the power of lies, bullshit and fake news: the value of truth. Postdigital Science and Education, 2(1), 217-232. https://doi.org/10.1007/s42438-019-00087-2

Marcinkowski, F., & Metag, J. (2014). Why do candidates use online media in constituency campaigning? An application of the theory of planned behavior. Journal of Information Technology & Politics, 11(2), 151-168. https://doi.org/10.1080/19331681.2014.895690

McManus, J. H. (2009). The commercialization of news. In The Handbook of Journalism Studies. Routledge. https://doi.org/10.4324/9780203877685-25

Mourão, R. R., & Robertson, C. T. (2019). Fake news as discursive integration: an analysis of sites that publish false, misleading, hyperpartisan and sensational information. Journalism Studies, 20(14), 2077-2095. https://doi.org/10.1080/1461670X.2019.1566871

Murray, A. (2016, November 22). How to report fake news to social media. BBC News. https://www.bbc.com/news/38053324

Nielsen, R. K., & Graves, L. (2017). “News you don't believe”: audience perspectives on fake news. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/our-research/news-you-dont-believe-audience-perspectives-fake-news

Pal, A., Chua, A. Y. K., & Hoe-Lian Goh, D. (2019). Debunking rumors on social media: the use of denials. Computers in Human Behavior, 96, 110-122. https://doi.org/10.1016/j.chb.2019.02.022

Pearson, G. D., & Kosicki, G. M. (2017). How way-finding is challenging gatekeeping in the digital age. Journalism Studies, 18(9), 1087-1105. https://doi.org/10.1080/1461670X.2015.1123112

Pelling, E. L., & White, K. M. (2009). The theory of planned behavior applied to young people's use of social networking web sites. CyberPsychology & Behavior, 12(6), 755-759. https://doi.org/10.1089/cpb.2009.0109

Robertson, A. (2019, 3 December). How to fight lies, tricks, and chaos online. The Verge. https://www.theverge.com/21276897/fake-news-facebook-twitter-misinformation-lies-fact-check-how-to-internet-guide

Sassenberg, K., & Jonas, K. J. (2007). Attitude change and social influence on the net. In Oxford Handbook of Internet Psychology (pp. 273-288). Oxford University Press.

Scherer, L. D., & Pennycook, G. (2020). Who is susceptible to online health misinformation? American Journal of Public Health, 110, S276-S277. https://doi.org/10.2105/AJPH.2020.305908

Schwarzenegger, C. (2020). Personal epistemologies of the media: selective criticality, pragmatic trust, and competence–confidence in navigating media repertoires in the digital age. New Media & Society, 22(2), 361-377. https://doi.org/10.1177/1461444819856919

Shearer, E. (2021, 12 January). More than eight-in-ten Americans get news from digital devices. https://www.pewresearch.org/fact-tank/2021/01/12/more-than-eight-in-ten-americans-get-news-from-digital-devices/

Shin, J., & Thorson, K. (2017). Partisan selective sharing: the biased diffusion of fact-checking messages on social media. Journal of Communication, 67(2), 233-255. https://doi.org/10.1111/jcom.12284

Smith, T. W., Davern, M., Freese, J., & Morgan, S. (2018). General Social Surveys [data set]. NORC at the University of Chicago. https://gssdataexplorer.norc.org

Sommer, L. (2011). The theory of planned behaviour and the impact of past behaviour. International Business & Economics Research Journal (IBER), 10(1). https://doi.org/10.19030/iber.v10i1.930

Southwell, B. G., & Torres, A. (2006). Connecting interpersonal and mass communication: science news exposure, perceived ability to understand science, and conversation. Communication Monographs, 73(3), 334-350. https://doi.org/10.1080/03637750600889518

Sutton, S. (1994). The past predicts the future: interpreting behaviour–behaviour relationships in social psychological models of health behaviour. In D. R. Rutter & L. Quine (Eds.), Social psychology and health: European perspectives (pp. 71–88). Avebury/Ashgate Publishing Co.

Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. (2017). Processing political misinformation: comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802. https://doi.org/10.1098/rsos.160802

Tandoc, E. C., Lim, D., & Ling, R. (2020). Diffusion of disinformation: how social media users respond to fake news and why. Journalism, 21(3), 381-398. https://doi.org/10.1177/1464884919868325

Tandoc, E. C., Lim, Z. W., & Ling, R. (2018). Defining “fake news”: a typology of scholarly definitions. Digital Journalism, 6(2), 137-153. https://doi.org/10.1080/21670811.2017.1360143

Tandoc, E. C., Ling, R., Westlund, O., Duffy, A., Goh, D., & Zheng Wei, L. (2018). Audiences’ acts of authentication in the age of fake news: a conceptual framework. New Media & Society, 20(8), 2745-2763. https://doi.org/10.1177/1461444817731756

Tang, Z., Miller, A. S., Zhou, Z., & Warkentin, M. (2022). Understanding rumor combating behavior on social media. Journal of Computer Information Systems, 62(6), 1112-1124. https://doi.org/10.1080/08874417.2021.1983486

Teo, T. S., & Lim, V. K. (1996). Factors influencing personal computer usage: the gender gap. Women in Management Review, 11(8), 18-26. https://doi.org/10.1108/09649429610148746

Thorson, K., & Wells, C. (2015). How gatekeeping still matters: understanding media effects in an era of curated flows. In T. Vos & F. Heinderyckx (Eds.), Gatekeeping in Transition (pp. 25-44). Routledge. https://doi.org/10.4324/9781315849652

Tsipursky, G., Votta, F., & Roose, K. M. (2018). Fighting fake news and post-truth politics with behavioral science: the pro-truth pledge. Behavior and Social Issues, 27, 47-70. https://doi.org/10.2139/ssrn.3138238

Tugend, A. (2020, February 20). These students are learning about fake news and how to spot it. The New York Times. https://www.nytimes.com/2020/02/20/education/learning/news-literacy-2016-election.html

Tyson, M., Covey, J., & Rosenthal, H. E. (2014). Theory of planned behavior interventions for reducing heterosexual risk behaviors: a meta-analysis. Health Psychology, 33(12), 1454-1467. https://doi.org/10.1037/hea0000047

U.S. Census Bureau. (2019). Age and Sex Composition in the United States: 2019. https://www.census.gov/data/tables/2019/demo/age-and-sex/2019-age-sex-composition.html

van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008. https://doi.org/10.1002/gch2.201600008

Van Prooijen, J.-W., & Douglas, K. M. (2017). Conspiracy theories as part of history: the role of societal crisis situations. Memory Studies, 10(3), 323-333. https://doi.org/10.1177/1750698017701615

Verplanken, B., Aarts, H., & Van Knippenberg, A. (1997). Habit, information acquisition, and the process of making travel mode choices. European Journal of Social Psychology, 27(5), 539-560. https://doi.org/10.1002/(SICI)1099-0992(199709/10)27:5<539::AID-EJSP831>3.0.CO;2-A

Vo, N., & Lee, K. (2018). The rise of guardians: fact-checking URL recommendation to combat fake news. In Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval. https://doi.org/10.1145/3209978.3210037

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. https://doi.org/10.1126/science.aap9559

Vraga, E. K., & Bode, L. (2018). I do not believe you: how providing a source corrects health misperceptions across social media platforms. Information, Communication & Society, 21(10), 1337-1353. https://doi.org/10.1080/1369118X.2017.1313883

Vraga, E. K., Tully, M., Maksl, A., Craft, S., & Ashley, S. (2020). Theorizing news literacy behaviors. Communication Theory, 31(1), 1-21. https://doi.org/10.1093/ct/qtaa005

Walter, N., & Murphy, S. T. (2018). How to unring the bell: a meta-analytic approach to correction of misinformation. Communication Monographs, 85(3), 423-441. https://doi.org/10.1080/03637751.2018.1467564

Wang, T., & Yu, W. (2022). Alternative sources use and misinformation exposure and susceptibility: the curvilinear moderation effects of socioeconomic status. Telematics and Informatics, 70, 101819. https://doi.org/10.1016/j.tele.2022.101819

Wang, W., & Huang, Y. (2021). Countering the “harmless e-cigarette” myth: the interplay of message format, message sidedness, and prior experience with e-cigarette use in misinformation correction. Science Communication, 43(2), 170-198. https://doi.org/10.1177/1075547020974384

Weigel, F. K., Hazen, B. T., Cegielski, C. G., & Hall, D. J. (2014). Diffusion of innovations and the theory of planned behavior in information systems research: a meta-analysis. Communications of the Association for Information Systems, 34(1), 31. https://doi.org/10.17705/1CAIS.03431

Wendling, M. (2017, January 30). Solutions that can stop fake news spreading. BBC News. https://www.bbc.com/news/blogs-trending-38769996

Wintersieck, A. L. (2017). Debating the truth: the impact of fact-checking during electoral debates. American Politics Research, 45(2), 304-331. https://doi.org/10.1177/1532673X16686555

Xu, L., Ling, M., Lu, Y., & Shen, M. (2017). Understanding household waste separation behaviour: testing the roles of moral, past experience, and perceived policy effectiveness within the theory of planned behaviour. Sustainability, 9(4), 625. https://doi.org/10.3390/su9040625

Yan, Z., Li, Z., Panadero, E., Yang, M., Yang, L., & Lao, H. (2021). A systematic review on factors influencing teachers’ intentions and implementations regarding formative assessment. Assessment in Education: Principles, Policy & Practice, 28(3), 228-260.

Zhao, L., Yin, J., & Song, Y. (2016). An exploration of rumor combating behavior on social media in the context of social crises. Computers in Human Behavior, 58, 25-36. https://doi.org/10.1016/j.chb.2015.11.054