Misinformation, Vaccination, and Social Media Use Across the Political Aisle   

January 17, 2023

Two years removed from the onset of the COVID-19 pandemic, Americans’ social, economic, and political lives have been forever changed by the rapid digitalization brought on by social distancing measures. An absence of in-person contact left many Americans to turn to their own devices — quite literally — to maintain their social connections and seek information on the coronavirus pandemic as it developed.

 

The Next Frontier for Public Health? Your Instagram Feed

Nguyen et al. (2020) found that 36 percent of U.S. adults in April 2020 reported using video calls more frequently during the pandemic to connect with friends and family, and 35 percent reported using social media more often for the same purpose. Surveying a representative sample of the U.S. population in January 2021, Neely et al. (2021) also discovered that, among Americans with internet access, 76 percent claimed to rely on social media at least somewhat for news about the coronavirus pandemic, and 32 percent claimed to read COVID-19 related news on social media every day.

Americans’ trust in unmoderated content on social media, which is amplified by a faltering sense of trust in government and health institutions, poses a serious threat to vaccine and health messaging. According to a November 2020 report by the Pew Charitable Trusts, those who followed news through social media were less politically knowledgeable and had higher exposure to COVID-19 misinformation than those who followed any other news source. Despite this, survey research has shown that social media use has the potential to combat misinformation and improve health outcomes. Health information-seeking behavior (HISB) on sites like Instagram and Facebook varies substantially from person to person, and certain behaviors correlate strongly with reductions in vaccine hesitancy. Neely et al. (2021) found that 76 percent of social media users followed at least one authoritative or credible scientific source, and those who followed four or more scientific sources were 25 percentage points more likely to say they would “definitely get vaccinated” than those who followed none.1

 

Digital is Political: Addressing the Elephant (and Donkey) in the Room

Besides the threat of online disinformation, partisanship is often discussed as a key driver of group-level disparities in COVID-19 vaccination. Not only have liberal media sources shone an unflinching spotlight on the polemical pandemic rhetoric used by Republican officials, but studies have shown that the red-blue divide is real and growing. Aggregating county-level data, KFF (Kaiser Family Foundation) found that counties where a majority voted for Joe Biden had an average vaccination rate 13 percentage points higher than those where a majority voted for Donald Trump, a gap that has only grown since the vaccine was introduced. Furthermore, Levin and Bradshaw (2022) analyzed Gallup’s Values and Beliefs of the American Public Survey and found that COVID-19 skepticism and SARS-CoV2 vaccine hesitancy were correlated in “an almost linear fashion” with holding moral and political values aligned with the Republican party.

While disparities in U.S. vaccination rates by race and ethnic group continue to shrink over time, political leaning persists as a barrier for Americans to agree on foundational facts about COVID-19 and the SARS-CoV2 vaccine. As such, this article explores whether trust in social media correlates significantly with vaccine misinformation among U.S. adults conditional on political party preference. More specifically, I seek to answer the following descriptive questions: how misinformed are Republicans and Democrats who trust the COVID-19 content they consume on social media? Does trust in social media uncouple the relationship between political party and COVID-19 misinformation? If so, what strategies might health officials use to mitigate the threat of partisan misinformation online?

 

Methodology

KFF COVID-19 Vaccine Monitor Data

To answer this question, I used microdata from the January 2021 (n=1563) and October 2021 (n=1519) KFF COVID-19 Vaccine Monitors, which are nationally representative RDD (random digit dial) telephone surveys of U.S. adults’ COVID-19 perceptions. These are the only KFF Vaccine Monitors in the Roper Center archive that ask questions on social media use for which datasets have been released, though these questions have been asked by KFF in more recent 2022 Vaccine Monitors. The sampling frames of the two surveys were stratified to provide better coverage of non-Hispanic Black, Hispanic, and vaccine-hesitant communities. KFF conducted surveys in Spanish and English and purposefully oversampled prepaid telephone numbers to improve coverage of lower-income non-white respondents. I used analytical weights generated by KFF to match each sample to the 2019 American Community Survey (ACS) estimates with respect to key demographics and control for differential telephone use.
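The weighting step can be sketched as follows. The column names (got_covid_news_social, weight) are hypothetical stand-ins for the actual KFF microdata fields, and the numbers are purely illustrative:

```python
import pandas as pd

# Illustrative microdata: got_covid_news_social = 1 if the respondent relied on
# social media for COVID-19 news; weight = an analytical weight of the kind KFF
# provides to match the sample to ACS benchmarks and adjust for phone use.
df = pd.DataFrame({
    "got_covid_news_social": [1, 0, 1, 1, 0, 1],
    "weight":                [0.8, 1.2, 1.0, 0.5, 1.5, 1.0],
})

# A weighted proportion divides the weight carried by "yes" respondents
# by the total weight, rather than counting heads.
weighted_share = (df["got_covid_news_social"] * df["weight"]).sum() / df["weight"].sum()
unweighted_share = df["got_covid_news_social"].mean()

print(round(weighted_share, 3))    # 0.55
print(round(unweighted_share, 3))  # 0.667
```

The gap between the two shares is the point of weighting: respondents who are underrepresented relative to the ACS benchmarks carry more weight, so the weighted estimate better reflects the population than a raw count would.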

The KFF COVID-19 Vaccine Monitor is an especially valuable tool for answering questions about online behavior during the coronavirus pandemic because its sampling method is built on random telephone dialing. As mentioned in Footnote 1, much of the previous research on vaccine hesitancy and online behavior has relied on web-based surveys, either weighting or stratifying their samples to resemble the U.S. population on key demographic characteristics such as age and race. Regardless of any sampling corrections, web-based surveys about online behavior open the door to selection bias: the sampling format itself is endogenous, since the online design is likely correlated with respondents’ online behaviors. A random sample of those who can navigate an online survey is unlikely to represent all Americans, and even when the survey is weighted on certain demographics, it oversamples tech-savvy individuals who are active online and more informed about the COVID-19 vaccine. The KFF Vaccine Monitor gives us the opportunity to rigorously test questions about online behavior using a phone survey that is more robust against this selection bias.

 

Statistical Methods and Variable Coding

To describe who aligns with each party and where they are getting their information from, I first ran weighted crosstabs and t-tests of both surveys. I then turned to the October 2021 Vaccine Monitor to explore the relationship between political party, trust in social media platforms, and COVID-19 misinformation. I generated a misinformation index by aggregating eight true or false questions in the October 2021 survey that asked whether respondents found common misconceptions about COVID-19 to be true, including whether ivermectin is an effective treatment for COVID-19, whether COVID-19 vaccines can change your DNA, and whether the government is hiding deaths due to the COVID-19 vaccine. I weighted each of these misconceptions equally and generated an index from 0 to 1, with 1 being an individual who found all eight statements to be true. To test for internal reliability, I calculated Cronbach’s alpha for this index and received an acceptable scale reliability coefficient of 0.74. I repeated this process to generate a second index for mere exposure to these misconceptions.
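As a sketch, the index construction and reliability check might look like the following. The response matrix here is simulated, so the alpha it produces will not match the 0.74 reported from the actual survey data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) item matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Simulated answers to the eight true/false items (1 = endorses the misconception)
rng = np.random.default_rng(0)
responses = (rng.random((200, 8)) < 0.3).astype(int)

# Equal-weight index from 0 to 1: the share of the eight misconceptions endorsed
misinfo_index = responses.mean(axis=1)

alpha = cronbach_alpha(responses)
```

The same recipe, pointed at the exposure items instead of the belief items, produces the second index.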

I then conducted a series of four increasingly specific ordinary least squares (OLS) regression models. The first model tested the relationship between the COVID-19 misinformation index and political party affiliation; the second added relevant demographic controls (age, race/ethnic group, income, region, parental status, and whether the respondent identified as Evangelical Christian); the third added respondents’ trust in online media sources; and the fourth added the interaction of online trust and political party to determine whether the harms of social media use are moderated by identifying as a Republican or Democrat. The series of models was then repeated with exposure to misinformation as the outcome variable. I omitted those who identified as independent to focus the analysis on the potency of aligning with a political party. All regressions were conducted with heteroskedasticity-consistent standard errors (the vce(hc3) option in Stata). Since this study is solely exploratory, these OLS regressions are used as a tool to test the strength of correlational relationships while controlling for potential confounders.
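A minimal Python analogue of the fourth model, using simulated data and statsmodels (whose cov_type="HC3" option mirrors Stata's vce(hc3)), might look like this. The variable names and effect sizes are illustrative, not the actual KFF fields or estimates:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the October 2021 microdata
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "republican":   rng.integers(0, 2, n),  # 1 = Republican, 0 = Democrat
    "trust_social": rng.integers(0, 2, n),  # 1 = trusts at least one platform
    "age":          rng.integers(18, 90, n),
})
# Build an outcome where trust raises misinformation only for Republicans
df["misinfo"] = (0.20 + 0.17 * df["republican"]
                 + 0.14 * df["republican"] * df["trust_social"]
                 + rng.normal(0, 0.10, n))

# Model 4: party, trust, and their interaction, plus a demographic control,
# with heteroskedasticity-consistent (HC3) standard errors
model = smf.ols("misinfo ~ republican * trust_social + age",
                data=df).fit(cov_type="HC3")
print(model.params["republican:trust_social"])  # ~0.14 by construction
```

The formula syntax republican * trust_social expands to both main effects plus their interaction, so the interaction coefficient directly measures how much the trust effect differs for Republicans relative to Democrats.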

 

Results

  1. Differences Between Democrats and Republicans

Those who identified as Republicans in the January 2021 survey were significantly more likely to be male (diff = -0.059; p = 0.038), less likely to hold a bachelor’s degree (diff = -0.071; p = 0.008), and overwhelmingly white. Figure 1 below shows the weighted racial composition of Democrats and Republicans in the January 2021 Vaccine Monitor. Of the four census regions, there were significantly more Democrats in the Northeast (diff = 0.053; p = 0.016) and significantly more Republicans in the South (diff = 0.06; p = 0.030). Furthermore, Democrats sampled in the January 2021 Vaccine Monitor were more likely than Republicans to fall into lower-income bins. By October 2021, these differences between Republicans and Democrats persisted, and they grew in the cases of gender and college degree attainment.

 

  2. Who’s Looking for COVID-19 News on Social Media? (January 2021)

The January 2021 survey provides some clues regarding how the use of social media for health information-seeking is distributed across the U.S. population. Weighted cross-tabulations show that, overall, 53.9% of the sample relied on social media sites at least “a little” in the past two weeks for information on COVID-19. However, reliance on social media sources differed by party; less than half (47%) of Republicans received information on COVID-19 through social media in the past two weeks, while nearly 60 percent (59%) of Democrats did.

 

Figure 1. Race/Ethnic Breakdown by Political Party in KFF COVID-19 Vaccine Monitor Sample (January 2021)

 

Interestingly, YouTube is uniquely relied upon for COVID-19 information by those in lower-income groups and those without a 4-year college degree, perhaps due to the accessibility of video content. Furthermore, reliance on social media for news about COVID-19 differs greatly by racial and ethnic group: while a little less than one-half of the Non-Hispanic (NH) white sample claimed to use social media in the past two weeks to get information on COVID-19, two-thirds (67%) of Hispanic and 57 percent of NH Black Americans relied on social media sites. Reliance also varied by platform within these groups. Of all social media sites, Black respondents were significantly more likely to get COVID-19 news from Instagram (diff = 0.081; p < 0.001) than the rest of the sample. Likewise, Hispanic respondents were significantly more likely to get COVID-19 information from Instagram (diff = 0.095; p < 0.001), Facebook (diff = 0.071; p = 0.018), and YouTube (diff = 0.121; p < 0.001) than all other respondents. Overall, respondents of color were 12 percentage points more likely to get information from social media than NH white respondents.

 

  3. Who Trusts Social Media, Anyway? (October 2021)

The follow-up survey in October 2021 asked a different question about news sources: how much respondents trust COVID-19 and vaccine information coming from Facebook, Instagram, TikTok, Twitter, or YouTube. Overall, 24% of the entire sample trusted information from a social media site at least somewhat. While Democrats trusted social media sources significantly more than Republicans, the difference was only roughly six percentage points (despite the much larger difference in who was looking at social media content in January 2021). Racial and ethnic distinctions persisted; in October, 32% of respondents of color trusted at least one social media source for COVID-19 information, while only 19% of NH white respondents said the same (diff = 0.130; p < 0.001). Of all social media sources, respondents were most likely to express trust in YouTube (13%), with around 1 in 5 NH Black and Hispanic respondents saying they trust the video and streaming platform. However, the differences across education and income observed in January 2021 lose their significance when analyzing media trust in October 2021.

 

  4. Regression Output - Party and Trust in Social Media (October 2021)

Using the October 2021 KFF Vaccine Monitor data, a four-step regression model was employed to explore how trust in social media and political party correlate with COVID-19 misinformation (see Methodology for more details). Table 1 indicates that, holding nothing else constant, identifying as a Republican is associated with a 17 percentage point increase in COVID-19 and SARS-CoV2 vaccine misinformation. Considering the index comprises eight separate misconceptions, Republicans on average believed a little more than one additional incorrect fact compared to Democrats in the sample. Controlling for demographics does not substantially change the significance of this relationship. Holding demographics and political party constant, trust in at least one social media platform (Column 3) is associated with a 9 percentage point increase in misinformation (p < .01). Interacting trust in social media with party yields much more nuanced results: while trusting content on social media does not significantly correlate with misinformation among Democrats, for Republicans it is associated with a 14 percentage point increase in the misinformation index. Political party remains a statistically significant covariate, suggesting that trust in social media and its interaction with party do not capture all of the variation in belief of COVID-19-related misinformation attributable to political party (see Table 1).

 

Table 1. Belief in COVID-19 Misinformation by Party and Trust in Social Media


The four-step regression model was repeated with whether respondents had been exposed to these vaccine misconceptions at all as the outcome. While exposure did not differ significantly by political party in the simple regression model, it did correlate with trust in online sources for COVID-19 information (see Table 2, Column 3). However, Column 4 suggests that Democrats on social media are more protected from misinformation than other Democrats (a 4 percentage point decrease in exposure), while Republicans on social media experience added risk (a 15 percentage point increase).
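Reading the Column 4 coefficients side by side clarifies the size of the interaction. The numbers below are the approximate effects reported above, rounded for illustration:

```python
# In the interaction model
#   exposure = b0 + b_rep*Rep + b_trust*Trust + b_int*(Rep*Trust),
# the trust effect for Democrats is b_trust, and for Republicans
# it is b_trust + b_int.
b_trust = -0.04          # Democrats who trust social media: ~4 pp lower exposure
rep_trust_effect = 0.15  # Republicans who trust social media: ~15 pp higher exposure

# The interaction coefficient is the gap between the two trust effects
b_int = rep_trust_effect - b_trust
print(round(b_int, 2))  # 0.19
```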

 

Table 2. Exposure to COVID-19 Misinformation by Party and Social Media Trust

Figures 2 and 3 graphically depict the interactions between media trust and political party for each of the five social media platforms asked about in the October 2021 Vaccine Monitor. Democrats’ foundational understanding of COVID-19 and vaccine safety appears, on average, to be independent of their trust in any of the five platforms (Figure 2). Meanwhile, holding demographic characteristics constant, Republicans who trust TikTok are twice as likely to believe COVID-19 misconceptions as Republicans who trust no social media site. For every site, the gap between Democrats and Republicans who trust that site is at least twice as large as the gap among those who trust no social media site, and all of these differences are statistically significant at the 5% level. The gap is driven almost entirely by spikes in Republican misinformation.

Figure 2. Platform-Specific Belief in Misinformation by Political Party

 


 

Looking at exposure, the relationship between platform-specific trust and misinformation persists (see Figure 3): trust in every single platform was associated with decreased exposure to COVID-19 misinformation for Democrats and increased exposure for Republicans. Republicans who placed trust in any site had heightened exposure to misinformation compared to Republicans with no trust in COVID-19 information on social media, while for Democrats social media seems to be protective against misinformation. For example, Republicans who placed trust in TikTok had an exposure index roughly 25 percentage points higher than Republicans who did not; on the other hand, Democrats who placed trust in TikTok had an exposure index 9 percentage points lower than Democrats who did not. Looking solely at how party moderates the effect of social media, the interaction terms were statistically significant for TikTok (coef = .389; p < 0.001), YouTube (coef = .220; p = 0.001), and Instagram (coef = .248; p = 0.038).

 

Figure 3. Platform-Specific Exposure to Misinformation by Political Party

Looking to the Future

The results of our descriptive work yield novel results regarding the prevalence of misinformation among Republican and Democrat social media users, as well as several findings that support the existing literature on vaccine hesitancy. Similar to the analysis conducted by Levin and Bradshaw (2022) using Gallup’s Values and Beliefs of the American Public Survey, I found that Republican social media users are substantially more misinformed than the general public, while Democrats who use social media are not any better or worse off. These results imply that believing and being exposed to misinformation online is a risk faced mainly by those with conservative political ideals, but more rigorous research designs (pretest-posttest survey designs, for example) must be conducted to establish time order and tease out any sort of causal inference.

Controlling for social media trust did not completely uncouple political party identification from belief in common COVID-19 misconceptions (the effect size dropped from ~16 to ~12 percentage points). However, trust in sites like TikTok and YouTube for COVID-19-related information was significantly correlated with Republicans’ belief in misinformation, and the strength of this relationship was greater than the one ascribed to party preference alone (Figure 2). Exposure to misinformation was nearly identical for Republicans and Democrats who did not trust social media for COVID-19 information, but misinformation exposure increased for Republicans and shrank for Democrats who trusted these sites (Figure 3). This raises the question of a social media “echo chamber”: through self-selection into groups and algorithmic content curation, individuals are often offered information that aligns with their interests and beliefs, further polarizing groups that are more hesitant or eager to get vaccinated. It is also notable that the largest platform-specific differences between Democrats and Republicans were on TikTok, YouTube, and Instagram, platforms that are driven by visual content and feature robust algorithms for recommending new content.

Looking forward, causal research designs are needed to test the complex mechanisms through which party might influence online behavior and vaccine misinformation. Doing so would be a first step toward dismantling the echo chamber and revealing the dangers that online misinformation can pose to those with conservative politics. Targeted ad campaigns for conservatives (painting the SARS-CoV2 vaccine as a religious/moral obligation or showing military endorsement) have been used by nonprofits and government agencies to counteract vaccine hesitancy online. While controversial, microtargeting (targeting subsets of subsets) has been a highly effective way to make use of limited resources when popularizing a health campaign. Breza et al. (2021) found that large-scale social media advertising bridged the American partisan gap in reducing holiday travel distance and, subsequently, COVID-19 infection rates. Future research could also explore how content moderation policies could work to close the red/blue misinformation gap, and how the qualities of each social media platform might lend them to being effective tools for guiding users towards (or misguiding users away from) well-informed health practices.

By Andreas Psahos
Roper Center Kohut Fellow

Notes

  1. While these results by Neely et al. (2021), Nguyen et al. (2020), and the Pew Charitable Trusts inform my focus on vaccine hesitancy and internet use, their web-based sampling design leaves them vulnerable to positive selection bias. Those who are recruited for and can take online surveys are more likely to use video calling and social media than the nation at large, which would positively bias any estimates about the frequency of online behaviors in these studies.