Marketplaces of Misinformation: A Study of How Vaccine Misinformation Is Legitimized on Social Media

Cardiff University (Di Domenico); University of Warwick (Nunan); University of Surrey (Pitardi)
"The spread of vaccine misinformation poses a serious threat to public health. ... The dissemination of this content is amplified and facilitated by social media, yet little is known about how this process starts and how it is legitimized..."
Social media's role in disseminating vaccine misinformation has been recognised as a major public health challenge. Just as influencers play a key role in generating engagement on social media, the rise of self-published books on online marketplaces gives individuals a new channel for spreading misinformation. When this content reaches social media, its reach is magnified, feeding into the complex digital information environment. Through a series of experiments, this article examines the processes through which health misinformation from online marketplaces is legitimised and spread.
First, the article reviews literature on health misinformation, social media, and legitimacy. It then introduces the study's methodology, which involved determining what counts as misinformation and selecting books sold on the Amazon marketplace that promoted vaccine-hesitant beliefs. The data analysis comprised two stages. First, the researchers analysed a data set of descriptions of 28 books containing vaccine misinformation on Amazon and 649 Facebook posts linking directly to these books on Amazon. Second, they adopted an experimental approach to empirically test the findings from that analysis.
Study 1, which involved the content analysis of the 28 vaccine misinformation books on Amazon and the 649 associated Facebook posts, examined: (i) how misinformation is legitimised on Amazon through the analysis of book descriptions, (ii) how misinformation is transferred and legitimated by Facebook users, and (iii) how legitimisation occurs between the two digital platforms. The researchers confirmed three dimensions of legitimacy identified through the literature:
- Cognitive legitimacy: conveyed through narratives reporting a conspiratorial view of the world blaming the pharmaceutical establishment, the media system, and the government for promoting their vaccine agenda against individuals' well-being;
- Pragmatic legitimacy: explained by claims about how reading the books will benefit the readers and guide them toward an informed decision about vaccines; and
- Moral legitimacy: indicative of a positive normative evaluation of the organisation and its activities, conferred mainly by "revealing" the "dangerous" composition of vaccines and their consequences.
In addition, the researchers identified two new dimensions:
- Expert legitimacy: evoked through the presentation of the books' authors as experts in their fields; and
- Algorithmic legitimacy: derived from the logic behind the inclusion of a particular book in Amazon's algorithmically determined Best Seller categories or book reviews.
Through this analysis, 12 subthemes emerged from the five dimensions; see Table 2 in the paper, which includes examples of references and quotes.
Study 2, which involved 191 United States (US) consumers, focused on legitimacy development and showed the positive effect of expert cues on the perceived legitimacy of misinformation books while assessing the moderating role of vaccine hesitancy. Specifically, the researchers found that the presence of expert cues increases perceptions of misinformation books' legitimacy in individuals who display a positive attitude toward vaccines, whereas it decreases the same legitimacy perceptions in individuals with high levels of vaccine hesitancy. This result highlights the important role of prior beliefs in information evaluation and legitimacy perceptions. Because vaccine-hesitant individuals distrust science, their information processing is biased towards de-legitimising content shared by individuals who are part of the (untrustworthy, in their view) medical establishment. Similarly, individuals who are supportive of vaccines, and thus of science, trust the expertise of those credentialed people who share vaccine information. As such, the legitimation through expertise identified in this research could potentially instil doubts regarding vaccine effectiveness and increase the general levels of vaccine hesitancy.
Study 3, which involved 399 US consumers, builds on the results from Study 2 and tests the validity of these relationships in a social media context. This study also includes a behavioural measure and examines the mediating role of legitimacy in driving consumers' sharing behaviour on social media. Finally, it tests whether the effects identified change as a function of the type of book (factual vs. misinformation). The results show that the persuasive effect of expertise, operating through legitimacy, positively influences sharing behaviour. Moreover, expert cues significantly increase legitimacy perceptions only for vaccine misinformation books, and the indirect effect of expert cues through legitimacy is magnified when individuals display low levels of vaccine hesitancy.
Based on these findings, the article provides a general discussion and implications for both theory and policy. Noting that this article has identified forms of misinformation that have largely escaped existing regulatory control, the researchers suggest that:
- By focusing on individual platforms, such as Facebook and WhatsApp, policymakers have ignored the complexity of online information flows between platforms. In this study, high-salience misinformation is legitimised through one platform (an online book marketplace), which in turn increases its spread through another platform (a social media site). Where the threat of regulatory pressure is present, particularly with health-related misinformation, social media platforms have shown themselves willing to self-regulate, not through censorship but through labelling and limiting algorithmic promotion to hinder discoverability. The same recommendations apply to the regulation of online marketplaces.
- The forms of misinformation identified in this article represent the misuse of credentials - e.g., the use of medical credentials by physicians who have been banned from practising, or the use of nonmedical credentials (e.g., a doctorate in a nonmedical subject) to imply medical expertise. Policymakers can address the role of professional accreditation in underwriting this legitimacy by using existing legal tools.
The article concludes with some avenues for future research, such as studies that ask: "To what extent are companies responsible for the spreading of misinformation? Should they limit the autonomy of algorithms and, in general, artificial intelligence in making business decisions? To what extent, then, is business ethics compatible with business purposes?"
Journal of Public Policy & Marketing 2022, Vol. 41(4) 319-35. DOI: 10.1177/07439156221103860.











































