The Impact of Toxic Trolling Comments on Anti-vaccine YouTube Videos

Indiana University Bloomington (Miyazaki, Kwak, An); Sugakubunka Co., Ltd. (Uchiba); Tokyo Institute of Technology (Sasahara)
"...research bears essential implications for managing public health messaging and online communities, particularly in moderating fear-mongering messages about vaccines on social media."
YouTube has become a primary source of information for many individuals regarding vaccines. The comment section on YouTube videos not only provides feedback for video creators but also serves as a venue for communication and information sharing among viewers. Comments on online content shape viewers' perceptions of the content itself. However, the comment section is often plagued by uncivil comments by so-called "trolls", particularly on anti-vaccine videos. Online toxicity can incite fear, a connection especially pertinent in the context of vaccine hesitancy. Drawing on 484 anti-vaccine videos and 414,436 corresponding comments, the quantitative analysis in this study examines the relationship between toxicity and fear in the comment sections of YouTube videos related to anti-vaccine content.
This study examined whether toxicity in the comments on a video is associated with the level of fear expressed in those same comments. To do so, the researchers first analysed the relationship between toxicity and fear at the video level. A toxic comment is defined as "a rude, disrespectful, or unreasonable comment that is likely to make you leave a discussion"; such comments include those that are offensive, negative, or hateful. The researchers employed a machine learning approach - Google's Perspective API and a RoBERTa-based model - to quantify toxicity and fear levels in each comment, then computed their mean for each video.
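The scoring-and-aggregation step described above can be sketched as follows. The `score_toxicity` and `score_fear` functions here are hypothetical keyword-based stubs standing in for the Perspective API and the RoBERTa-based model, whose exact pipelines the article does not detail; only the per-video averaging reflects the stated method.

```python
from collections import defaultdict
from statistics import mean

def score_toxicity(comment: str) -> float:
    """Hypothetical stub: a real pipeline would query the Perspective API
    and read the returned TOXICITY probability in [0, 1]."""
    return min(1.0, comment.lower().count("hate") * 0.5)

def score_fear(comment: str) -> float:
    """Hypothetical stub: a real pipeline would run a RoBERTa-based
    fear classifier and return a score in [0, 1]."""
    return min(1.0, comment.lower().count("scared") * 0.5)

def per_video_means(comments):
    """comments: iterable of (video_id, text) pairs.
    Returns {video_id: (mean_toxicity, mean_fear)} as in the study's
    video-level aggregation."""
    tox, fear = defaultdict(list), defaultdict(list)
    for video_id, text in comments:
        tox[video_id].append(score_toxicity(text))
        fear[video_id].append(score_fear(text))
    return {v: (mean(tox[v]), mean(fear[v])) for v in tox}
```

For example, `per_video_means([("v1", "I hate this"), ("v1", "so scared"), ("v2", "nice video")])` averages the two scores per video, yielding `(0.25, 0.25)` for `v1` and `(0.0, 0.0)` for `v2` under these stub scorers.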
First, the researchers established that toxicity and fear are not strongly related in individual comments; if the two were strongly correlated or effectively equivalent at that level, any subsequent results at the aggregated level would be trivial. They then analysed both the video and the comment levels, specifically early and later comments, while controlling for other relevant variables, to probe the association between toxicity and fear in comments. At the video level, they found a substantial connection between the two, suggesting that toxicity and fear co-occur within comment sections. Consistent with emotional contagion, fear in a video's title, description, and transcript correlated with fear in its comments, and early fear was strongly associated with later fear, confirming the contagion of homogeneous emotions.
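The video-level analysis above amounts to correlating per-video mean toxicity with per-video mean fear. A minimal sketch with a hand-rolled Pearson correlation, using synthetic illustrative numbers rather than the paper's data:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Synthetic per-video mean scores (illustrative only, not the study's data):
mean_toxicity = [0.10, 0.25, 0.40, 0.55, 0.70]
mean_fear     = [0.12, 0.20, 0.43, 0.50, 0.75]

r = pearson(mean_toxicity, mean_fear)  # close to 1: toxicity and fear co-vary
```

In practice one would use a library routine such as `scipy.stats.pearsonr`, and the study additionally controls for other variables, which a raw correlation does not.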
Another key finding was that toxicity in highly liked comments showed a stronger association with fear: such comments were approximately 30% more influential than ordinary comments. "This calls for a re-evaluation of the policies and algorithms that amplify the visibility of liked comments on online platforms. For example, platforms could remove highly toxic comments, or at minimum, place them lower in the display order, regardless of their like count." In addition, the study found that fear in comments was significantly associated with the topics of viruses and children's diseases, which aligns with previous research linking these topics to fear among anti-vaccine groups.
These findings suggest that initial troll comments can evoke negative emotions in viewers, potentially fuelling vaccine hesitancy, and they may have implications for moderation policies on online platforms. Conventionally, work on toxicity in online comments has centred on its direct impact on the target of the message, such as the owner of the video. This study, however, highlights how toxic comments are also significantly associated with the emotions of other viewers. "As commenting on various online content, such as videos, news, and e-commerce, profoundly impacts user experiences, it is imperative for platform providers to consider the wider effects of toxic messages..."
Scientific Reports (2024) 14:5088. https://doi.org/10.1038/s41598-024-54925-w











































