
Characteristics of Antivaccine Messages on Social Media: Systematic Review

Affiliation

University of Warsaw

Summary

"Considering the activity of the antivaccination movement's supporters on social media and how easily they can communicate their messages that are not scientifically confirmed to a large number of recipients, it is crucial to learn and understand their activities and messages."

The spread of negative information about vaccination on the internet and social media is considered to be the leading cause of vaccine hesitancy. This study aims to gather, assess, and synthesise evidence regarding the current state of knowledge about antivaccine social media users' web-based activities.

The researchers conducted a scoping literature search of papers published between January 1, 2015, and December 31, 2019. In the end, 18 articles were included in the qualitative synthesis. Most of the studies were based on Twitter data, even though Facebook, YouTube, and Instagram have many more active users. This disproportionate attention may reflect the relative ease of gathering data from Twitter.

The review found that the number of articles analysing antivaccination messages on social media has increased over the last 5 years. The studies dealt with the popularity of provaccination and antivaccination content, the style and manner in which messages about vaccines were formulated for the users, a range of topics concerning vaccines (harmful action, limited freedom of choice, and conspiracy theories), and the role and activity of bots in the dissemination of these messages in social media.

Selected findings:

  • Regardless of the social media platform, antivaccine content shares similar characteristics. Most of the authors found that vaccine-related messages with negative sentiments received more positive reactions on social media (likes, shares, and retweets). This relationship was particularly evident on YouTube and Instagram, whereas the results from studies on Twitter and Facebook were inconclusive. The high number of likes and shares that antivaccine content attracts poses the danger that ordinary users will find this information more easily and consider it more reliable than provaccine messages.
  • Antivaccine users craft their messages in a user-friendly manner, publishing emotional personal stories in direct language. Research shows that emotional stories attract the attention of neutral users, and that stirring up fear of vaccination leads audiences to inaction.
  • Antivaccine messages often contain conspiracy theories. Evidence shows that clarifying parental concerns and involving parents in decisions regarding their child's vaccination can reduce beliefs in conspiracies.
  • Bots on social media spread not only antivaccine messages but also provaccine messages. Benign bots respond automatically, aggregate content, and perform other useful actions. However, malicious bots are designed to manipulate, mislead, and exploit to influence social media discourse. "Public health authorities should not only monitor social media, detect negative bots, and fight the spread of the antivaccine content, but they should also use benign bots to communicate with the public and dispel doubts about vaccinations."
  • Web-based platforms differ in how easily antivaccine content spreads through them. Research shows that, since 2016, interactions with content containing misinformation have declined on Facebook but have continued to increase on Twitter. This suggests that misinformation may become a bigger problem on Twitter than on Facebook. Furthermore: "YouTube facilitates the spread of misinformation to millions of viewers."
  • In the papers studied, the human papillomavirus (HPV) vaccine was the second most common topic, after the topic of vaccines in general. This topic is popular in the discourse on antivaccine movements due to some specific features that make it vulnerable to theories that discourage vaccinations.

Future research directions are proposed, including a multilingual comparative study to explore the similarities and differences in vaccine-related discourse on social media across countries.

In conclusion: "Public health authorities should continuously monitor social media to find new antivaccine arguments quickly and, based on that, design information campaigns targeting health professionals and ordinary users who are at a risk of being misinformed. Social media platforms have a big responsibility because they give millions of users access to misinformation. Knowledge of the characteristics of antivaccine content can help in the creation of tools that automatically tag false information. A positive trend in recent years is that social media platforms have attempted to stop the spread of vaccination misinformation."

Source

Journal of Medical Internet Research 2021;23(6):e24564. doi: 10.2196/24564. Image credit: Freepik