- People who deliberately spread falsehoods about vaccines on social media may deter large numbers from getting vaccinated, costing lives.
- However, opinion is divided on whether lawmakers should criminalize such activities in the same way as, for example, incitement to violence.
- Opponents argue that such a move would be counterproductive and breach individuals’ right to freedom of speech.
Vaccines have saved countless lives since their development. According to the World Health Organization (WHO), between 2010 and 2015 alone, vaccines prevented millions of deaths worldwide.
But despite these past successes, misinformation about the new COVID-19 vaccines is rife on social media. Medical News Today recently debunked 13 myths about them, including that:
- they cannot be safe because researchers developed them so quickly
- they can alter DNA
- they contain “location-tracking microchips”
A recent survey in the United Kingdom found that people who obtain health information from social media sources, such as YouTube, are less willing to be vaccinated against COVID-19.
Despite the harm it causes, many people who spread false or misleading information about vaccines do so inadvertently and with good intentions. Falsehoods spread without intent to deceive are widely referred to as “misinformation.”
By contrast, a vocal minority creates anti-vaccine content that its authors know to be false, which is classed as “disinformation.”
Melinda Mills, a professor of demography and sociology at the University of Oxford in the U.K. and director of the university’s Leverhulme Centre for Demographic Science, acknowledges that false information about vaccines comes in many forms.
The groups who spread it range from “anti-vaccine libertarians” who want to protect civil liberties to “concerned parents and health-conscious people,” Prof. Mills writes.
But, she argues: “The freedom to debate, and to allow the public to raise legitimate vaccine concerns to fill the knowledge void, should not extend to causing malicious harm.”
She concludes that governments should consider criminalizing people who “intentionally hurt others” by spreading false information online.
Prof. Mills is a member of the U.K. government’s Scientific Advisory Group for Emergencies and its subgroup, the Vaccines Science Coordination Group.
She notes that France, Germany, Malaysia, Russia, and Singapore have all passed laws against spreading fake news and health disinformation.
In Germany, for example, social media companies must remove hate speech or fake information within 24 hours or face fines of up to €50 million ($60.4 million).
Social media companies, such as Facebook, argue that, unlike newspapers, TV, and radio, they are platforms rather than publishers and therefore have minimal responsibility to vet posts. Prof. Mills contends that legislation is consequently needed to force companies to self-regulate and police content.
“We need to decide whether social media companies are publishers,” she writes.
Prof. Mills concedes that criminalization can come at a cost. She cites evidence suggesting that social media companies in Germany have become risk-averse, “curtailing freedom of expression and censoring legitimate material.”
In Russia, she adds, criminalization has stifled criticism of the government. She references a report by Amnesty International, which suggests that new laws criminalizing “public dissemination of knowingly false information” in emergencies could further curtail Russians’ right to freedom of expression.
Regardless of whether they criminalize vaccine disinformation, Prof. Mills calls on governments, scientists, and health authorities to improve their public messaging:
“Instead of passive, information-laden official websites, communications need to reach the people on social media platforms — offering content as engaging as their misinformation counterparts and allowing dialogue. This could include more visual information, memes, emotive stories, multiple languages, and involving local community leaders.”
Jonas Sivelä, Ph.D., a senior researcher at the Unit for Infectious Disease Control and Vaccinations at the National Institute for Health and Welfare in Finland, argues that criminalizing anti-vaccine information would be counterproductive.
“There is no denying that the world would be a better place without misinformation or that it would be in the public interest for anti-vaccination misinformation not to exist,” he writes. “But should it be criminalized? No.”
Dr. Sivelä leads the working group on vaccine hesitancy and uptake for European Joint Action on Vaccination, a consortium of 20 European countries.
He writes that freedom of speech underpins all other human rights, without which there would be “oppression, tyranny, and other extrajudicial practices.”
Dr. Sivelä acknowledges that freedom of speech should be limited in certain cases, for example, if a person or organization incites lawless activities and violence.
“But anti-vaccination misinformation is not such a case,” he adds.
He says people should be allowed to voice concerns about vaccines, just as they can about other public health measures.
“If we label these as criminal, there is a genuine risk of suppressing legitimate concerns and questions, expressed without the intent to spread false information deliberately,” he writes.
He believes criminalization would further erode trust in authorities, governments, and the healthcare system:
“Failing to consider or answer people’s worries, and instead suffocating relevant discussion, would only result in an increased lack of confidence in the long run — and an increase in misinformation.”
Dr. Sivelä argues that there are other ways to tackle misinformation, for example, recent efforts by Facebook and Twitter to fact-check and label such posts.