Misinformation about cancer is incredibly common online. Justin Lambert/Getty Images
  • A study investigated the accuracy of cancer treatment information on social media and its potential for harm.
  • The results suggest that a third of popular cancer articles on social media contain misinformation, and that most of these articles also carry potentially harmful information.
  • The researchers hope their findings will aid the development of tools and behavioral interventions to counter cancer misinformation online.

According to a recent survey by the Pew Research Center, 72% of adults in the United States use social media. Another survey suggests that 73% of people in the U.S. obtain health-related information from the internet.

Research also shows that false news spreads more readily than fact-checked news, particularly on social media.

A study of social media claims about drugs and dietary supplements circulated on WhatsApp found that 86.4% of them were either false (27.3%) or “potentially misleading” (59.1%), with the potentially misleading claims being shared the most.

The spread of health misinformation can hinder the delivery of evidence-based medicine and negatively affect patient-doctor relationships. It has also been linked to an increased risk of death.

Some experts suspect that social media might influence patient decisions for cancer treatment. However, there is little data on the quality of cancer treatment information available on social media.

Recently, scientists from the University of Utah in Salt Lake City led a cross-country collaboration to examine the accuracy of cancer treatment information on social media and its potential for harm.

They found that a third of the most popular cancer articles on social media contain misinformation, with the majority of these articles carrying harmful information.

“When conducting the study, I was unsure of what to expect,” Skylar Johnson, assistant professor in the Department of Radiation Oncology at the University of Utah’s Huntsman Cancer Institute, and lead author of the study, told Medical News Today.

“My fears were confirmed when the data suggested that many articles shared on social media contain misinformation and harm. The surprising finding was that this type of information was more likely to receive increased online engagement when compared to factual and safe information,” he noted.

“It is essential that we address misinformation from multiple areas, including on social media, with patients, and with providers. It is our hope that this information could be used to help inform future social media health policy surrounding health information on social media and the need to amplify high quality, accurate and safe information,” he added.

The researchers have published their findings in the Journal of the National Cancer Institute.

The scientists used BuzzSumo, a web-scraping tool, to gather the 50 most popular English-language articles for each of the four most common cancers: breast, prostate, colorectal, and lung.

They included articles and blog posts shared on Facebook, Reddit, Twitter, and Pinterest between January 2018 and December 2019. Of the 200 pieces the software collected:

  • 75 came from traditional news outlets
  • 83 came from nontraditional digital outlets
  • two came from personal blogs
  • six came from crowd-funding sites
  • 34 came from medical journals

The researchers selected two National Comprehensive Cancer Network panel members to rate the articles for misinformation and harmful information. The panel members also included descriptions of the reasons behind their ratings.

After performing statistical analysis of the data, the researchers found that 32.5% of the articles contained misinformation. This mostly came from misleading titles, misuse of evidence, and unproven therapies.

They also found that 30.5% of articles contained harmful information. These articles mostly urged people to delay or not seek out medical attention for curable conditions, pay for expensive therapies, self-medicate with potentially toxic substances, or use alternative therapies that could interact adversely with other treatments.

The scientists also discovered that articles with harmful information received an average of 2,300 shares, while safe articles received 1,500. And while Facebook, Reddit, and Twitter engagements were associated with misinformation and harm, Pinterest engagements were associated with neither.

The study did not evaluate why misinformation exists or why people share it more readily. However, other research has found that sharing misinformation often stems from a failure to pay attention and verify information, rather than from a deliberate desire to spread falsehoods.

Other studies have found that people may share misinformation as it aligns with deeply held false beliefs.

A study focusing on health misinformation found that people who distrust the healthcare system and have a favorable view of alternative treatments are more likely to believe health-related misinformation.

Johnson added that some people may be drawn to “too good to be true” treatments because they offer vulnerable individuals a measure of hope. People may also feel privileged to have access to information that is not mainstream. All of these factors, he says, may then be amplified by social media algorithms.

The researchers conclude that their findings could help lay the foundation for patient-specific tools and behavioral interventions to counter cancer misinformation online.

However, they wrote that their results are limited, as they only examined articles in English. They added that data from BuzzSumo may not completely match that from social media platforms and that their data lacks important qualitative information.

They note that researchers need to conduct further studies to understand who engages with cancer misinformation, its impact on scientific belief, trust, and decision-making, and the role of doctor-patient communication in correcting misinformation.

The team is also in the process of creating a database to identify article-specific features linked to misinformation.

“This study demonstrates why it is important for people to refer to information produced by reputable organizations, or speak to their doctor when looking for information about cancer,” said Martin Ledwick, Cancer Research UK’s head cancer information nurse, who was not involved in the study, in a recent interview with MNT.

“At Cancer Research UK, our online information in the About Cancer Section of our website is written by cancer nurses and reviewed by experts. Our online community Cancer Chat is fully moderated to keep it safe and to identify and remove any misinformation that might be posted on it,” he added.

“This was a rigorous, well-designed study to assess the quality of cancer treatment information on popular social media platforms,” Dr. Deborah Doroshow, Ph.D., assistant professor of medicine at the Icahn School of Medicine at Mount Sinai, NY, who was not involved in the research, told MNT. She continued:

“These results suggest that social media companies may be promoting the spread of potentially harmful misinformation; I would argue that they have a duty to identify such sources when they are presented on their platforms because our patients’ lives are at stake.”

“The results, sadly, are not a surprise to those of us working within this field,” Dr. Joel Newman, a consultant hematologist at East Sussex Healthcare NHS Trust in the United Kingdom, who was not involved in the study, told MNT.

“Our patients naturally have questions and wish to research ways to make themselves better or improve their chances with their cancer treatment. However, ‘research’ these days involves a quick Google search or a trawl through social media, neither of which constitutes scientific research.”

“The best way to gain information on diagnosis and treatment is to talk to your medical professionals — doctors and specialist nurses, or through reputable cancer charities who are able to provide independent and reliable information and support.”