Misinfodemic: When fake news costs lives



The fake news that costs lives – and what to do about it.

Epidemics have to start somewhere. In December 2013, one began under a tree full of bats in West Africa, in a little boy called Emile Ouamouno. When he caught a fever, his pregnant mother Sia nursed him, her three-year-old daughter Philomène close by.

The little boy, who once loved to listen to his father’s radio, died within a few days. And then his mother and sister did, too. The 18-month-old will be forever known to medical history as the “index case” (the first documented patient) in the 2014 West African Ebola epidemic, the most widespread outbreak of this virus in history. It ended up claiming more than 11,000 lives, nearly all of them in Guinea, Liberia and Sierra Leone, although WHO has warned this is an underestimate.

The virus killed so many front-line medical staff that the inability to treat other medical needs caused a savage additional death toll. Untreated malaria, for example, killed an estimated 10,000. 

During the crisis, another virus exploded, complicating local and international efforts to fight Ebola. This one infected millions. Following news of a case in the US, mentions of Ebola on Twitter leapt from about 100 per minute to more than 6,000. Rumour became fact. Ebola was in the air, it was everywhere. Online "experts" flourished, sharing "cures" like homeopathy, coffee, raw onions and saltwater. In the West, the misinformation caused terror. In Africa, it cost lives.

Just like a physical virus, each infected person could infect many more. Some infect so many, they are called “super-spreaders”. And a “misinfodemic” begins with a single “index” post. It turns out that tweets and Facebook posts can spread illness more effectively than a virus-laden doorknob.

A study in the science journal Nature, published in November, analysed 14 million tweets. "Social bots" (fake accounts) play "a disproportionate role in spreading articles from low-credibility sources", the authors conclude. "Around the world," write researchers Nat Gyenes and An Xiao Mina in The Atlantic, "digital health misinformation is having increasingly catastrophic impacts on physical health."

They note that much of today’s global “vaccine hesitancy” can be traced to one “index case” – a single retracted paper. “The lead scientist of the original piece was in the process of filing a patent for an alternative measles vaccine, and he led a campaign to link the competing measles-mumps-rubella [MMR] vaccine to autism. The article he published is now widely recognised to have been the result of fraud. His medical licence has been revoked, but the virus his article produced has continued to infect our information channels.”

Thanks to his misinformation, measles – more contagious than Ebola – is roaring back into unprotected populations. By June 2018, reports The Guardian, 41,000 people had caught the measles virus in Europe, nearly twice the figure for the whole of 2017. Between January and June 2018, 37 of those people died.

Closer to home, conspiracy theories about water fluoridation rage online. A recent story in the Sydney Morning Herald describes the price being paid by those living in areas without fluoride. In an article headlined “Two towns are 48km apart. One has twice as much tooth decay”, writer Julie Power visits the unfluoridated town of Oberon, where children are hospitalised for tooth extractions at a sharply higher rate than those who live in nearby Bathurst, which has fluoride in its water. Researchers from the University of Sydney found the risk of rotten teeth to be “significantly higher” in children without fluoride in their town supply. A dental therapist who works in both Oberon and Bathurst told a community meeting in Oberon of her experiences.

“The difference out here working is that you see twice as much tooth decay in children, twice as many extractions. That is a comparison we can make weekly in Bathurst and Oberon.”

Despite the ready availability of high-quality research online, and vocal first-hand evidence, a young community worker told Power that most locals opposed fluoride. "I've not found anything good [information about fluoride] that you can get from a simple Google search."

Dr Helen Petousis-Harris is a vaccinologist and senior lecturer in the Department of General Practice and Primary Health Care at the University of Auckland. She also sits on the World Health Organisation Global Advisory Committee on Vaccine Safety (GACVS).

She says she is noticing an increasing amount of "fake science" spread online by pseudo-academics, material designed to appear plausible to a non-expert. "This is becoming important, along with the money trail, as the modern anti-vaccine movement is well financed, lawyered up and incentivised."

She says deliberate misinformation campaigns are followed by a rise in infectious diseases such as whooping cough and measles. "There is evidence that the damage [online misinformation can cause] can sometimes not be reversed. I also fear the division between people that these forums cause."

Petousis-Harris would like to see action from the government, not to mention public health authorities. “My frustration burns. The solution to this growing problem requires digitally savvy PR strategies, not well-meaning health professionals,” she told North & South.

She is not alone in calling for the government to wake up to a changing information ecosystem. Dr Paul Ralph, a senior lecturer in computer science at the University of Auckland, recently wrote a piece for The Conversation criticising Facebook for a “pathetic” attempt to deal with the torrent of “fake news” it hosts. What is required, he says, is a filter – “something like a spam filter for lies”.

He suggests taxing Facebook and Twitter to pay for the research. “Government needs to play hardball with the corporations that are damaging our society. Unfortunately, New Zealanders seem reluctant to negotiate effectively.” 

The Atlantic’s Gyenes and Mina make an eloquent case for collaboration between public health pooh-bahs and social influencers. They’d be following in the dancing shoes of the Liberian rapper Shadow, who made a catchy song for YouTube called “Ebola in Town” during the 2014 epidemic. It warned against kissing and touching, was seen by more than 100,000 viewers, and “went viral” in a good way. Without doubt, it saved lives.

“Disease spreads when people cluster in digital spaces,” write Gyenes and Mina. “We know that memes – whether about cute animals or health-related misinformation – spread like viruses: mutating, shifting, and adapting rapidly until one idea finds an optimal form and spreads quickly.

“What we have yet to develop are effective ways to identify, test and vaccinate against these misinfo-memes. Until that happens, we should expect more misinfodemics that [foster] outbreaks of measles, Ebola and tooth decay, where public health practitioners must simultaneously battle the spread of disease and the spread of misinformation.”

Sounds like just the job for a chief technology officer.

Tips for detecting fake science

  1. Who is making the claim? A qualified expert will have a doctoral degree in a relevant field, and be employed in that field. A medical doctor will be trained and practising in the field they are talking about. "Dr" in front of their name isn't enough.
  2. Check the source. Have you heard of the site before?
  3. If you're not sure about a source, check to see if anyone else is carrying the story. Can you find it elsewhere at a source you trust?
  4. Do the graphs make sense? A small range on the y (vertical) axis can turn ho-hum data points into a spike.
  5. If the story mentions incredible new research, it is sometimes amusing to check out the paper the story refers to. You might find the conclusion doesn't match the hyperbolic headline.
  6. Remember, one study does not make a fact. Experimental conclusions are examined during the peer-review process, and debated after publication. Experiments are repeated, using different approaches. A meta-analysis of a large number of studies (including why studies were included or excluded) has much more predictive power than one study.
  7. What do global authorities say? For medical topics, check the World Health Organisation (WHO) or New Zealand Ministry of Health, or websites such as the US National Institutes of Health's nih.gov, or Mayo Clinic's mayoclinic.com.
  8. Bad grammar, a poorly made website, or headlines making big claims aren't good signs. If the headline declares you'll "never believe" the content, you can often take them at their word.

Tips for using social media

  1. Delete your Facebook and Instagram accounts today. You'll be happier in a week.
  2. If you can't bring yourself to delete your accounts, unfollow everyone who isn't a personal friend. Unfollow everyone who talks politics. Read traditional news instead.
  3. When you see fake news on Facebook, report it: facebook.com/help/572838089565953.
  4. When someone you care about shares fake news, send them an instant message about it. DO NOT like, share or leave a comment. All such engagement increases the reach of the fake news.
  5. On Twitter, do not quote-tweet fake news – this actually increases the reach of the fake news, even if you're saying it's fake. Twitter doesn't have an option to report fake news, but RealTwitter.com will redirect you to Twitter with filters to turn off some of Twitter's garbage. Use an adblocker, such as uBlock Origin.

– Tips from Dr Paul Ralph, senior lecturer in computer science at the University of Auckland.

This article was first published in the January 2019 issue of North & South.


