Facebook develops plan to regulate vaccine misinformation on its platform

Anthony Quintano | Wikimedia Commons

Rachel Dalloo

Months after Facebook Inc. released updated policies to battle the spread of misinformation during last year’s election, the tech giant is now taking another step to stop the spread of misinformation about COVID-19 and vaccines.

In its effort to curb misinformation on its platform, Facebook announced in a blog post that it will remove any posts that push false claims and narratives about vaccines.

“Health officials and health authorities are in the early stages of trying to vaccinate the world against COVID-19, and experts agree that rolling this out successfully is going to be helping build confidence in vaccines,” Facebook’s Head of Health Kang-Xing Jin said.

A spokesperson for the social media company told Vox’s Recode that it will enforce its new policies and remove false content “regardless of whether it’s already been posted or is posted in the future.”

Over the past year, social media platforms including Instagram, Facebook and Twitter have served as gateways for the spread of false information and conspiracy theories.

“If there’s misinformation out there, it’s very important that if it starts to gain traction, it’s debunked in a very timely way,” Julie Leask, professor of medicine at the University of Sydney and among Australia’s leading experts on vaccine hesitancy, said.

“If news organizations and the voices of science and advocacy for vaccination can’t access audiences to debunk misinformation, then other voices will fill that gap.”

Now these platforms are gearing up to battle and remove claims about the coronavirus and the vaccines that have been discredited by public health experts.

“You can do these takedowns, but that hasn’t necessarily stopped the flow of misinformation, and we can’t forget about the long tail of misinformation,” Rory Smith, research manager at First Draft, said. “There are all of these hundreds of thousands or millions of posts that might not get that many interactions but collectively make up a lot of misinformation.”

In recent months, Facebook has repeatedly faced criticism for allowing false information to remain on its platform for long periods without moderation.

In December 2020, the platform allowed a conspiracy theory video to go viral. Even after the video received widespread attention, Facebook failed to remove the pages affiliated with the anti-vaccine activist who posted it, and who had continued to create new accounts after being banned on multiple occasions, according to the Guardian.

Aside from vaccine and COVID-19 misinformation, Facebook has also allowed election misinformation to make its way onto its platform.

In 2016, Facebook launched its first campaign against misinformation after it was found that Russia had interfered in that year’s U.S. election. The company has since joined forces with major fact-checking organizations to crack down on misleading posts.

Leading up to the 2020 general election, Facebook continued to flag posts and comments containing falsehoods, particularly those from then-President Donald Trump, according to CBS News.

Disinformation has long sown division across the political spectrum. Critics have argued that the moves made by Facebook, Twitter and YouTube were too little, too late, according to The New York Times.

“For four years you’ve rationalized this terror. Inciting violent treason is not a free speech exercise,” tech investor Chris Sacca wrote to Jack Dorsey and Mark Zuckerberg. “If you work at those companies, it’s on you too. Shut it down.”

Tech companies do have the power to stop the spread of misinformation and protect the credibility of elections, even if it means holding the most powerful in the land accountable for their actions.