Facebook said it will remove false claims about COVID-19 vaccines that have been debunked by public health experts, following a similar announcement by YouTube in October.
The move expands Facebook’s current rules against falsehoods and conspiracy theories about the pandemic. The company says it removes coronavirus misinformation that poses a risk of “imminent” harm, and labels and reduces the distribution of other false claims that do not meet that threshold.
Facebook said in a blog post that the global policy change came in response to news that COVID-19 vaccines will soon be rolling out around the world.
Two drug companies, Pfizer and Moderna, have asked US authorities for emergency use authorization of their vaccine candidates. The UK approved Pfizer’s vaccine on Wednesday, leaping ahead of the rest of the world in the race to begin the most crucial mass inoculation program in history.
Misinformation about coronavirus vaccines has proliferated on social media during the pandemic, including through viral anti-vaccine posts shared across multiple platforms and by different ideological groups, researchers say.
A November report by the non-profit First Draft found that 84 percent of the interactions generated by the vaccine-related conspiracy content it studied came from Facebook pages and Facebook-owned Instagram.
“This could include false claims about the safety, efficacy, ingredients or side effects of the vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips,” the company said in the blog post. It said it would update the list of claims it removes based on evolving guidance from public health authorities.
The company has removed misinformation about other vaccines under its policy of deleting content that risks imminent harm. It previously removed vaccine misinformation in Samoa, where a measles outbreak killed dozens of people late last year, and took down false claims about a polio vaccine drive in Pakistan that were contributing to violence against health workers.
Facebook, which has taken steps to surface authoritative information about vaccines, said in October it would also ban ads that discourage people from getting vaccines. It recently removed a prominent anti-vaccine page and a large private group, one for repeatedly violating COVID-19 misinformation rules and the other for promoting the QAnon conspiracy theory.
Initial reporting via Thomson Reuters. Reporting by Katie Paul and Elizabeth Culliford. Editing by Nick Zieminski.