Facebook said it has removed more than three dozen pages spreading misinformation about COVID-19 vaccines, after the White House called on social media firms to tighten controls on pandemic-related content shared on their platforms.
Companies including YouTube, Twitter and Google have come under fire from the Biden administration over the alarming spread of vaccine misinformation, which is slowing the pace of inoculation in a country where many people are reluctant to be vaccinated.
A recent report from the Center for Countering Digital Hate (CCDH) found that just 12 anti-vaccine accounts were responsible for nearly two-thirds of anti-vaccine misinformation online.
Facebook disputed the methodology behind the report, but said on Wednesday it had removed more than three dozen pages, groups and Facebook or Instagram accounts linked to these 12 people for violating its policies.
“We have also imposed penalties on nearly two dozen additional Pages, groups or accounts linked to these 12 people,” Facebook said in a blogpost titled “How We’re Taking Action Against Vaccine Misinformation Superspreaders”.
Among the main pieces of vaccine misinformation the Biden administration is fighting are false claims that the COVID-19 vaccines are ineffective, that they contain microchips and that they harm women’s fertility, a White House official said last month.
Initial reporting via Thomson Reuters. Reporting by Eva Mathews in Bengaluru. Editing by Devika Syamnath.