
Facebook asks: Are your friends becoming extremists?


Facebook Inc is starting to warn some users they might have seen “extremist content” on the social media site, the company has said.

Screenshots shared on Twitter showed a notice asking “Are you concerned that someone you know is becoming an extremist?” and another that alerted users “you may have been exposed to harmful extremist content recently.” Both included links to “get support.”

The world’s largest social media network has long been under pressure from lawmakers and civil rights groups to combat extremism on its platforms, including US domestic movements involved in the Jan. 6 Capitol riot, when groups supporting former President Donald Trump tried to stop the US Congress from certifying Joe Biden’s victory in the November election.

Facebook said the small test, which is only on its main platform, was running in the United States as a pilot for a global approach to prevent radicalization on the site.

“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” said a Facebook spokesperson in an emailed statement.

“We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”

It said the efforts were part of its commitment to the Christchurch Call to Action, a campaign involving major tech platforms to counter violent extremist content online that was launched following a 2019 attack in New Zealand that was live-streamed on Facebook.


Facebook said that in the test it was identifying both users who may have been exposed to rule-breaking extremist content and users who had previously been subject to its enforcement action.

The company, which has tightened its rules against violent and hate groups in recent years, said it proactively removes some content and accounts that violate its rules before the material is seen by users, but that other content may be viewed before enforcement action is taken.

Critics of the company will argue that the left-leaning platform giant is now flexing its muscles to control debate and the public square. However, the company is in a strategically difficult position: it is damned if it does and damned if it doesn’t.

The team at Platform Executive hope you have enjoyed the ‘Facebook asks: Are your friends becoming extremists?’ article. Automatic translation from English to a growing list of languages via Google AI Cloud Translation. Initial reporting via our official content partners at Thomson Reuters. Reporting by Elizabeth Culliford in New York. Editing by Kenneth Li and David Gregorio. Comment by Rob Phillips.

You can stay on top of all the latest developments across the platform economy, find solutions to your key challenges and gain access to our problem-solving toolkit and proprietary databases by becoming a member of our growing community. Platform Executive has two membership tiers, Community (FREE) and Premium ($195 per year), which offer different levels of access to our products and services. What are you waiting for?
