Facebook is now testing an alert that asks users whether they think their friends are turning into extremists. The company confirmed the test after social media users started posting screenshots of such alerts on Twitter.
Facebook, Twitter, and Google have long been under pressure to remove extremist content from their platforms to prevent violence from spilling into the real world, and that pressure intensified this year amid growing scrutiny of the role their platforms played in mobilizing the violent riot at the US Capitol in January.
Read more: Facebook inaugurates financial education initiative for women-led businesses in Pakistan
Facebook is now signposting help if you think a friend is becoming an extremist ?
h/t @disclosetv pic.twitter.com/7L5B0UORzj
— Matt Navarra (@MattNavarra) July 1, 2021
This pilot program is part of Facebook’s Redirect Initiative, which aims to combat violent extremism on the platform by redirecting people who search for hate- or violence-related terms toward educational resources and outreach groups.
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” stated a Facebook spokesperson in a statement. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”
Read more: Facebook rolls out standalone newsletter platform ‘Bulletin’
Facebook CEO Mark Zuckerberg has been repeatedly questioned at US congressional hearings about the company’s efforts to counter extremism on its platforms, particularly after the January 6 riot, when supporters of former president Donald Trump stormed the Capitol and attempted to stop Congress from certifying Joe Biden’s victory in the 2020 presidential election.
Facebook has tightened its rules against violent and hostile groups in recent years, and the company has also confirmed that it proactively removes some content and accounts that violate its policies before users see them.
Source: Business Insider