In the United States, some Facebook (FB) users are receiving a prompt asking whether they are concerned that someone they know is becoming radicalized. Others are being alerted that they may have been exposed to extremist content.
It’s all part of a test that Facebook is conducting as part of its Redirect Initiative, which tries to tackle violent extremism.
The experiment, Facebook says, is part of a wider project to provide tools and support to people on the platform who have interacted with or been exposed to extremist content, or who know someone who has.
“Are you concerned that someone you know is becoming an extremist?” one of the notifications reads, a screen grab of which went viral on social media Thursday.
According to a screenshot shared on social media, the alert states, “We care about avoiding extremism on Facebook. Others in your situation have gotten discreet help.” The alert then directs the user to a support page.

Another alert reads, “Violent groups try to use your anger and disillusionment. You have the ability to protect yourself and others right now.” This notice likewise sends the user to a help page.
Facebook is sending users to a range of resources, including Life After Hate, an advocacy group that helps people quit violent far-right movements.
Over the last few years, Facebook has been blasted by critics for failing to take sufficient steps to combat extremist content on its platform. In 2020, for example, the company was chastised for failing to take down a militia group’s Facebook page that encouraged armed residents to take to the streets of Kenosha, Wisconsin.
The company has also committed to doing a better job of preventing the spread of false information and conspiracy theories. In May, Facebook’s independent oversight board urged the company to examine the role its platform played in the January 6 uprising.