Experts fear that Meta’s decision to remove professional fact-checkers from Facebook will further worsen the radicalisation of so-called boomers in the UK.
Even before what Keir Starmer described as “far-right riots” in Britain last summer, alarm bells were being sounded over concerns that older people are more susceptible to misinformation and radicalisation than younger “digital natives”.
An analysis of hundreds of defendants by the Guardian found that as many as 35% were over 40, making suspects generally older than those charged in the 2011 riots.
But after Mark Zuckerberg announced last week that Meta would replace fact-checkers with a crowdsourced system and recommend more political content, new concerns have been raised about the risk of radicalisation on Facebook, the social media platform of choice for many older people.
“It’s clear that this is a reversal with all sorts of risks,” said Dr Sarah Wilford of De Montfort University, principal investigator of Smidge (Social Media Narratives: Tackling Extremism in Midlife), a pioneering Europe-wide project.
“X may be the model for the crowdsourced ‘community notes’ approach that Meta appears to be taking in place of professional moderators, but X does not operate in small silos or closed groups. It probably would not work the same way on Facebook, where there are many such groups. Discerning the truth may become even more difficult for middle-aged Facebook users who are at risk of being exposed to extremist content.”
The anti-extremism campaign group Hope Not Hate also told the Guardian it was concerned that Zuckerberg’s announcement could pave the way for far-right figures and groups such as Tommy Robinson and Britain First to be allowed back on Facebook.
Britain First proved particularly adept at using the platform before it was banned, amassing 2 million likes, at that stage exceeding those of Labour (1 million) and the Conservatives (650,000).
Young people continue to make up the majority of perpetrators. But even before the riots, debate about the radicalisation of boomers had been sparked by cases such as that of Darren Osborne, who was 48 when he was jailed in 2018 for a deadly terrorist attack on a mosque in Finsbury Park, north London. The judge said he had been “rapidly radicalised” online.
Another man, Andrew Leake, who firebombed a Dover immigration centre in 2022 in what police described as an “extreme rightwing” attack, was 66. He killed himself afterwards, leaving behind an internet history riddled with racism.
When it came to the riots, Hope Not Hate said Facebook was used by the far right in a distinct way compared with other platforms. “Telegram was where the most extreme hatred, and sometimes even conspiracies and plotting, was incited, while X was used to spread that message,” said Joe Mulhall, head of research at the anti-racist group.
“On Facebook at the time, it was common to see groups creating hyper-local, targeted content, with pages popping up around specific events. We have seen anti-immigrant protest Facebook groups play a very important role in organising the targeting of asylum centres.”
The users of such pages often skew older, in line with Facebook’s broad reach. An Ofcom report last year found that social media users overall were most likely to cite Facebook as their main social media site (48%), driven by its huge popularity among older users. It also warned that older people are less likely to recognise fake social media profiles.
Wilford said her research suggested that some older Facebook users are particularly vulnerable, for reasons including a reluctance to fact-check and a tendency to take at face value online content that is packaged to look like traditional news output.
“We’re also talking about an invisible generation of people who sometimes look back at lives that may not have turned out the way they wanted, in their jobs and social situations,” she added. “But when they go online and interact with other Facebook users, they find an echo chamber that makes them feel good. They are listened to and find validation.”
The issue of misinformation permeating Facebook groups dedicated to everyday life has drawn a response from some councils, which are putting resources into training the lay members who moderate local community groups.
But shocking political events in the UK in recent years have transformed the Facebook experience for many people whose first, innocent interactions might have been sharing photos with family or posting neighbourhood news.
Brexit, Trump’s 2016 victory and the coronavirus pandemic are all said to have served as catalysts for engagement, via Facebook, with more extreme forms of rightwing politics.
“Facebook is a key site where people algorithmically encounter these harmful ideas in their daily social media usage habits. Meta should be doing more, not less, to combat this harm,” she said.
“Zuckerberg’s comments and Meta’s new position on this issue will only reinforce a misplaced sense of victimhood among people who hold anti-progressive views, which research shows can lead to radicalisation.”
Asked to comment on the concerns about misinformation and extremism, Meta referred to Zuckerberg’s blog post, which said its “complex systems” for moderating content had “overreached”.