As misinformation continues to rise in Bangladesh, Facebook’s efforts have done little to stem the mounting problems due to inadequate moderation, language barriers and algorithmic limitations.
Illustration: TBS
The proliferation of disinformation about Bangladesh has reached alarming levels, prompting Chief Adviser Dr. Muhammad Yunus to meet with Facebook's parent company Meta earlier this month and urge the tech giant to help the country thwart such campaigns.
At the meeting, Meta's Director of Human Rights Policy, Miranda Sissons, assured Dr. Yunus that the company remains vigilant against the use of its platform to spread disinformation.
But whether this “vigilance” is working, or could work, is a difficult question to answer.
Independent fact checkers and the role of Facebook
Misinformation on Facebook is not new, but disinformation campaigns, the deliberate use of misleading information to achieve a specific goal, are increasingly used to influence public perception around the world, undermining elections and inflaming already volatile political climates.
"The way misinformation spreads through Facebook has changed considerably in recent years. In Bangladesh, the interplay of false religious and political claims, along with pseudoscience and conspiracy theories, such as claims of miracles, was initially seen as the main driver of misinformation on social media," said FactWatch founder Sumon Rahman.
Facebook cannot monitor or control misinformation in every country, so it works with local fact-checkers. In Bangladesh, Sumon Rahman's FactWatch became the first third-party fact-checking organization to partner with Facebook.
"Facebook uses third-party fact checkers to detect misinformation. They are rigorous in their vetting process. They vet all institutions and provide guidelines. We only work with organizations certified by the International Fact-Checking Network (IFCN)," Sumon explained.
The way it works is that certified, independent fact-checkers familiar with the local language and culture are given the power to flag posts and comments they deem to be misinformation.
"Facebook does not manually oversee its fact checkers. They have the power to label something as misinformation. Once labelled, its circulation decreases and it gets less attention," Sumon added.
He clarified that Facebook can become involved if a fact checker's verdict is challenged, and that if a fact checker makes repeated mistakes, Facebook can stop working with them.
However, while fact-checking is left to local experts, problems arise from Facebook's moderation policies. Its reliance on AI tools that lack a proper understanding of Bangla means the nuances of the language are lost.
"Misinformation presented in a false context often evades detection. To improve moderation, I think policies need to be updated more frequently and human intervention needs to be increased," argued Minhaj Aman, Research Lead at Dismislab.
"Another major hurdle is that Facebook has difficulty recognizing repeat offences. If a post previously identified as misinformation resurfaces, Facebook doesn't properly recognize it," Minhaj added.
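One way a platform could catch resurfacing misinformation, sketched here purely as an illustration rather than as Meta's actual system, is to keep a fingerprint index of content already flagged by fact-checkers and match new posts against it. Real systems use far more robust similarity and media matching; this minimal version only survives trivial edits to the text:

```python
import hashlib
import re

def fingerprint(text: str) -> str:
    """Normalize a post's text and return a stable fingerprint.

    Lowercasing and stripping punctuation/extra whitespace means
    trivial edits no longer defeat an exact-match lookup.
    """
    normalized = re.sub(r"[^\w\s]", "", text.lower())
    normalized = re.sub(r"\s+", " ", normalized).strip()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Index of fingerprints for posts already flagged by fact-checkers
flagged_index: set[str] = set()

def flag_post(text: str) -> None:
    flagged_index.add(fingerprint(text))

def is_resurfaced(text: str) -> bool:
    """True if this post matches previously flagged misinformation."""
    return fingerprint(text) in flagged_index

flag_post("BREAKING: miracle cure discovered!!!")
print(is_resurfaced("breaking   miracle cure discovered"))  # True
print(is_resurfaced("An unrelated post"))                   # False
```

An exact-hash lookup like this is cheap at scale, which is why near-duplicate misinformation slipping past it, as Minhaj describes, usually points to the harder problem of matching paraphrased or re-edited content.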
How can Facebook improve?
According to Sumon, Meta as a corporate entity is unlikely to go beyond its algorithm and step in to address individual countries' concerns, despite the request from Chief Adviser Yunus.
However, Sumon pointed out that Dr. Yunus spoke with Meta's human rights and public policy team, which is separate from its fact-checking team. "In extreme cases, Facebook can and does delete incredibly harmful disinformation, though this is rare. Facebook also keeps statistics on how many claims it holds against various governments, so perhaps some agreement could be reached there."
Separately, there are many ways Facebook could better combat misinformation. Minhaj thinks improving its algorithms could be one of them. "When people are exposed to fake news, Facebook often suggests similar content, making the problem worse. Fixing the algorithm would solve this problem."
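The fix Minhaj describes amounts to changing how the feed ranks flagged content. As a hedged, toy-scale sketch (the demotion factor and `Post` structure are hypothetical, not Meta's implementation), a ranking function can multiply the score of fact-checked content by a penalty so it sinks in the feed instead of being amplified:

```python
from dataclasses import dataclass

@dataclass
class Post:
    engagement_score: float  # predicted engagement (likes, shares, ...)
    flagged: bool            # labelled as misinformation by fact-checkers

# Hypothetical demotion factor; a real platform would tune this empirically.
DEMOTION = 0.2

def ranking_score(post: Post) -> float:
    """Feed-ranking score: flagged content is demoted rather than removed."""
    return post.engagement_score * (DEMOTION if post.flagged else 1.0)

# A highly engaging flagged post now ranks below an ordinary one
feed = [Post(10.0, True), Post(4.0, False)]
feed.sort(key=ranking_score, reverse=True)
print([p.engagement_score for p in feed])  # [4.0, 10.0]
```

The design point is that misinformation often scores highest on raw engagement, so a ranking system that optimizes engagement alone will recommend more of it; the demotion term is what breaks that loop.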
He was also critical of Facebook for allowing deepfake videos on its platform, claiming the company even accepts sponsored deepfake videos as a source of revenue, in violation of its own disinformation policy. He believes this needs to be stopped immediately.
Finally, taking a tougher stance against repeat offenders could also alleviate the problem. "Creating a list of repeat disinformation spreaders and issuing warnings about them may be a good option, but it must be done in a way that avoids censorship."
Disinformation today
In August 2024, Dismislab published a report finding that a network of over 1,300 Facebook bot accounts had been systematically used to sway public opinion in favor of the party during the Awami League government.
Dismislab says the bot network made more than 21,000 comments on numerous Facebook posts over the years and was most active just before elections. The bots mainly criticized opposition parties and posted comments using specific keywords such as EC (Election Commission).
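Coordinated commenting of this kind leaves a statistical footprint that researchers can detect. The following is a minimal illustrative sketch, not Dismislab's or Meta's actual method: group comments by normalized text and flag any message posted verbatim by several distinct accounts as a possible sign of coordination:

```python
from collections import defaultdict

# (account_id, comment_text) pairs, e.g. collected from public posts;
# the account names and texts here are invented for illustration
comments = [
    ("bot_1", "EC is doing a great job"),
    ("bot_2", "EC is doing a great job"),
    ("bot_3", "EC is doing a great job"),
    ("user_9", "I disagree with this decision"),
]

# Group accounts by identical (normalized) comment text
by_text: defaultdict[str, set[str]] = defaultdict(set)
for account, text in comments:
    by_text[text.lower().strip()].add(account)

# A message posted verbatim by 3+ distinct accounts is a candidate
# signal of coordinated behavior, to be reviewed by a human analyst
suspicious = {t: accs for t, accs in by_text.items() if len(accs) >= 3}
print(sorted(suspicious))  # ['ec is doing a great job']
```

Real investigations combine many such signals, such as posting times, account creation dates, and shared keywords like "EC", since any single heuristic produces false positives on genuinely popular slogans.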
This type of bot activity falls under what Meta calls Coordinated Inauthentic Behavior (CIB), and Meta has removed such accounts on multiple occasions.
But since the fall of the Awami League government and Hasina’s subsequent flight to India, disinformation operations have reached new heights and become increasingly difficult to counter.
Exaggerated reports of gang violence and attacks, along with outright fabricated stories, are at their peak, with Indian media playing a major role in fanning the flames. "Since August 5, there has been an alarming increase in political and religious misinformation, with social media platforms and news outlets in India playing a key role," Minhaj said.
Minhaj said these media outlets are trying to advance their country's foreign policy narrative at the expense of diplomatic relations with Bangladesh.
An analysis of various fact-checking reports targeting Bangladesh found that 50% of all misinformation recorded over the past 11 months occurred between July and now.
Additionally, the main themes of misinformation before the July Uprising were religious glorification and proselytization, which have now declined, making room for an increase in misinformation about communal violence and hatred.
According to the analysis, the dominant theme has now shifted to portraying Bangladesh as a dangerous country for ethnic minorities, especially Hindus.