Our investigation has uncovered that Facebook still fails to detect and root out hate speech inciting violence and genocide against the Rohingya – a Muslim minority group in Myanmar – despite its commitments to better detect Burmese language hate speech.
We submitted eight explicit and violent ads containing real examples of Burmese language hate speech against Rohingya – all of which fall under Facebook’s criteria for hate speech – and Facebook approved all eight ads for publishing. This is especially alarming given that Facebook admitted in 2018 that it played a role in inciting violence during the genocidal campaign against the Rohingya, which saw thousands killed and nearly 900,000 displaced, while villages were burned to the ground and women and girls were brutally raped.
“It is unacceptable that Facebook continues to be shockingly poor at detecting Burmese language hate speech. If they still can’t do this in Myanmar after five years of supposed efforts, what are the chances that their own voluntary efforts will be enough to avoid contributing to atrocities in Ukraine and other conflict zones?” said Ava Lee, Digital Threats to Democracy Campaign Lead at Global Witness.
After Facebook approved the ads, we removed them before they could be published. The ads were highly offensive and disturbing – filled with dehumanising language and direct calls for killings – so we will not reproduce them here. All of the ads breach Facebook’s own guidelines, and most of them would have broken international law had they been published.
Facebook failed to respond to our request for comment on our investigation.*
This investigation comes at a time when Facebook is facing legal action in the US and UK accusing the company of negligence that facilitated the Rohingya genocide, with victims seeking more than $150 billion in compensation.
Additionally, a group of Rohingya youth living in refugee camps in Bangladesh have submitted a complaint against Facebook under the OECD Guidelines in Ireland, demanding justice for Facebook’s role in facilitating genocide against their community and seeking funding from Facebook for their education. Frances Haugen, whistleblower and former data scientist at Facebook, has called for Ireland to take action on the OECD complaint.
“Our investigation adds to the evidence that Facebook is unable to regulate itself. We call upon governments, courts and regulators to step in and hold Facebook to account for its role in facilitating human rights abuses,” said Lee.
* Update 28 March 2022: In response to the Associated Press, Facebook said that it has invested in Burmese language technology and built a team of Burmese speakers whose work is informed by feedback from experts, civil society organizations and the collective output of the UN Fact-Finding Mission on Myanmar. Our point stands, however: despite these improvements, Facebook approved ads containing hate speech, showing that its current systems are still not good enough.