Lawyers in both the US and UK sued Meta Platforms Inc (the new name for Facebook) on behalf of Rohingya refugees on Monday 6 December, alleging that the company failed to prevent its social media platform from being used to spread hate speech and incite violence against the Muslim minority in Myanmar, who became targets of ethnic cleansing amounting to genocide.

Earlier this year the Australian Muslim Advocacy Network (AMAN) lodged a complaint against Facebook through the Australian Human Rights Commission, alleging that the firm breached discrimination and hate speech provisions of the Racial Discrimination Act. Facebook has also been accused of bias against Palestinians, having established a special operations unit to censor, filter, block and remove content that exposed the human rights abuses and ethnic cleansing of Palestinians.

The class-action complaint against Facebook, filed in California on Monday 6 December, accuses the firm of failing to remove hateful content and propaganda, arguing that its platform’s design contributed to brutal violence against the Rohingya community.

Facebook in August 2018 acknowledged that its platform was used to “foment division and incite offline violence” on a wide scale and began deleting and banning accounts of key individuals and organisations in Myanmar.

Myanmar’s military launched a brutal campaign in 2017 to push Rohingya Muslims out of Rakhine state, driving out almost a million men, women and children through rape, murder and the razing of villages, in what the UN called a “textbook example of ethnic cleansing.”

Myanmar’s nationalist monks and top government officials posted and recirculated propaganda against the Rohingya, spreading falsehoods and doctored images suggesting that some Rohingya burned their own villages and then blamed Myanmar security forces.

Under US law, Facebook is largely protected from liability over content posted by its users. But the new lawsuit argues the law of Myanmar – which has no such protections – should prevail in the case.

British lawyers also submitted a letter of notice to Facebook’s London office soon after the US action, alleging that:

  • Facebook’s algorithms “amplified hate speech against the Rohingya people”
  • The firm “failed to invest” in moderators and fact checkers who knew about the political situation in Myanmar
  • The company failed to take down posts or delete accounts that incited violence against Rohingya
  • It failed to “take appropriate and timely action”, despite warnings from charities and the media

The case cites the Facebook whistleblower Frances Haugen, who leaked a number of internal documents earlier this year indicating that the company does not police abusive content in countries where such speech is likely to cause the most harm.