Meta and algorithms of violence

Friday was not only the sixth anniversary of the Rohingya's darkest day. It also marked the coming into force of key provisions of the Digital Services Act (DSA), the EU's landmark new legislation governing the Big Tech industry. The law places significant constraints on Big Tech, including minimum safety standards for algorithmic recommender systems. If properly enforced, it has the potential to prevent any recurrence of what happened to the Rohingya.

I have absolutely no doubt that Facebook's dangerous algorithms -- hard-wired to drive "engagement" and corporate profits at all costs -- actively fanned the flames of hate, ultimately contributing to mass violence and the forced displacement of most of Myanmar's Rohingya population into neighbouring Bangladesh six years ago.

In the years and months leading up to these atrocities in 2017, Facebook became an echo chamber of hate and incitement targeting the long-persecuted minority group. And this happened in a context where "Facebook [was] the Internet", according to a UN investigation.

What's more, The Facebook Papers, leaked by whistleblower Frances Haugen in 2021, exposed the inner workings of the company, one stunning revelation after another. The leaks made clear that Meta had long known its algorithms were disproportionately spreading hate and disinformation, and that its business model was fuelling serious real-world harms, particularly in communities affected by conflict.

Even when presented with this information, the company evidently maintained its 'business as usual' approach. The leaks also revealed that Meta's narrative about its supposedly passive role in Myanmar did not hold true. This realisation prompted us at Amnesty International to launch an investigation into the company's role in the ethnic cleansing of the Rohingya.

Last year, Amnesty published the findings of this investigation. It revealed that Meta had dramatically underplayed the true nature and extent of its contribution to the suffering of the Rohingya: far from being a neutral actor faced with an unprecedented crisis, Meta was an active contributor to the horrors the Rohingya faced.

We can now authoritatively conclude that the algorithms powering the Facebook platform spread hate and violence like wildfire, proactively pushing content that incited violence and disproportionately amplifying the most inflammatory material in the lead-up to the horrors of 2017.

Meanwhile, as its algorithms fanned the flames of hate, Meta staff ignored repeated warnings from human rights activists, academics, and other experts. Between 2012 and 2017, senior Meta staff received at least 15 direct warnings stating that the Facebook platform risked contributing to an outbreak of mass violence against the Rohingya.

There is little doubt: Meta contributed to serious human rights violations and, therefore, has a responsibility under international human rights law to provide reparations to the Rohingya.

We are calling for the Rohingya to be compensated and for Meta to take steps to ensure this never happens again by altering its business model.

We presented our findings to Meta, and tens of thousands of people joined our campaign. Yet, so far, little has changed: Meta's business model remains hard-wired for engagement above all else.

Meta, one of the wealthiest companies on the planet, has even refused the community's modest request for $1 million (35 million baht) in partial reparations: an education fund for displaced Rohingya youth struggling to realise their potential in the sprawling refugee camps of Cox's Bazar. Meta, it says, does not engage in "philanthropic activities". But this was no request for charity; it was a demand that Meta fulfil its human rights responsibilities.

Despite the power and wealth Meta wields, the Rohingya community has refused to give up hope and remains steadfast in its determination to secure accountability from the company. Amnesty International will stand with the Rohingya until justice is done.

The DSA's entry into force marks an historic and vital step forward in efforts to rein in Big Tech. Yet much is still at stake: robust enforcement and implementation are critical if the DSA is to fulfil its promise and protect people from Big Tech's destructive practices.

The European Commission and EU member states now have a pivotal role to play in ensuring that the DSA is more than a paper tiger. It is essential that EU regulators learn from history and commit to ensuring we never again see a repeat of Meta's role in the Rohingya crisis.

Pat de Brún is Head of Big Tech Accountability at Amnesty International.
