A lawsuit accuses Facebook of fanning the flames of hatred and violence during Ethiopia’s civil conflict. Abrham Meareg, whose father, an academic, was brutally slain after incendiary Facebook posts targeted him, is spearheading the case against Meta.
The plaintiffs demand a colossal $2 billion fund for victims of hate on Facebook and a revamp of the platform’s algorithm. Meta counters that it has invested substantially in moderation and technology aimed at purging hate speech. A spokesperson asserted: “Hate speech and incitement to violence violate our platform’s standards.”
Guided by feedback from local civil society and international institutions, Meta insists its safety efforts in Ethiopia are robust.
Catastrophic Conditions
Filed in Kenya’s High Court with backing from the campaign group Foxglove, the lawsuit highlights Meta’s Nairobi-based moderation hub. The Ethiopian conflict, a devastating clash between the government and Tigray region forces, has claimed hundreds of thousands of lives, with 400,000 more enduring famine-like conditions. A recent surprise peace deal has done little to quell ethnically charged killings between Amhara and Oromo communities.
In November 2021, Abrham Meareg’s father, Prof. Meareg Amare Abrha, fell victim to this violence. Pursued by armed men on motorcycles, he was shot near his home. As he lay dying, threats from the assailants kept witnesses at bay, his son recounts, and Prof. Meareg succumbed to his injuries after agonizing hours.
Prior to this harrowing attack, Facebook posts had defamed him and revealed personal details. Despite numerous complaints through Facebook’s reporting tool, the posts remained online until it was tragically too late: one was removed only after his death, and another lingered longer still.
Grossly Insufficient Moderation
“If Facebook had curbed the spread of hate, my father might still be alive,” Abrham Meareg contends, seeking justice and a personal apology from Meta. In his court statement, he argues that Facebook’s algorithm amplifies “hateful and inciting” content to drive user engagement. He criticizes the moderation in Africa as “woefully inadequate,” with too few moderators for crucial languages like Amharic, Oromo, and Tigrinya.
Meta, which owns Facebook, rejects these claims, stating that it employs staff with local expertise and is continually improving its ability to detect harmful content in the most widely spoken local languages. The company asserts that Ethiopia remains a priority even though less than 10% of its population uses Facebook. Measures taken include reducing the virality of posts, expanding its violence policies, and improving enforcement.
Broader Implications
Peter Mwai of BBC Reality Check describes the lawsuit as a pivotal attempt to hold a social-media giant accountable for its role in the Ethiopian conflict. Critics argue that Meta and other platforms do too little to curb the spread of disinformation and hate content, often acting too late and applying uneven standards across different languages.
Despite Meta’s insistence that it invests significantly in combating hateful content, its opacity about the number of moderators focused on Ethiopia fuels skepticism. The case echoes past accusations, such as those of whistleblower Frances Haugen, who told the US Senate in 2021 that Facebook’s algorithm fanned ethnic violence by promoting divisive posts.
Additional plaintiffs, including the Katiba Institute and Amnesty International’s Fisseha Tekle, claim Facebook’s moderation failures endangered their lives and obstructed vital human-rights reporting. They urge the court to compel Facebook to:
- Establish a $2 billion (£1.6 billion) restitution fund for victims of hate and violence.
- Prevent its algorithm from promoting incendiary content.
- Hire enough moderators to ensure parity in translation and moderation between the Nairobi hub and the company’s US operations.