Death by design: a major new case against Facebook 

Just over a year ago, Meareg Amare Abrha, a respected chemistry professor at Bahir Dar University in Ethiopia, was murdered outside his family home.  

He was heading back from work when a group of armed men attacked. They shot him in the legs and in the back, and left him to bleed to death on the pavement over several hours.  

How did these murderers know who Professor Amare was and how to find him? Facebook. Weeks before the attack, two racist Facebook posts had shared his photo, doxxed him by pinpointing the neighbourhood where he lived, and called for his murder. 

The Professor’s tragedy, sadly, is one of thousands. Across the world, we’ve seen how Facebook’s design has fanned the flames of hatred and violence. We’ve seen it in Myanmar, Sri Lanka, India and even in the US, where viral incitement helped spur the January 6 Capitol riots. 

But the violence Facebook has enabled in Ethiopia may be the worst yet. The Ethiopian war is one of the deadliest in the world, having claimed the lives of more than 500,000 people. Professor Amare was one of them. 

Today we are proud to be supporting the launch of a major new case demanding fundamental change to Facebook’s algorithm, prioritising the safety of the 500 million people who live in Eastern and Southern Africa over Mark Zuckerberg’s profits. 

A constitutional petition to the Kenyan high court 

The case is what is called a constitutional petition, filed in the high court in Nairobi, Kenya. It is being brought in Kenya because Nairobi is Facebook's content moderation hub for East Africa – the epicentre where decisions are made that affect the lives of hundreds of millions of people across the region.  

For those not familiar with Kenyan law, a constitutional petition asks the court to make orders in the name of the Kenyan constitution, the supreme law of the country. 

In this case, the petition asks the court to order Facebook to make fundamental changes to its operations, to end Facebook’s role as the premier online tool for feeding violence and hatred: 

  • Stop promoting viral hate   
  • Start demoting violent incitement – using similar emergency steps to those Facebook took after the US Capitol riots of 6 January, 2021  
  • Employ enough content moderators to staff the language markets moderated out of the Nairobi hub  
  • Apologise for the Professor’s death 
  • Create a restitution fund, to be assessed by the court, for victims of hate and violence incited on Facebook – we are seeking KSH 250 billion (approximately $2bn) for harm from ordinary posts, and a further KSH 50 billion (approximately $400m) for similar harm from sponsored posts. 

If successful, this will be the first time a case has forced changes to Facebook’s algorithm – the process Facebook uses to make content go viral for profit, laid bare in last year’s blockbuster Facebook Files revelations by whistleblower Frances Haugen.   

The Facebook Files also exposed Facebook’s woeful failure to invest in safety systems in what Facebook calls the “Rest of World” – Facebook’s dismissive and neo-colonial term for the combined peoples of Africa, Latin America and the Middle East. 

When the posts doxxing him and calling for his death appeared on Facebook, Professor Amare refused to hide. After all, he didn’t even use Facebook himself. 

He felt his decades of service in the community would protect him. Then he was murdered. 

Professor Amare’s son, Abrham, pushed desperately for the posts to be taken down. Less than a week ago, despite all of his efforts, one of the posts calling for the death of his father was still live on Facebook. (They appear to have taken it down only now, having got wind of this case.) 

Abrham is one of the petitioners bringing this case. He holds Facebook directly responsible for his father’s murder. He wants to make sure that no more families go through what his did. He also wants Facebook to apologise for his father’s death. 

Also bringing the case is Fisseha Tekle, a legal advisor at Amnesty International and formerly its Ethiopia researcher. His independent reports into violence by all parties to the Ethiopian war made him a target for abuse on Facebook. His evidence details how Facebook’s moderation failures made Amnesty’s critical work of human rights reporting impossible – and put lives at risk. 

The final petitioner is the Katiba Institute, Kenya’s preeminent legal organisation set up to defend the Kenyan Constitution. They are joining the case to set out the implications for Kenya of unchecked viral hate and violence running rampant from Facebook’s Nairobi hub.  

At Foxglove, we believe we all deserve social media that connects us rather than divides us.  

Facebook lets hate spread because hate makes it huge amounts of cash. Stopping the violence would cost money – but only a fraction of Facebook’s vast profits. 

Zuckerberg can do far, far more. We know because he *did* do more with the “break the glass” measures taken overnight in response to the January 6 Capitol attacks. And Facebook employs many more moderators for the US market. Until now, Facebook hasn’t invested in the safety systems necessary to protect millions of Africans. We aim to change that. 

This case has taken a year of painstaking investigative work. But it’s only beginning. It will take all of us, standing with these brave petitioners, to force Facebook to change – and we’ll be in touch in the New Year with specific ways you can help. 

Hit the button below to stay up to date with this case and others like it.