NEW CASE: Foxglove supports Facebook content moderator sacked for leading workers to form a trade union in Kenya
Imagine spending all day dealing with the world’s worst violent Facebook posts on as little as $1.50 an hour. Now imagine you do this for a company that calls itself an ‘ethical AI’ firm – and that they fire people who stand up and complain.
That’s exactly what happened at Facebook’s largest outsourcing centre in Africa. This week, TIME published a report revealing exploitation of Facebook content moderators at Sama in Nairobi, Kenya.
Organised by content moderator Daniel Motaung, over a hundred moderators sent a letter to Sama, Facebook’s outsourcing company in the region, accusing it of: “subjecting people to emotional trauma, paying mediocre salaries coupled with the general mistreatment of people.”
Workers from Kenya, South Africa, and many other countries came together to negotiate with the company, as is their right under Kenyan law.
They threatened to strike within seven days unless they received better pay and working conditions.
Sama responded with aggressive union-busting tactics, sacking the proposed strike’s leader, Daniel Motaung, whom it accused of putting the relationship between Sama and Facebook “at great risk”.
When he was fired, he also lost his work permit – so he had to pack his bags and leave Kenya within three weeks.
Other content moderators say management told them they were expendable – especially the ones from Kenya – and that they should stop complaining and get back to work.
One moderator had choice words for this: “They made sure by firing some people that this will not happen again. I feel like it’s modern slavery, like neo-colonialism.”
Another told TIME: “At Sama, it feels like speaking the truth or standing up for your rights is a crime.”
To make matters worse, Sama describes itself as an “ethical AI” company and is a certified ‘B Corp’. B Corps are companies that are supposed to meet high standards of social and environmental performance, transparency and accountability.
It’s unclear how running an office that exposes people to PTSD and throws them out of the country when they ask for better conditions squares with these high standards. Or with being an “ethical AI” company.
Sama is one of many companies around the world that Facebook uses to farm out content moderation, allowing it to wash its hands of moderators’ awful working conditions, poor pay and lack of decent mental health support.
Foxglove is proud to be supporting Daniel in a legal challenge around his dismissal and the exploitation he was subjected to at Nairobi’s Facebook content moderation office.
We’re doing this with our Kenyan legal partners Katiba Institute. If you are a moderator who works or worked for Sama Nairobi, contact us confidentially: email@example.com.
We have launched an open letter to Mark Zuckerberg calling on him to finally begin treating Facebook’s content moderators – social media’s emergency workers – with the dignity, respect and protection they deserve. Click below to sign your name.
We’ll have more updates on Daniel’s story soon. To keep up to date with this case, and others like it, sign up to our mailing list by hitting the button below.