Breaking the code of silence: what we learned from content moderators at the landmark Berlin summit

One of the things you run into time and time again in the fight for justice for tech workers is the Big Tech omerta: the non-disclosure agreements used to terrify workers into silence, stopping them from speaking out about how they are exploited – or even from talking to their closest loved ones about what they’re going through.

So it felt surreal last week to watch a large group of social media content moderators from Facebook and TikTok, at the first-ever summit for content moderators in Berlin, not just breaking this code of silence but doing so with obvious joy.  

They were listening, smiling, shaking hands and exchanging hugs. This was a group of people sick and tired of being told to suffer in silence, choosing instead to defy tech bosses’ threats of retaliation and tell the truth about their working lives. And it clearly felt good.

The content moderators – around 50 of them – met in the bright but chilly basement of the ver.di trade union in East Berlin, on the bank of the river Spree, a stone’s throw from both Google’s Berlin offices and the former path of the Wall that once cut the city in half. The summit was co-hosted with ver.di, Foxglove, Superrr Lab and Aspiration Tech.

Having travelled from across the country, the moderators discussed their experiences in detail: TikTok vs Meta platforms, outsourced vs directly employed, Berlin vs elsewhere. Common threads soon emerged, and together they drew up these demands to Big Tech:

  • Fair pay for content moderation work, reflecting the skilled expertise required to do the job. Key benefits such as sick leave, holiday time, flexible working, working from home, and break and meal times must be set out clearly and consistently. 
  • Content moderation is dangerous work and, without safeguards and mental health care, it causes serious harm. Each company must obtain expert advice on the safeguards required and provide independent, 24-hour clinical support. 
  • It is the duty of social media companies to actively encourage content moderators to organise, bargain collectively and join a union, as well as to form Works Councils across the industry.

Across the board, moderators described terrible pay, the traumatising nature of the work and the lack of appropriate mental health support to do the job safely. But perhaps most of all, the feeling shared again and again was anger at the lack of respect shown to them by their employers: both the outsourcing companies and their tech giant clients.

Moderators aren’t stupid. They know exactly how difficult the work is and how much skill and judgment is required. They know how essential they are to the tech platforms – without content moderation and the workers who do it, there’s no social media, full stop. 

Many talked about staying in the job, despite the poor pay and danger to their health, because of their personal moral commitment to trying to make the internet safer – thinking about their children or other family members who use social media and the protection they need. 

Yet they are hidden away, often outsourced and denied the respect and dignity they are due for their work. Many described themselves as feeling like the “online police” – and wanted the recognition they deserve for keeping their communities safe online.  

Fair pay, safe working conditions, being able to join a union and getting treated with respect and dignity for work that meets an essential need in our society – doesn’t really seem like a lot to ask, does it? 

We’ll be sharing more from the summit soon – so sign up to our mailing list below to hear it.