Tech worker justice
When you hit the “report” button on social media, it can feel like the post you flagged has travelled off into a void. Yet, behind the faceless dialogue boxes and menus to “tell us what’s wrong”, there’s an army of real human beings spending their days reviewing some of the worst content that exists online. Those people are content moderators.
They are a global workforce of tens of thousands tasked with filtering the worst content people upload to social media, such as child abuse, beheadings and hate speech. This work, which remains in its way as unregulated as the early industrial factories, leaves many workers with PTSD.
We work with content moderators in multiple countries to investigate workplace abuses, bring litigation, and organise for better conditions. We support litigation Europe-wide, helping moderators who have suffered conditions like PTSD as a result of their work take action for workplace damages. We facilitated the first meeting between elected officials and content moderators, so those in power could hear directly from those who need their help.
If you are a content moderator and you’d like our help, or if you would like to know more, please click here.