MPs cite Foxglove in report on how social media algorithms helped to create danger on UK streets last summer

Foxglove has been cited in a report by parliament’s Science, Innovation and Technology Committee, which found that, in relation to the racist violence that struck the UK last summer, social media companies put the public in danger by incentivising the spread of misinformation and harmful content. 

We submitted evidence to the committee earlier this year, drawing on our experience of how social media algorithms promote violent content online that leads to bloody and horrific consequences in the real world. 

As committee chair Chi Onwurah said, social media companies are not “neutral” – they “actively curate what you see online, and they must be held accountable”.  

We couldn’t have put it better ourselves. Social media is not an equal public square, where all viewpoints are given a fair hearing by its algorithmic gatekeepers.  

Instead, the only thing that matters to the likes of Meta and TikTok is keeping as many eyeballs on their products as they can, for as long as they can. And they quickly learned that promoting sensational, extremist content is the cheapest and easiest way to do that. 

As we said when submitting our evidence:  

Meta’s ad income for 2023 was just north of $130 billion. Its advertising model is based on engagement. The longer they keep us clicking, the longer they can serve us ads. And the best way to keep us clicking is to serve up the most toxic and shocking content to provoke an emotional reaction. 

Here’s the relevant passage from the select committee’s report, which you can read in full here: 

We heard that social media algorithms can play a major role in promoting misinformation and harmful content. The design principle of maximising engagement for profit means that algorithms can amplify content regardless of accuracy or potential for harm. Indeed, harmful and false content is often designed to be engaging, so may be promoted more than other types of content. Examples include mis/disinformation, violence, extremism, prejudiced views, suicide and self-harm content. 

The committee’s concerns do not appear to have had any effect so far on the extremely close relationship between the UK government and Big Tech.  

Just last week, the Technology Secretary Peter Kyle announced a new £1m partnership with Meta, despite concerns raised by Foxglove and many others that the company’s role in spreading hate and violence at home and abroad makes it an unsuitable partner for the British government. 

Kyle may have a short memory, but we don’t. We still remember the burning of a hotel with refugees inside in Rotherham last year, which the town council called a terrorist attack.  

As the i newspaper revealed, one of the first steps towards the eventual burning of the hotel was a Facebook post from a 34-year-old plasterer who lived a few streets away. 

In the since-deleted post, they wrote: “Right people it’s time to wake up, take a stand make our area safe for our women and children.” 

They added that the nearby hotel was housing “hundreds of migrants”.  

Then came the threat: “As long as they are there it’s a potential risk to our community. Look what happened in Southport. That could be your child. We need them out.” 

Posts like that led to the burning of that hotel. That’s why we’re not letting this go. 

Foxglove will be monitoring the government’s response to the select committee’s report with interest – and we will continue to support people like Abrham Meareg as they fight back against the violence and pain inflicted in the real world by social media’s amplification of toxic content online. 

For updates on this work as they come in, hit the button below: