We’ve made some good progress on challenging unfair government algorithms – but there’s a massive amount more to do

It’s not going to be an easy first term for the first-year students arriving at university this month in the midst of a pandemic. But for many of them, arriving to start their course will have added significance because they nearly didn’t get to go to university at all.

The government wanted to mark them down on the basis of an algorithm which systematically discriminated against pupils who’d attended schools that had struggled historically.

It was only thanks to a huge effort – from Foxglove, from Curtis, the A-level student we teamed up with to launch a legal challenge, from the hundreds of thousands of you who signed his petition or chipped in to fund the legal challenge, and from the many other groups that joined us in speaking out – that the government finally dropped the algorithm.

Another dodgy government algorithm bit the dust just weeks earlier. Foxglove, this time working with the Joint Council for the Welfare of Immigrants (JCWI), had launched a legal challenge almost a year before against the Home Office’s use of an algorithm to “stream” visa applications.

As with the A-level algorithm, there was a serious lack of transparency in how this algorithm worked. And, just like the A-level algorithm, the decisions this piece of software was designed to produce looked seriously discriminatory – this time against visa applicants from poorer, mainly non-white countries.

As the case got close to going to court, the government finally caved in, scrapping the algorithm and pledging a full review of the system, including for issues of “unconscious bias” and discrimination.

These successes were inspiring to be part of – and exciting proof of the difference Foxglove can make.

Scrapping these two algorithms will have made a real difference, not least to the lives of the students who would otherwise have missed out on university, and of the visitors to the UK who would otherwise have been denied entry by a computer programme designed to provide speedy boarding to white people.

However, these biased, unfair algorithmic decision-making systems should never have been adopted in the first place. And the attitudes and processes within government which led to them being adopted remain. So we’ve got a lot more work to do if we want to stop future unjust government algorithms.

Both the racist visa algorithm and the unfair A-level grading algorithm showed the same government attitude: “compute first and ask questions later”. Officials see algorithms and artificial intelligence as a magic solution to all kinds of complex human problems.

They roll out systems like this without explaining them to citizens or asking people whether they want them in the first place. They refuse expert advice about how to avoid bias. This is a huge democratic problem – and it’s already arrived in other areas of our public life. 

It isn’t just grades, or visas. People are being denied benefits, housing, employment – even potentially arrested – because of flawed algorithms. Without proper democratic control these systems can ruin lives. They also waste taxpayers’ money on faulty programmes which end up being scrapped. 

So far, Foxglove has helped get two unjust algorithms scrapped. But what we’re really fighting for is a future where none of us can have our lives ruined by an unfair government algorithm. Here are some of the key principles we’re fighting for:

Democracy: It’s not enough to run a secret statistical analysis. Any system that will affect thousands of lives needs to be explained and justified to people – before it’s used, not after. If there’s no democratic mandate for the algorithm, it shouldn’t be used at all.

Transparency: The government should have a duty to tell people, up front, when an algorithm was used to make or influence an important decision about them. You can’t challenge what you can’t see.

Fairness: At the moment these systems often make inequality worse – disadvantaging the already disadvantaged. Using an algorithm is no excuse to flout the laws that protect everyone’s equality, privacy and human rights. We need better safeguards against this bias.

Justice: People need a clear route to holding government to account when an algorithm goes wrong. This week, thousands of people across the country were scrambling for help and didn’t know where to turn. And people harmed by algorithms need redress – it’s on government to make it right again.

Foxglove are just getting started investigating, litigating, and holding government to account for abusive algorithms. This is no small task. It will require a lot of support. If you’re not yet signed up as a supporter, please do so. And if you’re able to make a regular donation to support this work, you can donate securely, online, by clicking the Donate button below.