We put a stop to the A Level grading algorithm!

Foxglove, together with A-level student Curtis Parfitt-Ford, has forced the government to U-turn on the disastrous A-level grading algorithm. In the wake of our judicial review threat, mass petitions and national disgust, the government has announced it will scrap Ofqual's algorithm in favour of teachers' assessed grades.

This last-minute reversal throws beleaguered students a lifeline. The algorithm downgraded thousands of students' marks and threw their futures into disarray, and the government has now been forced to face the consequences of its massive error.

Foxglove, working with Ealing A-level student Curtis Parfitt-Ford, urged the government to "grade the student, not the school": to give more weight to teachers' assessed grades and to open a free appeals route for students on academic merit. In the wake of extreme pressure, the government has abandoned its algorithm entirely.

Together we launched legal proceedings against the government over the algorithm. In a pre-action letter sent on Curtis' behalf on Friday, our legal team explained why Ofqual's algorithmic grading exceeded its statutory powers and violated key principles of data protection law, and set out further reasons why Ofqual's process was unfair and unlawful.

This is a major win for Curtis, a brave comprehensive-school student from Ealing. Curtis did well out of the algorithm individually and was pleased with his result. But well before Thursday's disastrous results came out, he was deeply concerned that the algorithm would disadvantage students by treating them as statistical aggregates, not individuals. He wrote to his MP, launched a change.org petition that over 250,000 people signed, and teamed up with Foxglove on a judicial review, all for the sake of his fellow students.

The government must now urgently clarify how students will receive their new grades, what will happen to students whose grades were adjusted up rather than down, and how it will treat those taking qualifications other than A-levels, such as BTECs and other vocational qualifications.

We hope this fiasco sparks a much-needed national debate about the use of opaque algorithms in public life: not simply how they are used, but whether they should be used at all. In future, consequential algorithmic systems must be designed and built in a way that is democratically acceptable and does not cause chaos in thousands of lives.

To make more cases like this one possible, you can sign up to support us with a monthly donation by clicking the Donate button below. Thank you for being part of it.