The House of Commons Science and Technology Committee put out a public call for suggestions on subjects it should investigate, and one of the three winning pitches came from Stephanie Mathisen, campaigns and policy officer at Sense about Science, who suggested an inquiry into transparency in algorithmic decision-making.
In an explanatory editorial, Mathisen argues that algorithmic decision-making is poorly understood and yet widely acclaimed, which sets the stage for a lot of bad outcomes: governments and corporations are under real pressure to adopt these systems, and to yield to vendors' insistence that the decisions the systems deliver are "objective" (because they're somehow mathematical) and also not comprehensible by humans (because the software can't explain its reasoning).
It’s vital for the accountability of government that its decisions are transparent, that people are treated fairly, and that hidden prejudice is avoided. If government is using algorithms, it should be setting the right example.
To that end, it could apply a set of standards. A suggested code of conduct was published in November last year, including five principles of good algorithms: responsibility, explainability, accuracy, auditability and fairness.
There might need to be an ombudsman or third-party regulator for people affected by algorithmic decisions to go to.
The committee’s inquiry is timely. The new EU General Data Protection Regulation is set to be adopted by Britain and EU member states in 2018. This legislation will govern how artificial intelligence can be challenged, and early drafts have included a “right to explanation”. This is something that should be guaranteed.
The issues with algorithms in decision-making aren’t future problems; they already exist. It is essential – for government, parliament and all of us as citizens – that we understand how decisions about our lives are being made.
Algorithms in decision-making inquiry: Stephanie Mathisen on challenging MPs to investigate accountability [Stephanie Mathisen/Public Technology]