BCS, The Chartered Institute for IT, has published a report on the recent exams algorithm crisis, making a wide set of recommendations to government on new standards that must be met by any algorithm with life-changing consequences.
The report by the Swindon-based professional body for the IT industry says algorithms must meet ethical and professional standards to recover public trust.
Government use of data science must achieve public service standards of openness, accountability and objectivity to avoid another ‘computer says no’ moment in education and other disciplines, says the organisation.
The new report, ‘The Exam Question: how do we make algorithms do the right thing?’, recommends that government endorse and support the professionalisation of data science, in line with a plan already being developed by a collaboration of the Royal Statistical Society, BCS, The Royal Society and others.
It would mean that algorithms whose creators did not meet a strict code of practice for ethics, professionalism and competence could not be used to decide issues such as exam grades, or to provide government with estimates about the outcomes of pandemics such as Covid-19.
The BCS study concluded that policymakers should ensure ‘the best possible ethical and professional practice in algorithm design, development and testing is ubiquitous at information system level across government and industry.’
All algorithms and data with high-stakes consequences, such as grades estimation or triggering lockdowns, should be put through an impact assessment against widely recognised ethical standards, and be open to public scrutiny, before ‘going live’, the report added.
Dr Bill Mitchell OBE, director of policy at BCS, The Chartered Institute for IT, said: “The exam crisis has given algorithms an undeserved reputation as ‘prejudice engines’ when in fact ethically designed algorithms fed on high-quality data can result in massive benefit to our everyday lives.
“Lack of public confidence in data analysis will be very damaging in the long term. Information systems that rely on algorithms can be a force for good but, as students found out at huge cost, we have been using them to make high-stakes judgements about individuals based on data that is subjective, uncertain and partial.
“We need true oversight of the way algorithms are used, including identifying unintended consequences, and the capability at a system level to remedy the harm that might be caused to an individual when something goes wrong.
“That means, first of all, professionalising data science so that the UK has the most trusted, ethical and sought-after data science teams in the world.”
The report was made public yesterday (Tuesday, September 1) ahead of the Education Committee hearing with Ofqual today (Wednesday).