Algorithms should not take the blame for last year's exam results fiasco, says the Swindon-based organisation that represents IT professionals in the UK.
With this summer's A-level and GCSE examinations cancelled again, The Chartered Institute for IT urged politicians and the media not to 'shoot the algorithms' for the failures of policy it says led to the crisis last year.
Dr Bill Mitchell OBE, head of policy at BCS, said: “Last year an ill-conceived algorithm was used to award exam grades to students, which assumed each school’s results had to be effectively identical to the previous year.
"That caused a huge loss of public trust in the very idea of an algorithm being used to make judgments of any kind.
“The real issue was poor communications and poor collaboration between government offices and departments because they were designed to be arm’s length, rather than closely integrated partners, resulting in an algorithm designed to do something woefully different to what politicians wanted.
“It is damaging to our future as an advanced technological society to blame an algorithm when the true problem was a collective lack of technology governance, which can be fixed should we choose to.
"Please let’s all work together to sort out the design and development governance for algorithms in public service, not just shoot the algorithm. It only did what it was told to do, that is the point of algorithms.”
In the wake of last summer's exam results crisis, BCS carried out a survey in which over half (53 percent) of UK adults said they had no faith in any organisation to use algorithms when making judgements about them, on issues ranging from education to welfare decisions.
Just seven percent of respondents trusted algorithms to be used by the education sector.
In a separate report, 'The Exam Question: how do we make algorithms do the right thing?', BCS recommended the government endorse and support the professionalising of data science, in line with a plan already being developed by a collaboration of the Royal Statistical Society, BCS, The Royal Society and others.
It would mean that algorithms whose creators did not meet a strict code of practice for ethics, professionalism and competence could not be used to decide issues such as exam grades, or to provide estimates to government about the outcomes of pandemics like Covid-19.
"We need more professionalism in this area of AI, so that it can be used for the greater good, ethically and efficiently," said Mitchell.
"We want to see a professionalised data science industry, independent impact assessments wherever algorithms are used in making high-stakes judgements about people’s lives, and a better understanding of AI and algorithms by the policymakers who give them sign-off."