You don't have to get an A+ – or even sit the examination – to know the old maxim attributed to Aristotle (or was it Clint Eastwood, Jennifer Lopez... numerous folks seem to have said it): you only get out of something what you put in. The same appears to go for the mighty algorithm. That, at least, is the theory behind the otherwise pretty obscure-sounding digital tool, 'the algo', that hit the mainstream – and not in a good way – following the UK exam grades controversy.
Unfortunately, even the creator of the mathematical logic that laid the foundations for the modern computer would probably find the incident that hit unsuspecting pupils across the land all Greek to him. For the uninitiated – and there are lots of us about – the term is fittingly influenced by ancient Greek, via the word 'arithmos', meaning number, although the concept itself originally came from a Persian scholar.
In modern times, a computerised algorithmic process is a set of rules to be followed in calculations or other problem-solving operations. Still sound complicated? Join the club. Yet such computer science principles are increasingly being employed to decide all sorts of everyday things: the outcome of a job interview, whether one gets a mortgage or a bank loan, and, in this particular case, whether pupils achieve the predicted grades needed to secure a place at their university of choice. Or not, as many deeply upset young folks recently found out.
An algorithm must have six properties: a specified input and output, and in between definiteness, effectiveness, finiteness and language independence. It all sounds fine and dandy. Not quite. How often have you been beavering away on a key email, only to discover without warning that autocorrect has changed a key word, turning the point you're making into garbled nonsense? Autocorrect uses not one but a trio of algorithms to generate suggestions from a list hidden from you. How can three supposedly complementary algos somehow work against each other? Perhaps therein lies the rub.
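Those six properties can be made concrete with a toy example. This is purely an illustrative sketch – a made-up grade-averaging routine, not any exam board's actual code – showing where each property lives:

```python
# A toy algorithm illustrating the six classic properties.
# Purely illustrative - not any exam board's real code.

def average_grade(scores):
    """Return the mean of a non-empty list of scores.

    Specified input: a list of numbers.
    Specified output: a single number.
    Each step is unambiguous (definiteness) and simple enough to
    carry out (effectiveness); looping over a finite list guarantees
    termination (finiteness); and the same steps could be written in
    any programming language (language independence).
    """
    total = 0
    for s in scores:  # finitely many iterations
        total += s
    return total / len(scores)

print(average_grade([72, 85, 64]))
```

Feed it garbage – an empty list, say – and it breaks, which is rather the column's point: the rules only work on the inputs they were defined for.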
More extreme examples of artificial intelligence employing sets of algorithms have involved a self-driving car in a fatal accident, software supposed to filter information instead exposing youngsters to inappropriate content, and a GPS system issuing faulty instructions. Then there's the exams exercise, where larger classes apparently did not have teacher assessments taken into account, but classes of fewer than 15 pupils did. Conspiracy theories apart, according to www.hmc.org.uk, independent schools have some of the lowest student-staff ratios in the UK – one teacher for every nine pupils – compared with one teacher for every 22 pupils in the state sector.
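The reported class-size rule boils down to a single branch. The sketch below is a hypothetical reconstruction of the logic as reported – the function names are invented and the 15-pupil threshold is taken from the reports above, not from any published moderation code:

```python
# Hypothetical sketch of the reported moderation rule - NOT the
# actual SQA/Ofqual code. Names and threshold are assumptions
# based on press reports.

SMALL_CLASS_THRESHOLD = 15

def moderated_grade(class_size, teacher_assessment, statistical_grade):
    """Small classes keep the teacher's assessment; larger classes
    fall back on the statistically standardised grade."""
    if class_size < SMALL_CLASS_THRESHOLD:
        return teacher_assessment
    return statistical_grade

# A class of 9 (the independent-school average) keeps its teacher
# assessment; a class of 22 (the state-sector average) does not.
print(moderated_grade(9, "A", "B"))   # prints A
print(moderated_grade(22, "A", "B"))  # prints B
```

If that reconstruction is even roughly right, the postcode-lottery suspicion writes itself: which side of the threshold a pupil lands on depends on how big their class happens to be.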
Call me naive but, somehow, this doesn't weaken the suspicion of a postcode lottery in which some pupils are disadvantaged while others are given a leg up in terms of their exam pass mark. Was this supposed to happen? It doesn't help that the algorithm's backstory is one of an unstable technological proposition, as already shown above. Harvard Business Review (HBR) warned as recently as 2017 that much can go wrong when incorporating such artificial intelligence-based algos into a project.
Above all, strict monitoring plus a damage mitigation plan must be adopted. In the case of the exam results, surely this occurred? Otherwise, an early warning of potential unforeseen consequences would surely have been raised. Okay, maybe it was, but by then the damage had been done. Perhaps it's no coincidence that 'algos' is a Greek neuter noun meaning pain. HBR goes further, claiming that it is harder to find examples of AI that don't fail. So was the exams exercise during a pandemic doomed from the start, an accident waiting to happen? Could the debacle have been avoided?
The much-hyped multi-billion-dollar AI market still has some way to go in terms of good old-fashioned trust. The respected website ITpro is blunt in suggesting that AI algorithms are the new 'snake oil', increasingly slithering into all aspects of life.
Analogous to this situation is Zoom, which boomed in the early days of COVID-19 after a global marketing push, despite growing anxieties over cybersecurity. Zoom has since promised tighter controls, but serious outages are currently bringing chaos for teachers as they attempt a delicate blend of online classes and a physical return to school. An educational double-whammy.
Back to the exams – or rather no-exams – fiasco. In Scotland, the new moderation system introduced by the Scottish Qualifications Authority (SQA) in such extraordinary COVID-19 circumstances was undoubtedly well-intentioned. Wisely, more than 120,000 downgraded results have now been reverted to teachers' predicted grades.
So teacher does know best. To round matters off, time for an exam question: who declared that hindsight is always 20/20? Although attributed to film director Billy Wilder, the notion began with English poet William Blake. With a nod to Aristotle. Discuss...