The next time you happen to be strolling along Princes Street mulling over that well-earned cuppa, take a moment to contemplate the bold claim that you could shop more efficiently if you thought more like a computer.
Cue shades of HAL in Stanley Kubrick and Arthur C Clarke's monumental film
2001: A Space Odyssey, raising the key question: just who is in charge, humans or the machines? This Digital Age is increasingly influencing and affecting each and every one of us, whether we want it to or not. Computers are seamlessly interconnected to artificially intelligent (AI) machine systems (and us) that simulate human intelligence. It's a highly sensitive relationship that is only going to get more complex in a few short years.
It's not only Auld Reekie. Computers are everywhere, although personal computer shipments globally are expected to fall by a mighty 10% this year, according to tech analysts at Gartner, thanks to microchip shortages amid the geo-economic uncertainty of Russia's war on Ukraine and lingering threats of recession. Then there are robots, the subject of countless science fiction movies, including Kubrick's directorial masterpiece from 1968. They usually either bring our planet to an end or save it. Cheery, eh?
Not long ago, Deloitte Touche released a report,
The Robots are Coming, but not the scary metal-encased mechanical sci-fi version, nor a threat to humankind. No, these robots are software-based and reside on desktop computers and servers: a technology known as Robotic Process Automation (RPA).
Various robots already operate in general society, with semi-autonomous and autonomous humanoids intimately involved in military, industrial and medical roles, alongside a plethora of AI healthcare tools. MIT points to the first large-scale study, in
The Lancet Digital Health, showing that radiologists assisted by AI diagnose breast cancer more successfully than when they work alone. There's even a dog therapy robot.
Much in the news, cryptocurrency digital coins are, in reality, no more than endless strings of computer code with no actual physical form. Then there are 'apps'. You can't go on a website without being offered one of these computer programs, or software applications. If you want to delete one, you may very well get sent an 'uninstall app' app to do it. Plus there are algorithm-driven 'bots'. You know the type: constant phone calls to your landline, when a metallic voice offers you a payday loan you hope you do not need, or pressures you about a non-existent motor accident. When you don't own a car.
Finally, there's that ubiquitous mobile in your pocket, forever making covert bleeps and pings for reasons you haven't a clue about, and home-based Alexa smart speakers monitoring our every move. And don't get me started on the Instagrams and TikToks, with the worst excesses of overtly intrusive social media control and the favouring of so-called celebrities over substance.
It's still argued that such an assembly line of innovative tech is taking over our lives. Nevertheless, in what seems the wrong way round, it is claimed that we mere humans should be more like a computer: 'stereotypically deterministic, exhaustive and exact' in our decision-making.
So says one of those TED Talk videos that roam freely about the ether as if they have a cyberlife of their own. This particular one suggests that as individuals we do not appreciate the 'full spectrum of decisions' faced daily in our lives and at work. That, as a species, we rarely have all the information at our disposal required to eliminate the risk factor. Instead, it should be all about 'optimising' decisions, and that requires speed as digital transformation continues unabated in our busy lives. The advice from TED is that, by doing so, we can avoid the trap of 'paralysis by analysis', where we never take a decision because we believe more analytical thinking is always required.
Feeling rather parched, I was building up a tempo all right, a bout of warp speed towards that cuppa. A simple enough task, one would think, but it seems it's not as easy as that. Deciding which cafe or restaurant to go to has a particular computational structure: a set of options, from which one will be chosen. In such a situation, we run up against what computer scientists call the 'explore-exploit trade-off'. Who knew?
Are you going to try something new, exploring the unknown and gathering information you might use in the future? Or are you going to a place you already know is pretty good, exploiting the data you have already gathered? This trade-off shows up any time you have to choose between trying something new and going with something you already know about.
TED tells me that over the last 60 years computer scientists have made a lot of progress in understanding the explore/exploit trade-off. Back to that cup of tea I still haven't had. The first question I should be asking myself is how much longer I'll be in town.
If it's just for a short time, I should exploit; if longer, explore. This is described as giving all of us insight into the very structure of human life. Now, to me it's all blindingly obvious, but TED covers that too: the old guy who always goes to the same place isn't boring, he's 'optimal', exploiting the knowledge he's earned through a lifetime's experience.
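For the computationally curious, that rule of thumb can be captured in a few lines of code. Here's a toy sketch in Python, my own illustration rather than anything from the TED talk itself; the cafe names, ratings and the seven-day cut-off are all invented:

```python
import random

# Toy illustration of the explore/exploit trade-off: explore new cafes
# while plenty of time remains, exploit the best-known one when it doesn't.
# All names, ratings and the seven-day cut-off are invented for this sketch.
known_ratings = {"Old Favourite": 4.2, "Corner Greasy Spoon": 3.1}
unvisited = ["New Artisan Place", "Pop-Up Tea Room"]

def choose_cafe(days_left_in_town: int) -> str:
    """Pick a cafe given how much longer we will be in town."""
    if unvisited and days_left_in_town > 7:
        # Long stay: new information can still pay off, so explore.
        cafe = unvisited.pop()
        # Pretend we visited and learned its quality (a random stand-in).
        known_ratings[cafe] = round(random.uniform(2.0, 5.0), 1)
        return cafe
    # Short stay: exploit the best place we already know about.
    return max(known_ratings, key=known_ratings.get)

print(choose_cafe(days_left_in_town=30))  # likely somewhere new
print(choose_cafe(days_left_in_town=2))   # the best-known spot
```

The cut-off, seven days here purely for illustration, is exactly the 'how much longer am I in town?' question: the shorter the horizon, the less a new discovery can ever repay you.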
Welcome to the world of computational cognitive scientists. The claim is that applying a little computer-science nuance can make human decision-making easier. Our lives are filled with computational problems that are just too hard to solve by sheer effort. In such cases, it's worth consulting an expert: the computer scientist. Mind you, TED is the first to agree that, when looking for life advice, computer scientists are probably not the first group one might think of.
Meanwhile, there's a lot riding on human-and-computer decision-making, none more so than in monetary terms. Big Tech continues to drive global stock markets and gives an idea of the sheer scale of such a predominant sector. Forrester Research predicts the aforementioned RPA market will hit £5.5bn by 2025, rising to a staggering £20bn-£25bn by 2030. Switching to dollars: new research reveals Microsoft, Apple, Google and the like are each closing in on $1,000 in earnings from systems associated with artificial intelligence, every second of every day.
Scotland's chief data officer, Albert King, told Forbes that AI's potential can only be harnessed for societal benefit if it is both 'transparent and trusted' through ongoing development, and in such a way that people know and understand how to engage with it. This involves the entire gamut of analytics, automation and AI. Each is powerful in its own right, but bringing them together, he claims, adds up to more than the sum of the parts. Such crucial linkages can shed light on complex problems like improving child health, including nutrition, physical education and better mental health outcomes, plus equality of opportunity in education and employment.
However, King emphasises that AI has to earn people's trust as the associated ethics, personal privacy and security emerge as major considerations. It can all get rather overwhelming as our lives become increasingly reliant on computers. Will we ever reach the point at which we hand over humanity's digital keys to the cyber world?
A newly published book suggests not:
How the Mind Changed by Joseph Jebelli. It highlights the uniqueness of the evolutionary development of our brains, and with it countless genes, 'each one coding for a slightly different molecular spring, cog, gear and dial in the ever-expanding clockwork of the mind'. Could computers ever keep up with, never mind match or overtake, development that has occurred over several million years?
TED stands for 'Technology, Entertainment, Design', and HAL for 'Heuristically programmed ALgorithmic computer', representing sentient artificial general intelligence. Now we know. Time to find that greasy spoon for a well-earned cuppa. Or should it be coffee, hot chocolate, Horlicks, a glass of milk, Irn Bru, a single malt? Now, if I thought more like a computer, I would be deterministic, exhaustive and exact. But I'm only human. Decisions, decisions. A mug of tea it is.
Former Reuters, Sunday Times, The Scotsman and Glasgow Herald business and finance correspondent, Bill Magee is a columnist writing tech-based articles for Daily Business, Institute of Directors, Edinburgh Chamber and occasionally The Times' 'Thunderer'