I guess we all have a hot topic: a burning issue or area that furrows our brow and sets us off in what I would describe as silent fury. For me, it's how young people are exploited digitally by tech giants and their investors who put commercial gain before kids' online safety.
A band of well-heeled shareholders of the mighty £600bn Meta (the new name for Facebook) voted down a call to add an extra layer of online and mobile protection for children. I don't have to spell out my response. Baroness Kidron, an advocate for children's digital rights, attempted to appeal to their better nature during the annual general meeting by calling for better 'digital guardrails' for children. To no avail. The meeting was staged virtually, though given the insularity of the response, 'remotely' is the more fitting term, despite it being a global gathering.
The Baroness is a leading light in what represents a struggle with big business to face up to its digital responsibilities, especially when it comes to the most vulnerable members of society. She sits on the House of Lords Democracy and Digital Technologies Committee, and is founder and chair of the non-profit 5Rights Foundation, which promotes the rights of kids online with signatories including Unicef, the NSPCC and Barclays Bank. The Baroness is also a member of the UNESCO Broadband Commission for Sustainable Development and gained cross-party support for a key amendment to the UK Data Protection Act 2018.
The Meta experience represented the first time her group had directly weighed in at such a level. She stressed online: 'Year after year, resolutions to protect children come to the (Meta/Facebook) board that you block. The choice you are making is to put your commercial interest above the needs of children, even when it costs them their lives.' The appeal was unsuccessful, with shareholders following Meta's recommendation and rejecting the proposal.
The proposal was led by shareholder advocacy group Proxy Impact and would have required Meta to conduct and release an annual report with metrics assessing whether the company has 'improved its performance globally regarding child safety impacts and actual harm reduction to children'. Lawmakers in Washington DC have sought in the recent past to impose similar requirements on tech companies to vet their products for potential harm to kids before rolling them out to an unsuspecting marketplace.
At the time of writing, California is the only state to have signed such standards into law, ironically the state where Meta has its Menlo Park headquarters, so hopefully some pressure can be brought to bear. I doubt it. Big Tech is a law unto itself. Minnesota, Maryland and New Mexico have each attempted to push through their own bills but ran out of time. Despite reported broad bipartisan support, advocates have been unable to overcome political hurdles. Nothing new there, then.
Fairplay for Kids advocacy group executive director Josh Golin was reported as saying of the Meta shareholder vote that a 'diversity of tactics' was absolutely needed 'to rein in the abuses of such a powerful industry'. Meta's board of directors prevailed by urging a majority of shareholders to reject the proposal, arguing it would not 'provide additional benefit'. To whom, I wonder. Meta claims to have developed over two dozen tools aimed at protecting children online, and it regularly publishes information on its safety efforts. Don't forget its emphasis on parental controls.
SpaceTalk UK stresses that, as any parent knows, children have a knack for finding a way around almost any obstacle. Their problem-solving skills can be so advanced they take us by surprise, especially when it comes to technology. Take it as read that kids have figured out how to bypass parental controls, rendering certain screen-time and safety settings on phones and tablets next to useless. With a few taps, they can open apps they're not allowed to use and exceed the screen time limits an earnest parent has placed on a device.
You'll be public enemy number one, but nothing new there. A parent can disable screen recording in settings, for example, although it can always be turned back on when they're not paying attention. A clever kid can also uninstall, then reinstall, an app like Instagram, letting them stay on it longer than allowed by getting around screen time limits; the countermeasure is to set the 'Delete Apps' option to 'Don't Allow' in the phone's settings. Failing that, you can always switch to a device made just for children, such as the Spacetalk Adventurer smartwatch phone.
Social enterprise Cybersafe Scotland is well into a two-year Digital Wellbeing Online Harm Prevention project to help schools, local authorities and families collectively ensure that young people are respected online. Founder-director Annabel Turner says the project combines interactive lessons, links to related resources and support worker sessions, with the aim of achieving 'early intervention to effectively maximise safety of children and young people in their online spaces'.
Try telling that to the Meta/Facebook merry band of directors and cosseted shareholders who, for once, should look beyond the highly lucrative advertising revenues gained from almost three billion monthly active users, a significant number of whom are young people.
Former Reuters, Sunday Times, The Scotsman and Glasgow Herald business and finance correspondent, Bill Magee is a columnist writing tech-based articles for Daily Business, Institute of Directors, Edinburgh Chamber and occasionally The Times' 'Thunderer'.