Why robots need seatbelts

Published: 6 March 2024

This article was first published in the Guernsey Press on 6 March 2024.

Bailiwick of Guernsey Data Protection Commissioner Brent Homan explains why successful and enduring innovation requires safeguarding individuals’ information from the outset.

“The distant future, the year 2000. The world is quite different since the robotic uprising of the late 90s. There is no more unhappiness. Affirmative.” – Flight of the Conchords, lyrics from ‘The Year 2000’

While this New Zealand duo’s techno-apocalyptic song parodies a Terminator-style world ruled by robots, there is a prescient point to be gleaned on the risks of tech advancement run amok.

When it comes to tech innovation, its impact on privacy and data protection can only be described as revolutionary. The expansion of computing power, data storage and bandwidth has greatly outpaced historical innovations such as the introduction of electricity and the telephone. In terms of cost and accessibility, consider that in 1995 a gigabyte of data storage cost around £670; today it costs a few pence.

And we have witnessed how the tech revolution has transformed the global economy. Let’s take the ‘year 2000’ as a baseline. Facebook and Twitter did not exist. Save for Microsoft, the ranks of the world’s largest companies were dominated by the GEs, Walmarts and oil and gas giants of this world.

Fast-forward to today and six of the top seven largest companies in the world are digital giants, including Microsoft, Apple, Google, Meta and NVIDIA (AI computing).

It’s now 2024, not 2000, and the world has changed. Make no mistake, as a society we have benefited immensely from the digital and technological revolution.

We are living longer, predicting weather patterns better, and overall have seen a tech-driven improvement in our standard of living. But as with any innovation, tech advances will only prosper, provide societal benefits and endure if they are developed in a manner that is safe, secure, and respectful of the rights of those very individuals whose interests and lives they are intended to serve.

The introduction of the automobile was a massive 20th century innovation, expanding individuals’ reach into the world. But without safety features such as seatbelts and airbags, alongside adherence to traffic laws, road travel would have proved recklessly lethal and universally untenable.

And so it is with the digital age and the paramount importance of ensuring that any tech innovation involving people’s personal information is developed in a manner that fully respects individuals’ privacy and data rights.

Unfortunately, there remains a minority perspective that data protection holds back innovation – that the only important thing is to innovate fast, get to market, and figure out those silly regulatory details later. To be blunt, that is a recipe for business and technological failure.

It is a myth that innovation and privacy work against each other. In fact, the ONLY way that an innovation can be successful and endure over the long term, is if respect for individuals’ data protection rights is built in from the outset. Data protection represents the seatbelts and safety laws of our info-driven era.

The objectives of data protection legislation include promoting privacy rights and ensuring that individuals can reap the economic and societal rewards of technology while maintaining control over their personal information and avoiding the harms of its unlawful collection and misuse.

Let’s consider the example of facial recognition technology. Facial recognition involves the real-time collection of individuals’ biometric data from their images.

It has clear societal benefits, from combatting terrorism to authenticating individuals’ accounts with a degree of confidence that can reduce the risk of identity theft.

But a few years back a company called Clearview AI took the technology to the next level, indiscriminately scraping billions of images from social media and other digital platforms and marketing the resulting database to both private-sector companies and law enforcement authorities around the world.

In fact, if you are reading this right now and have a social media account, chances are that Clearview has an image of YOU in its database.

I suspect that most people were unaware of that, and they certainly did not consent to such a collection, which has the potential to place innocent citizens in a ‘virtual police line-up’. Data protection regulators took note of this risk, with multiple investigations and actions carried out by authorities in the UK, Australia and Canada.

Other examples abound of tech innovations that, while cutting edge, were implemented prematurely without proper privacy safeguards in place.

Which brings us to generative AI and the limitless potential of tools such as ChatGPT to assist research and to organise and articulate subjects and ideas at a lightning-fast pace.

The world has taken notice of this potential. In fact, ChatGPT’s adoption rate is without historic precedent.

As Rachel Masterton (Deputy Commissioner at the ODPA) points out in a recent Business Brief article on AI: while telephones took 75 years to reach 100 million users, mobile phones took 16 years and Twitter five years, ChatGPT took just… 2 months.

But while the potential economic and societal benefits of GenAI are very real, including benefits for medicine and education, so too are the risks to privacy and other rights.

Those risks include unauthorised scraping of personal information, inaccuracy, intellectual property infringement and discrimination, and oh yes... don’t forget about those ‘Robots’ – some have even identified GenAI as an existential threat to humanity. Gulp!

And this is the VERY reason why it is vital that data protection safeguards are carefully and thoughtfully factored into tech innovations.

AI and other info-era innovations have such promise for the global community and are here to stay. But for them to succeed over the long term and serve us rather than injure us, we need to have those ‘seatbelts’ installed and the data protection road rules followed.