An ethical data future?

Published: 24 September 2018

In the first of a series of posts exploring data ethics, our commissioner, Emma Martins, explains why ethics sits at the heart of our strategy and outlines the complex area of data ethics, where legislation, morals, and technology collide.

More and more people all over the world are talking about data ethics.

For those of you new to this conversation, data ethics explores the need for personal data to be collected and used in a way that does no harm, respects autonomy, benefits others, and is fair.

In short, it’s about ensuring that we do not sacrifice good human values in the ever accelerating technology race, making sure we do the ‘right’ thing with personal information. But, who gets to decide what the ‘right’ thing is? This question and the conversation that arises from it is something that we should all, as citizens, sit up and engage with.

The good news is that data protection regulators from around the globe have already started to engage with data ethics and are taking the subject seriously. For evidence of this you need look no further than the chosen theme of next month’s 40th International Conference of Data Protection and Privacy Commissioners - Debating Ethics: Dignity and Respect in Data Driven Life. Hundreds of delegates will take part in a week-long programme with a powerful speaker line-up, including the man credited with inventing the World Wide Web - Sir Tim Berners-Lee. Berners-Lee states that he has watched with devastation as his creation has taken on a life he never expected or intended for it.

He was recently quoted as saying that the internet has produced “a large-scale emergent phenomenon which is anti-human”.

As Berners-Lee observes, technology is not the clinical, neutral entity we expect or would like it to be. It is necessarily a reflection of the humans that create and use it, encompassing human bias, flaws and intentions (good and bad).

This is further exacerbated by the pace of technological change, which takes it down unpredictable paths and raises the question of whether the technology is serving us, or we are serving it. But let’s be clear, this is not about saying technology itself is the problem; the problem is often the way we use it - not thinking enough (or at all) about the impact of the processing and how it might affect us and others in the immediate and longer term.

Certainly it is true that many governments in advanced democracies have embraced these challenges. There are, and have been, laws that seek to regulate the processing of personal data in many countries across the world, the most recent being the highly publicised EU General Data Protection Regulation (GDPR). But the pace of legislative and regulatory change almost always falls far behind the pace of technological advance. So it is becoming clearer that we need more.

In July 2018, in response to her investigation of Facebook and Cambridge Analytica, the UK’s Information Commissioner – Elizabeth Denham, called for an ‘ethical pause’ for “Government, Parliament, regulators, political parties, online platforms and the public to reflect on their responsibilities in the era of big data before there is a greater expansion in the use of new technologies”.

Denham’s plea is significant as it goes beyond legislation, and beyond issuing massive fines when things go wrong. It is an appeal to reason, to reflect on what we should do, not simply what the law tells us we must do. It's interesting to note that previous data protection legislation did not receive the widespread publicity that the GDPR has. There is a reason for that – we are increasingly aware of how important data is for us as individuals and for society and the economy more broadly.

It is hard to think of any aspect of our lives which is not influenced in some way by data and opting out is no longer just difficult, it’s probably fair to say it’s now impossible. But despite much effort going into the drafting of the GDPR (and equivalent legislation such as our own law here in the Bailiwick) to make it as future-proof as possible, the pace of change and its unpredictability raises challenges. So we need urgently to widen the narrative beyond simply doing what is legal, to also doing what is right.

The Facebook/Cambridge Analytica case highlights the extraordinary influence public discourse can have. It also highlights the significance of transparency – of ensuring that we know what is going on and how it may be impacting us and those around us. This is the case in other important areas of our lives. Using single-use plastic products such as straws, water bottles and coffee cups is not illegal but we have seen a huge and largely public-driven push towards more environmentally-friendly alternatives in recent months which is arguably as powerful as any legislative or regulatory control.

I am of the view that we need both.

We need a robust regulatory framework that provides individuals with meaningful and enforceable rights, and at the same time a deep understanding of (as well as a public conversation around) the role ethics plays in questions of data. The more the public refuse to tolerate mishandling or covert use of their data, the more companies will respond - that way the law becomes a proactive and positive compliance tool, rather than simply an aggressive enforcer that deals reactively after things have gone wrong.

A culture of compliance is better for everyone and will differentiate enlightened companies and jurisdictions in ways it has not done before. Our jurisdiction’s size allows us to be engaged, responsive and nimble in this area and we have already taken great strides. My office’s strategic plan is based upon ethical principles and we urge organisations to embrace those values too. Our office motto ‘Excellence through Ethics’ highlights how we are putting this approach at the heart of everything we do because we all have the opportunity to influence this conversation, however big or small our part may be. My plea to Guernsey’s regulated community, and to those developing new technologies echoes this motto.

If we want a future where ethical data use lives and breathes, we must optimise not just for efficiency, or to make big companies even richer; we must optimise to reflect human values, which in turn will deliver social and economic benefits for us all.