Hidden harms and safety nets

Published: 1 August 2022

This blog was written by our Commissioner, Emma Martins, and first appeared in The Guernsey Press on 1 August 2022. 

Failing to protect people’s personal information may not seem as dramatic as drink driving, but Bailiwick Data Protection Commissioner Emma Martins explains that the consequences can be just as devastating for victims.

I remember, many years ago, that my grandmother used to say ‘prevention is better than cure’ all the time. As most children do when a grown-up says something wise, I rolled my eyes to heaven and carried on with whatever it was I was doing, paying little regard to the words or the sentiment behind them.

As most adults do, I have come to reflect on those moments and those words (and regret my childish eye-rolling!). Of course, when we are older, our perspective on everything changes.

Working, as I do, in an area that involves law and its enforcement, I consider questions of harms and their prevention a great deal. No law in our Bailiwick is in place by accident. There are reasons that laws are put in place and there are outcomes that those laws seek to achieve. It seems clear to me that law should be a safety net, and enforcement a last resort – whether we are looking at drink driving or the protection of our data.

Regardless of the type of law, it seems obvious that the activities of those who have responsibility for overseeing it should be laser-focussed on preventing harms rather than curing them once they have happened.

That sounds simple.

As a concept, it is.

As a reality, it is less so.

Where we have tangible harms – drink driving is a good example – we can clearly understand and engage with the risk. In turn, the steps we as a community need to take to reduce that risk are both understood and accepted. As with anything, there are exceptions. A small minority will get behind the wheel of a car having drunk too much. It is important that those people are sanctioned accordingly. The state puts mechanisms in place to do so and the community expects nothing less.

Where we have less tangible harms, we are generally less good at understanding and engaging with risk. Data is a good example.

Too much of the conversation revolves around the regulator and enforcement, and data protection is often framed as something ‘you need to comply with or you will get fined’.

It may sound like I am trying to do myself out of a job by saying that there is too much focus on the enforcement side of things because that is one of the functions of our office – to investigate complaints and take action where the law has been breached. But this is not a conversation about me or the Office of the Data Protection Authority. It is a conversation about you.

Your data may not feel tangible but it is very real and its impacts are also very real.

Do not be misled by the images of ones and zeros or padlocks when we are talking about our data. This simply de-personalises it and data is not to be de-personalised. Your data is you. It is inextricably linked to the core of who you are. It is linked not only to what you do (your phone knows exactly where you are every moment of the day) but also to what you think and feel (think about what your online activity/searching history says about you). Increasingly it is actually and physically you – your face, your fingerprint, your DNA. It must follow that what happens to that data is a deeply serious question.

So, what does prevention look like in data protection terms?

To answer that, we need to have an honest conversation about harms. There are lots of very important benefits to having our data processed properly – we can access online banking, we can communicate with friends and family, we can buy tickets for a concert etc. But, despite its perceived intangibility, the vast quantities of our very personal information that we leave in our wake in all those activities (and more) exist in very tangible form. Who has access and what do they intend to do with it? When we stop asking those questions, there is no pressure on organisations to handle our data properly and the risk of misuse increases dramatically. Openness and accountability sit at the heart of questions of trust and confidence – regardless of context. If we are to trust those that have our data (our doctor, the government), we have the right to expect integrity and accountability in the way they handle it.

Living as we do in a free and democratic jurisdiction, it can sometimes be difficult to envisage serious abuses of power or infringements of rights. I have frequently referred to the way that dictatorships and tech giants use and misuse our data to highlight actual and potential harms. Thankfully, such examples are both geographically and conceptually distant from us here in the Bailiwick. But that distance can mean we take the rights and freedoms we have for granted – ‘that would never happen here, right?’

I want to challenge that complacency by highlighting a story that hit the headlines recently. The decision last month to roll back abortion rights for women in the United States starkly exposes the reality of our data-driven world, even in democracies.

Those of us with smartphones will often use a plethora of apps for anything from online banking to social media. One increasingly common type is the period tracker, which helps women monitor their monthly cycles.

After the recent overturning of Roe v Wade in the United States, I think we can all agree that emotions are running high. Even where abortion clinics are still able to operate, the spotlight on those attending is shining brighter than ever. It has been interesting to see apps recognising this and reviewing their approach to data collection, anonymisation and retention to safeguard women at what could be a vulnerable time.

As with all businesses, they understand the impact if trust and confidence in them is lost. They are seeking a cure for the harm.

It is surely time to consider prevention more carefully. Individually and collectively, we can encourage a culture which takes care of our data, which takes care of us.