VIDEO: Doing the right thing (can be difficult)

Decisions in life are rarely black and white. So, when data protection professionals Rowenna Fielding and Stef Elliott found that data was being collected and shared with social media platforms via a tracking tool embedded in a British police force’s website, they faced a difficult ethical dilemma.

Should they sound the alarm that people using the website to report crimes were sharing that information with platforms like Meta and Google without realising? 

Or should they remain silent so as not to risk putting people off reporting sexual offences, domestic abuse and other serious crimes? They talked to the ODPA’s Katherine Levy about what happened next.


BIO

Data protection consultant Rowenna Fielding and digital marketing professional Stef Elliott are both passionate about using their skills and knowledge to help reduce data harms.

Digital engineer and data protection specialist Stef Elliott is a business and marketing professional with more than 30 years of experience across a range of industry sectors. He is the founder and managing director of Six Serving Men, a digital marketing company, and has worked across several continents with a wide range of clients, including the BBC, KLM, Boots, Towers Watson, Top Shop and The Really Useful Group.
His active involvement in data governance and protection began in the early 1990s when he recruited his first data miner. He is passionate about continual development and is a regular speaker on data and digital marketing for the Direct Marketing Association, Internet Advertising Bureau, Institute of Direct Marketing and the Institute of Directors. Stef also works with charities on a pro bono basis to explain data protection issues.

Data protection, ethics and privacy consultant Rowenna Fielding describes herself as a nerd whose obsessive tendencies have served her well in building a career, first as an information security specialist and then as an advisor on data protection. Over the 10+ years since switching fields, Rowenna has been helping to bring data protection law to life in commercial and voluntary sector organisations, initially in-house and more recently as a consultant.
In 2020, Rowenna established her own company, Miss IG Geek Ltd, providing advice, support, training and guidance on data protection and ePrivacy. If she won the lottery, Rowenna would carry on working in data protection, because its intersection of technology, social order and human rights is just too interesting to miss out on, even if it is a bit of an uphill struggle most of the time.

Source: “Revealed: Metropolitan police shared sensitive data about crime victims with Facebook”, The Guardian



Key points:

02:34 (Stef) “Part of me was concerned that by publicising this there would be a backlash, or that if this got out, people who need care and help, people who’ve suffered an alleged sexual assault, would not feel comfortable using these sensitive sites and therefore not get the care and help they require.”

05:41 (Rowenna) “Obviously the third parties to which the data was going were a matter of serious concern. The usual suspects were there: Meta, Google… One might say, what’s the harm? But actually, because of the way that programmatic advertising works, with keywords and inferences, it’s completely realistic to expect that, as a result, people might start seeing ads for crisis support at the most benign end, but also ads for things like targeted anti-pregnancy-termination messaging, which we know is being paid for and targeted by certain groups, stuff relating to an immensely traumatic experience that is absolutely nobody else’s business, just for clicks.”

07:19 (Rowenna) “It was a bit of an ethical dilemma because on the one hand, one doesn’t want to put people off from reporting. Underreporting is already a massive problem but on the other hand, I felt very strongly that it wasn’t right to leave people in the dark and let them continue being exposed to significant risks without knowing. They should be able to make an informed choice.”

10:30 (Stef) “Firstly, I anticipated that once we went to the regulatory authority this would immediately result in a cease and desist order. I haven’t had confirmation one way or the other whether that has happened. Secondly, we went to the media, and the story that ran was predominantly about Meta and the Met, maybe because those are the two areas they chose to focus on; I’m not a journalist so I don’t fully understand that. The third is that we went to local MPs and local police and crime commissioners… The thing that we haven’t had yet: from the off I’ve said that we’ll explain our concerns to the people involved, but we haven’t had anybody come to us, and that is where we have lost this degree of self-policing. Ideally I would have hoped that the regulator would have solved this; I would have hoped that the person in charge of the websites would have solved this.”

12:22 (Stef) “One of the responses we got back from my MP is that no personal data was involved in this. We’ve had to go back and say, well, have a look at that case, have a look at this case; we’ve provided a whole list of reference information. And that’s what I find frustrating: we still haven’t got confirmation that this has been switched off. We still have some concerns with the website.”

14:15 (Rowenna) “I think it would be over-simplistic to say that someone hasn’t done their job; I think the problem is deeper than that. I think the problem is that it doesn’t seem to be anyone’s job. Organisations have policies that say it’s everyone’s responsibility to do data protection, but then executive management don’t get hands-on with strategy and risk appetite, middle management don’t get hands-on with monitoring policy adherence and the feedback loops that show whether the policy is working, and frontline staff get a long, boring document to read on day one of their employment and are then basically chucked in at the deep end. What usually happens is the DPO ends up doing the work of 12 people on the salary of half a person. They are battling with enormous knowledge gaps.”

17:32 (Stef) “It’s about caring. You’ve got to care that there is an individual here. People forgive and forget; the internet does neither. So by people going on this site, this data is being collected… you could call it captured, you could call it harvested. Data points are being collected regarding an individual, and that data is now going to reside somewhere. If you are not aware that you are collecting the data, you are not aware that you need to delete the data. You’ve got no retention policy, you’ve got no understanding of where it’s going to have an impact. And that impact might not be today, it might not be tomorrow, but when you get machine learning applied to those data points, a data point is potentially going to be used in an algorithm that somebody doesn’t understand, to impact somebody going forward. That person has lost agency over data they didn’t even realise was being collected.”

20:12 (Rowenna) “There is a real tendency in organisations to assume that because they are all nice people doing nice things, everything they are doing must be good and nice. They don’t understand that they are actually playing with the most dangerous stuff that humanity has ever invented, and that they have to make a conscious effort to direct their actions towards being safe and respectful, because if they don’t make that conscious effort, the outcomes won’t be those things. That hasn’t really sunk in yet.”

24:35 (Stef) “The majority of websites will have considered using Google Analytics and Google Ads, they will have considered using the Facebook Pixel, they’ll be looking at TikTok, they’ll be looking at these organisations and saying, we can get enhanced reporting from this service. But they often don’t think: what are we exchanging for this additional information and value?”
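Editor’s note: for readers unfamiliar with how these embeds behave, the sketch below is purely illustrative TypeScript, not the actual tool discussed in the interview and not any vendor’s real code; the endpoint and field names are hypothetical. It simply shows the kind of page context a generic third-party tag typically collects and transmits on every page load, which is why the URL and title of a crime-reporting page can end up with an ad platform.

```typescript
// Illustrative sketch only: a generic third-party tag, not any vendor's real code.
// The collection endpoint and field names are hypothetical.

interface PageHit {
  url: string;       // full page URL, e.g. ".../report-a-sexual-offence", reveals context
  title: string;     // the document title often names the service being used
  referrer: string;  // where the visitor came from
  ts: number;        // timestamp of the visit
}

// Gather page context from standard browser APIs, as embedded tags typically do.
function collectPageHit(): PageHit {
  return {
    url: window.location.href,
    title: document.title,
    referrer: document.referrer,
    ts: Date.now(),
  };
}

// A typical tag fires on every page load, regardless of how sensitive the page is.
function sendToAdPlatform(hit: PageHit): void {
  // sendBeacon posts the payload even if the user navigates away immediately.
  navigator.sendBeacon("https://collect.example-adplatform.test/hit", JSON.stringify(hit));
}

sendToAdPlatform(collectPageHit());
```

Nothing in the sketch knows or cares that the page is sensitive; that is the point Stef and Rowenna make, as the tag has no notion of context unless the site operator stops to think about it.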

25:36 (Rowenna) “If you want to understand the privacy implications of ad tech, don’t look at the vendors’ privacy information, it’s worthless; look at their marketing and sales material, what they say they can do, and their integration guides, because that is where the truth is.”

26:52 (Stef) “This technology, if used responsibly and transparently, might suit, for instance, a fashion retail brand. But if you are working on a fashion retail brand and then move on to a service that, for example, enables people to report sexual offences, don’t use the same technology. There is an awareness piece that needs to happen.”

Editor’s note: Rowenna is pronounced Ro-enn-a not Ro-wee-na.