Blog:

Equality and diversity in tech

Published: 1 March 2021

This blog was written by our commissioner Emma Martins as a contribution to Business Brief's 'female leadership' issue in March 2021.

"I am an avid collector of books, as anyone who knows me will attest. I love reading and wish I had more time to bury myself into the pages of the many publications that sit on my shelves. It is of course far from being scientific, but it has often struck me how much certain genres of books reflect broader society. What I mean by that is when you look at the sports or business section in a bookshop for example, how many of the authors/subjects are male, female, white, black etc.? Does that say anything more broadly about our lives and our society?

I work in the field of data protection and I am occasionally prompted to reflect on the opportunities that I have had as a woman and on the professionals I am now surrounded by. I think the fact that such reflections are only occasional speaks volumes. I feel extraordinarily fortunate to work in a profession that is unusually diverse. I don't say that just because the books about data, AI and the like that I have on my own shelves are written by men and women from all kinds of different ethnic and cultural backgrounds; I also say it because every day I can see that equality in action. Some of the most powerful and influential data regulators are women (the UK and Irish Commissioners, for example). It's not something that is foremost in my mind until questions about equality arise. But when those questions are posed, it is interesting to explore them a little deeper and think a little more about why this environment seems to be much better served in terms of representation.

I don’t pretend to have any profound observations beyond my own personal experience, but one thing that strikes me about the world of data is the way it has evolved in recent years. I think this can help us, at least in part, start to unpick some of the reasons why diversity in this area is impressive.

In this digital era, technology has become a fundamental part of our lives. It has transformed the way we work, play and communicate, and at its best, big data and AI can improve and even save lives. This technology allows us to innovate for the future in new and exciting ways. It also provides a platform for malign uses, so first and foremost we must focus on innovating and using technology with human values sitting right at the heart. We need to become more intelligent about our understanding of ‘progress’: we must move on from asking what technology CAN do to asking what technology SHOULD do.

Innovation is something to be celebrated but it must never be seen as something which happens at all costs. The rapid speed of technological change in our modern world offers extraordinary potential and poses enormous risks.

Let’s think about other areas of our lives, for example air travel and vaccines. Safety built in from the outset, independent regulatory oversight, accountable and transparent processes: these are all accepted as entirely normal and indeed essential, and there is uproar when any one of these elements does not function properly. It is therefore perplexing that when we start to talk about data and data harms, this acceptance of, and desire for, clear harm reduction strategies and controls is notable by its absence.

We have recently started to better understand the reality we are facing: that our data and our attention are viewed as a natural resource to be exploited and profited from. But this is being increasingly challenged, because it is so fundamentally misaligned with the human values that so many of us believe in. That, in turn, has expanded the conversation from a small group of technologists (who themselves have a certain and specific age/race profile – remind yourself of who built the big tech platforms that now so dominate our lives and our economies) to wider society.

This manipulation and exploitation of our data has human consequences and human impacts; we therefore need human answers – answers that allow outcomes to truly reflect the values and cultures of those who will be affected. It is simply a matter of fact that the activities of the data economy affect us all, regardless of gender, race, age or background.

Technology and innovation have in the past been the almost exclusive domain of a handful of people. That has had profound and far-reaching consequences for many areas of our lives. I do not think that those individuals necessarily set out to do harm, but the fact that something is not intentional does not mean it is unforeseeable. Mistakes have been made, but wringing our hands in despair will not help. Mistakes are forgivable; not learning from them is not. Now is the time to look forward with energy and focus, having learned lessons from the past. Having an increasingly strong representation of women in this area is something to be celebrated, but it is only a part of the picture. The duty to realign the goals of the data economy from being exploitative to being ethical is ours individually and collectively. We are all involved, and we all have a voice, but there are still too many voices that are not heard.

Those of us with a voice must use it to speak both for ourselves and for those who struggle to be heard. There is much good work being done and we are seeing really positive changes. We need to celebrate those changes but never stop fighting for more, because doing so will ensure that our future, and that of our children, has equality, diversity, fairness and morality as its foundation stones."