Transcript:

Nick James:

Hello, I’m Nick James, Founder of the #RISK series of events, which starts in London in November 2022. I’m delighted to be joined today by Steve Wright. Steve is one of the UK’s leading authorities on data protection, privacy and security. After a career spanning 30 years at organizations such as Unilever, the Bank of England, PwC and Deloitte, he set up his own business, Privacy Culture, three years ago. Steve, welcome.

Steve Wright:

Lovely to be here, Nick. Thank you.

Nick James:

Really, really good to see you again. My first question, actually, is about why you decided to call the business Privacy Culture.

Steve Wright:

Yeah, it’s a good question. It’s quite simple, really. It was born out of frustration. What I realized when I was Data Protection Officer and Information Security Officer for the John Lewis Partnership was that I spent a lot of money, a lot of time and a lot of effort educating and training people. And what was a real struggle back then, and still is actually, was: how do I measure what good looks like?

Steve Wright:

So I started to work on a kind of metrics framework, and that framework developed. Then, when I started Privacy Culture, I put more and more time and effort into it and worked with a university to devise a way of measuring people’s attitudes and behaviors, because what I felt was that the risk was predominantly us. We were the weak link in all of the strategies I was putting through for Unilever, John Lewis and the Bank of England. It was all about how the people aspect could let down all this great stuff, all this money we were spending on technology.

Steve Wright:

So I came to the conclusion that it was about culture, and therefore I set up Privacy Culture with the sole aim of measuring and benchmarking, and then training people to think differently and act differently. The name Privacy Culture kind of came to me one summer morning, and I just thought, yeah, that’s it. That’s actually what it’s all about. What I want to do is help organizations really embed a culture of privacy. So that’s why it’s called Privacy Culture.

Nick James:

Steve, thank you. So it does what it says on the tin, then. I wanted to turn to environmental, social and governance, ESG. Obviously it has catapulted up the corporate agenda in recent years. Where do you see the role of privacy in ESG?

Steve Wright:

It’s a great question, Nick, because I think we’re going to see more thought leadership in this space. At the moment, it’s fair to say, there’s a bit of a lack of it. We’ve got this delta, this void, between what’s out there already and what will need to be done. So I’m pleased to let you know that we’ve actually launched a little project, a mini project, to look at this and start to do some of that mapping.

Steve Wright:

What I can tell you is that the easiest example is retention and deletion. Now, we all know that’s part of the regulations: you shouldn’t keep data longer than you need it. Obviously it has to be processed on the correct lawful basis, but also, is it appropriate? Are you holding or retaining the data for longer than necessary for the purpose of processing? That question speaks directly to retention and deletion guidelines, which are a big challenge for people in cybersecurity, in IT risk and in privacy. And I think the ESG aspect of retention and deletion can then be translated back to the fact that we’re buying all this extra space because, as we know, disk storage is rock bottom, it’s really, really cheap.

Steve Wright:

What we need to be doing is thinking about what that disk storage space, that capacity, is costing the environment. In other words, if I take the E, the environment side of it: what is the CO2, the carbon footprint, of retention and deletion? So it’s not just about the legal aspects of privacy, security and technology per se; it’s actually about the footprint. And you can translate that into many other areas of, if you like, your corporate responsibility, which some people say is what ESG is all about.
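[Editor’s note: a back-of-the-envelope sketch, in Python, of the carbon cost Steve describes for over-retained data. Every figure below is an illustrative assumption, not a measured value.]

    # Rough estimate of the annual footprint of data held past its retention period.
    # All constants are illustrative assumptions.
    TB_RETAINED_BEYOND_PURPOSE = 500   # assumed: terabytes kept longer than the purpose requires
    REPLICATION_FACTOR = 3             # assumed: backups and redundant copies multiply the storage
    KWH_PER_TB_YEAR = 7.0              # assumed: energy to keep one terabyte stored for a year
    KG_CO2_PER_KWH = 0.2               # assumed: grid carbon intensity, kg CO2e per kWh

    annual_kwh = TB_RETAINED_BEYOND_PURPOSE * REPLICATION_FACTOR * KWH_PER_TB_YEAR
    annual_co2_kg = annual_kwh * KG_CO2_PER_KWH

    print(f"Energy: {annual_kwh:,.0f} kWh per year")         # 10,500 kWh per year
    print(f"Carbon: {annual_co2_kg:,.0f} kg CO2e per year")  # 2,100 kg CO2e per year

Deleting what the retention schedule says is no longer needed shrinks every term in that multiplication, which is the link between the E of ESG and the storage limitation principle.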

Steve Wright:

In terms of rights, if you think about the social aspect of ESG: what are the rights that need to be upheld? We think about risk, we think about trust, about the relationship between what we do and what we say we’re going to do as an organization, and then how that translates into some of those rights. So you can start to see the relationship, tenuous as it might sound, between the environmental, the social, and then obviously the governance side of it, especially around data: how do we ensure the data management processes are sound, who has access to the data, where the data is, how we safeguard it from a governance perspective? And how do we give assurance to customers and to our employees that we’ve got the right controls and mechanisms in place to ensure that governance?

Steve Wright:

That probably also touches on your next question, Nick, which is around data ethics. So you can see from that very short example that ESG can cover quite a spectrum of specific articles and requirements in both the UK Data Protection Act and the GDPR, as well as a whole plethora of other privacy laws around the world. At the same time, it can also be mapped to security standards such as the ISO 27000 series and the NIST framework. So we’ve started a project to look at that mapping, and we’ve got one mother of an Excel spreadsheet currently being formulated, and we’re working with some leading academics and people in the industry to pull this together and find out what the direct relationship is between ESG and DPOs, privacy and cybersecurity.
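[Editor’s note: a small Python sketch of the kind of cross-reference Steve’s spreadsheet project is building. The rows are purely illustrative examples, not Privacy Culture’s actual mapping.]

    # One row per ESG theme, cross-referenced to privacy law and security standards.
    # The references chosen here are examples only, not an authoritative mapping.
    esg_mapping = {
        "E: storage footprint of over-retained data": {
            "GDPR": ["Art. 5(1)(e), storage limitation"],
            "Security standards": ["ISO/IEC 27001 records retention controls"],
        },
        "S: upholding data subject rights": {
            "GDPR": ["Arts. 12-22, transparency and data subject rights"],
            "Security standards": ["ISO/IEC 27701 privacy information management"],
        },
        "G: data access and safeguarding": {
            "GDPR": ["Art. 5(1)(f), integrity and confidentiality",
                     "Art. 32, security of processing"],
            "Security standards": ["NIST Cybersecurity Framework, Protect function"],
        },
    }

    for theme, refs in esg_mapping.items():
        print(theme)
        for framework, items in refs.items():
            print(f"  {framework}: {'; '.join(items)}")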

Nick James:

Wow. Steve, not only a comprehensive answer to the question, but you anticipated my next one. Ethics, as we touched on, is obviously a major consideration in the world of GRC. I wanted to ask you a particular question about the implications of bias in artificial intelligence, AI, and what should or can be done about it.

Steve Wright:

Well, again, it’s a big question and… well, where to start with this one, Nick? Because it’s an inherent risk when you’ve got people creating and designing systems and programs: those biases are in us. Again, this goes back to culture. The environment we’re brought up in, the conditions we’re subject to as we grow and evolve, can sometimes formulate those biases, and this translates quite readily into certain AI and ML. And it’s unfortunate, but it’s also fortunate: some good things can actually come out of it. So I don’t think it’s necessarily a negative thing, but there is definitely an argument for having an appropriate ethics framework when it comes to this.

Steve Wright:

Well, put it this way. We created our own ML project here at Privacy Culture. We’re taking all that data, but we set the ethical framework around what was deemed to be creepy or spooky or unethical. We’ve applied that, and you can create checks to make sure you’re not going over those red lines, if you like, or sailing too close to the wind. But unfortunately, bias in AI systems, because they are programmed by humans, will always be there. It’s just about making sure a good ethical framework is in place so you can check for it. And that’s not complicated. It might sound it, but it’s actually quite straightforward. I think the important thing here, which you’ve alluded to, is that it’s all about risk, and it’s just about having appropriate ethical procedures in place to make sure you don’t overstep that mark and you don’t go outside those risk boundaries.
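[Editor’s note: a minimal Python sketch of the kind of automated “red line” check Steve describes. The metric names and thresholds are hypothetical, not Privacy Culture’s actual framework.]

    from dataclasses import dataclass

    @dataclass
    class RedLine:
        """A predefined ethical boundary: a metric and its maximum acceptable value."""
        name: str
        threshold: float

    # Hypothetical red lines an ethics framework might agree up front.
    RED_LINES = [
        RedLine("selection_rate_gap_across_groups", 0.10),
        RedLine("share_of_decisions_without_human_review", 0.05),
    ]

    def crossed_red_lines(metrics):
        """Return the names of any red lines the measured metrics exceed."""
        return [r.name for r in RED_LINES if metrics.get(r.name, 0.0) > r.threshold]

    # Example: measurements from a model evaluation run (values made up).
    measured = {
        "selection_rate_gap_across_groups": 0.14,         # crosses the 0.10 line
        "share_of_decisions_without_human_review": 0.02,  # within bounds
    }

    violations = crossed_red_lines(measured)
    if violations:
        print("Red lines crossed:", ", ".join(violations))  # gate: block release, escalate
    else:
        print("Within agreed risk boundaries.")

The check itself is simple once the boundaries are agreed; the hard work is the upfront decision about where the red lines sit.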

Steve Wright:

And just a final thought on this, Nick: for me, this is very much at the heart of privacy. I’m often asked what the conflict is between being a cybersecurity expert and being a privacy expert. And the conflict, especially when I held both roles, was this one of ethics, because it’s one thing to safeguard and protect data, and quite another to think about what you’re ethically doing with the data. Is that right? Would I feel happy about that use of data, let alone its legality? Would it feel all right to me? So I think it all goes back to a question of risk and proportionality, and making sure that it’s appropriate, and obviously that it’s lawful.

Nick James:

Steve, I mean, this isn’t a question, but I guess it’s an observation on what you said. If we’re looking at things that are being built by humans, then one of the things we need to make sure of is that we have a diversity of humans creating them, so that everybody’s viewpoint is taken in at the beginning and we’re not looking at it just from a white male programmer’s point of view, which will obviously throw up biases left, right, and center. And not politically left, right, and center. No.

Steve Wright:

I couldn’t agree more, Nick. I think that’s a good observation. Diversity and inclusion is actually something that’s dear to our hearts at Privacy Culture. We’re very mindful of avoiding, like you said, that sort of stereotypical programmer, a twenty-something, perhaps male. As it happens, we have a data scientist who is male, and we also have another who is female. So we have that good balance, and it really works very, very well, because when we’re working on projects like that, we come at them from different sides of the brain. And that really comes through, especially when they’re coding.

Nick James:

Steve, our strapline for #RISK is, risk is now everyone’s business. Do you think recent events have forever changed the global risk landscape?

Steve Wright:

Oh, that’s a very good question. Well, it depends which events you’re referring to, but I don’t think recent events necessarily change my thoughts or my processes; they just affect them. And I think that’s the thing about risk: it’s inherent in all of us. Crossing the road, we do it without even thinking, but if we’re going to purchase a house or go somewhere on holiday, then events and circumstances happening now might impact that and force us to think slightly differently about it. So I don’t think it’s necessarily a new thing. We’ve been doing it for thousands of years. What’s more acute now is that we’ve got better at articulating that risk.

Steve Wright:

And you’ve got to remember, when I left university, I started as a trainee underwriter. So my first job was trying to understand the risk profiles of different organizations so that we could come up with the right premium. That hasn’t changed, and it came very naturally to me, as I’m sure it does to a lot of people. We try to do things and think about what the implications are, what the outcomes could be, good or bad, and hopefully it’s a good outcome. So I think #RISK, and in fact all of the good work that GRC World Forum and other organizations do in this space, just brings it front of mind and provides more clarity and examples of where risk is done well.

Steve Wright:

Having said that, Nick, I still work with a number of blue-chip companies, and there are things that happen in organizations: blind spots to certain risks form. So organizations can be happily thinking they’ve got the risk framework in place. And the rigidity of that risk framework is good in some respects, but actually, in our sort of entrepreneurial, startup sense of the world, you need to take a lot more risks and you need to try things. That’s how we learn: we try, we fail, we try again. But we should all have a good risk formula with which to articulate those risks, a bit like the privacy impact assessment or the security impact assessment: a pre-emptive kind of what-could-go-wrong, what-if exercise.

Steve Wright:

And I suppose my final point is just that I do believe we’ve got a lot better at articulating that, but at the same time, there is a concern of mine that we’ve become so reliant on those risk frameworks, those methodologies, that we’re missing things. It’s sort of like the unknown unknowns, if you like.

Nick James:

Let’s not get Rumsfeld on this [inaudible 00:16:41]. We’ve talked a lot about governance, risk, ethics, security, privacy, and how they are all inextricably linked. The big question, I guess, is: do you think the boards, the C-suite, understand this, and how can leaders therefore set the tone?

Steve Wright:

Yeah, I do, actually. When I was at PwC many, many moons ago, I remember talking about cybersecurity to boards. It wasn’t even called cybersecurity back then, and it was hard, really, really hard. But I think now, cybersecurity, ethics and certainly data are probably in the top five risks for most boards. The challenge, however, Nick, still remains. In fact, I read over the weekend that there’s a potential piece of regulation coming through from the SEC: they’re thinking about making some of those requirements around cyber risk mandatory for all organizations, which is great for regulated businesses but not so great for the unregulated, and very interesting. But I think we still have a bit of a gap, and that gap is specifically around ownership.

Steve Wright:

I had the pleasure of working at the John Lewis Partnership, and one of the first tasks we completed was to make the relevant data owners on the board responsible and accountable: the chief HR officer, the chief marketing officer, the chief finance officer for pension data, et cetera. And I think that’s where we’ve still got some way to go, especially in the privacy sector; less so in cybersecurity, where there is now usually a board member with responsibility for cyber.

Steve Wright:

But to answer your specific question about whether leaders can set the tone: absolutely. I think there’s much more scope for leaders of businesses to come out and say, this is our position on data, this is our position on trust, this is our position on cybersecurity, and to be bold and brave enough to say that. One example: we’ve had leaders like Alan Jope at Unilever come out and say, “We’re no longer supporting Russian businesses by X, Y, and Z.” That’s a bold, brave statement to make given the size of Unilever and of the Russian market. But he and the board obviously felt compelled to make that statement, which is a good thing. I just feel there’s not enough of that, especially around data privacy. I would like to see more of that leadership, of boards actually standing up and saying: look, we really take privacy seriously, we believe in trust, and this is what we mean by that, this is what we mean by data ethics, this is what we mean by good data practice. Yeah, that’s definitely lacking.

Steve Wright:

But I’m hopeful. I am very hopeful, Nick, because the cybersecurity industry was in this position about 10 or 12 years ago, so I’m hoping more of that will come. In my early days at PwC, when we would go and talk to boards, you could see them glaze over; they weren’t really interested. Now they’re very interested. And I hope the same will happen with data privacy and data as a whole. I think it is starting to happen, but I’d like to see more of it.

Nick James:

Steve, thank you for that. I wanted to turn to something you did at Privacy Culture: last year you launched the first Employee Privacy Culture Survey. I was wondering if you could share some of the key findings with us.

Steve Wright:

Yeah, sure. Thanks for mentioning that, Nick. It was a really exciting piece of work. We started a year earlier, in the pandemic, so the timing wasn’t great, but we managed to bring together a dozen different organizations from different sectors globally, representing thousands and thousands of employees across 54 countries. We completed the survey as a pilot across all those organizations, so we had, like I said, different sectors and different languages to overcome, and the results were startling. And the good news, well, I won’t let the cat out of the bag yet, but the first-year anniversary is coming up, so we’ll be releasing the 2022 Global Privacy Culture Survey imminently.

Steve Wright:

But talking about last year’s, yeah, there were three or four findings which really came through strongly, and you won’t be surprised by them. Top was risk management. What we specifically found among all the people who took part was that staff did not know about, or did not feel empowered around, privacy risk and privacy impact assessments, and in particular privacy by design. That’s quite crucial because, if you like, that’s what stops the water coming in in the first place. So it wasn’t embedded, it wasn’t operationalized, people weren’t trained on it. Organizations had bought tools which did what they said on the tin, but it just hadn’t landed. And it still hasn’t, I have to say.

Steve Wright:

The second, I think, was around training, and just the lack of it. The biggest finding there was that it was inconsistent. It was bland and boring, and it was literally one-size-fits-all, so people were just doing it as a compliance exercise. The next, which I alluded to on the ESG question, was around retention and deletion. Big problems around that, naturally, because retention and deletion are particularly hard when you’ve got legacy systems, some of which are unsupported or have come out of contract, but you’re still using them or holding onto that data in various different forms.

Steve Wright:

And I think the other one, which was less of an issue but was certainly bubbling up, was around transparency. In other words, going back to our rush to get ready for the GDPR in 2018, there were a lot of privacy notices, a lot of privacy policies, a lot of cookie activity, but what wasn’t really translating down into the business, down into the organizations, was the living and breathing of those privacy notices. Again, they weren’t operationalized. They were almost pieces of paper stuck on the website saying, yeah, we take privacy really seriously. But when it came to it, it wasn’t coming through. People didn’t feel empowered, people didn’t know where to go to find information, and data subject access requests were falling between the cracks because customer services teams hadn’t been trained or retrained, all these kinds of issues. So yeah, generally it was a fascinating survey. We’ve committed to doing it for the next 10 years, and we’re just coming up to year two.

Nick James:

Excellent. Thank you, Steve. I have one final question, and this is from a business point of view, not a personal point of view. I’m asking a lot of people this, but the question is: what keeps you awake at night?

Steve Wright:

Well, I suppose, given that I’m in business now, what keeps me awake at night is two things, really. Cashflow, because that’s a problem we all suffer from. A lot of our clients, in fact the majority of our clients, are big global organizations, and they take a decade to pay. So cashflow is a real challenge: what’s 30 or 60 days for them is a long time for a small business.

Steve Wright:

But actually the other one, which really does concern me more, is about where we’re heading in terms of our society, not just in the UK but globally. I’ve got young adults now, and there’s this big disjoint, and I think it’s brought on by this almost self-image mentality, what you look like in the persona on your phone. And I worry that our kids can’t identify with who they are and what they are as human beings. I mean, that’s a bit philosophical, I know, Nick, but that’s the thing that worries me now: that young people don’t really know who they are anymore, and that they’re all trying to be personas of something else.

Nick James:

Steve, fascinating. And I get exactly what you’re saying on that last point. And I don’t think it’s just the youngsters-

Steve Wright:

Maybe not.

Nick James:

It’s everybody. Steve, thank you so much. Let’s put something in the diary to catch up and meet in real life quite soon.

Steve Wright:

Would love that. Thanks so much, Nick, and to the whole team for everything you do down there. It’s brilliant. And we really love working with you guys and coming to your events. So thank you for bringing this to the table.

Nick James:

Fantastic. Thanks, Steve. Cheers.

#RISK Founder Nick James in conversation with Steve Wright