Across Wednesday 16th and Thursday 17th November, over 300 subject-matter experts will share their insights through panel discussions, keynotes and breakout sessions, giving #RISK London attendees an exclusive deep-dive into the big issues impacting risk today.
#RISK London speaker Michelle Griffey is Chief Risk Officer at Communisis, a role that sees her responsible for Operational Risk, Business Continuity Governance, Data Protection and Information Security Governance.
With a background in Financial Services, Michelle uses her experience to bring a practical and pragmatic approach to risk management, balancing the needs of highly regulated clients with the practicalities of business growth.
Michelle appears on a panel at #RISK London to explore privacy supply chain risks and how organisations can get the best out of collaborating with data sub-processors.
We caught up with Michelle to get an introduction into this important theme and to learn about her professional journey so far.
Could you outline your career pathway to date?
I began my career with Lloyds Banking Group, where I worked in a multitude of frontline customer management roles. I also worked in group risk at Lloyds, covering product and policy governance, before moving into Assurance and change governance. In my current role, I built the enterprise-wide risk framework for Communisis, and I also look after data protection as the Data Protection Officer.
Could you give more detail on how risk goes up as the number of data processors increases?
If you have one data processor then you have your pool of data or sensitive personal data sitting in one place. You can wrap your arms around it, wrap your tools around it and understand what’s going on with it.
The more data processors you put in place, the more your visibility over that data diminishes, and the less you know about what's going on with it, unless you have strong oversight of the sub-processors.
You can put audits in place and you can execute compliance checks but with the best will in the world, they’re not necessarily going to be ongoing at all times, in all places. Therefore, the data landscape that you’re looking at becomes more detached.
What practices should organisations be implementing as they bid to maintain visibility over data handled by sub-processors?
You need good supplier assurance routines when you onboard suppliers, to make sure they have the right tools to protect the data and keep it safe and secure.
When you are using sub-processors for personal data, make sure that these assurances are ongoing wherever possible. It’s critical to ensure that sub-processors have the correct tools in place, but it’s equally important to have established good relationships. A healthy, positive relationship should engender trust.
If something goes wrong, or if a party suspects a gap exists, the sub-processor is far more likely to be proactive in sorting the issue out. Conversely, a poor relationship is likely to result in people keeping things to themselves if issues arise.
Are there cultural or structural changes that organisations should make in order to improve risk mitigation in the long term?
I think there are structural changes that need to be made, and combining historically separate teams can be particularly effective. In our organisation, InfoSec governance, data protection, operational risk and business continuity all sit under the umbrella of a risk team.
You get a lot of insight where crossovers happen and people work together. You also reduce the risk of different teams sending duplicate questions to people working in the same areas, which can be a real annoyance for workers elsewhere in your organisation.
Incident Management, for example, fits into all of these areas, but if you can use the insights you get where structures overlap, then that’s really good and cuts down on duplicated processes.
From a cultural perspective, it's more a case of winning hearts and minds. Risk and data privacy have traditionally been seen as something for compliance or risk teams to take care of, so other people can miss their importance. If you get people to realise how much these issues apply to them in the real world, you'll get more of them on board.
We could be talking about data on how you spend your money, what you do on the weekends, how much you get paid, etc. By framing it this way, people start to understand what data privacy means in reality, and this is crucial to getting good cultures in place.