We are very happy to announce that Data Protection Commissioner Emma Martins will speak at PrivSec Global, coming soon.


Emma Martins is Data Protection Commissioner at the Office of the Data Protection Authority.

Streaming live on November 29 and 30, PrivSec Global unites experts from Privacy and Security, providing a forum where professionals across both fields can listen, learn and debate the central role that privacy, security and GRC play in business today.

Before taking on her current role, Emma was Information Commissioner for the Channel Islands.

Emma appears exclusively at PrivSec Global to discuss emerging technologies and the challenges societies face as we embrace facial recognition tools.

Below, Emma answers questions on her professional journey and the themes of her PrivSec Global session.

  • Technology: The problem with facial recognition regulation – Day 2, Thursday 30 November, 10:00 – 10:45am GMT



Could you briefly outline your career pathway so far?

I started my data protection life many years ago as a DPO in the public sector. From there I went to work in the regulator's office and have worked in data protection regulation ever since. Since 2018 I have been Commissioner for the Bailiwick of Guernsey, and my fixed-term contract ends at the end of this year.

What are the primary societal risks posed by facial recognition technology?

Any biometric data needs to be considered through a different lens because it is fundamentally more than data; it is ‘us’. We cannot change it as we would a password, and we cannot keep it from the public gaze.

The Clearview AI exposé (Kashmir Hill) illustrates all too graphically how baked-in racial and gender biases can have enormous impacts on those who are often least able to protest or stand up for their rights. Harms done are often impossible to undo. As with all technologies, we need to ask the important questions before they are rolled out. 

Too often we are trying to retro-fit data protection, governance and ethics into systems long after they have become embedded into many different areas of our lives. The trajectory is not set but if we do not engage, and if we do not stand up for governance and ethical standards, then it will be set by others. The power balance is not in our favour, so we need to unite as data protection professionals to create a trajectory which values human rights and protections, dignity, and autonomy.

What are the primary challenges facing lawmakers as we attempt to regulate the technology and mitigate these risks?

It has always been the case that laws largely lag behind. That is understandable because laws are often about risk/harm prevention and those risks and harms often have to manifest themselves in order for the state to react. 

In the past, developments have occurred over years and decades, and laws have had the time to respond. The speed of change in the fourth industrial revolution means that law is having a hard time keeping up. It points to the need for something more than law. Law is a safety net, but if we build a culture in which citizens and consumers demand the protection of their personal data, and companies understand that responding to that demand builds trust and confidence, we can aim for a virtuous circle.

We are not there yet and we do need robust laws in place, but we also need to engage the whole community. Technologies can do great things and make our lives better and safer in many ways; but they also have the potential for real harms, especially to those who have the least power in society. 

Those of us working in this field need to focus both on ensuring a strong and effective legal framework and on building an enlightened and empowered community. Where those two things combine is where real change can happen. We must stop looking at this as though it is a question only for big tech or the regulators. It is a question for each and every one of us. The risks and the harms are real, and we cannot afford to wait.

Don’t miss Emma Martins debating these issues in depth in the PrivSec Global panel: Technology: The problem with facial recognition regulation.

Facial recognition technology is now commonly used on personal devices and to aid criminal investigations, and it is one of the most advanced technologies of recent years.

However, under the European Parliament's recently passed version of the Artificial Intelligence Act, facial recognition remains classed among artificial intelligence's riskiest applications.

Because facial recognition tools are riddled with biases, lawmakers are determined to head off any risk of mass surveillance. Maybe the solution lies not in banning facial recognition or any one type of biometric, but in regulating how we consent to the sharing and tracking of identifying data.

Get up to speed on the key issues, only at PrivSec Global.

Also on the panel:


  • Session: Technology: The problem with facial recognition regulation
  • Time: 10:00 – 10:45am GMT
  • Date: Day 2, Thursday 30 November 2023

Click here to book your place at PrivSec Global today

Discover more at PrivSec Global

As regulation gets stricter – and data and tech become more crucial – it’s increasingly clear that the skills required in each of these areas are not only connected, but inseparable.

Exclusively at PrivSec Global on 29 & 30 November 2023, industry leaders, academics and subject-matter experts unite to explore these skills and the central role they play within privacy, security and GRC.

Click here to register
