Showing that the collection of data through the NHS app is ethical and transparent is crucial to its success this winter, writes Evelina Georgieva

Trust is the foundation of all good relationships. When we trust people, we are happy to share things with them and we are more likely to do what they ask of us.

In the case of building a successful contact-tracing app for Covid-19 – something which academics at the University of Oxford in the UK say will only be effective if at least 60% of a population is using it – this trust could be the difference between life and death for many people – and businesses – this coming winter.

The UK contact-tracing app has been dogged by more criticism than most since its inception and much of the negativity – at least in the earlier stages – stemmed from concerns about data privacy. However, while the old adage says “you never get a second chance to make a first impression”, I firmly believe that when the alternative could be national lockdowns, loss of businesses and an overwhelmed healthcare system, the UK public will be willing to offer the new NHS Covid-19 app a second chance to prove its trustworthiness.

A potted history of data privacy disasters

While this article also explores South Korea’s more privacy-invasive approach to Covid-19 track and trace, in the UK, GDPR and cultural expectations mean that a transparent approach to data privacy is a central requirement for building trust in digital applications.

Unfortunately for the app’s developers, its first iteration was based around a centralised approach to collecting data, with information to be stored on an NHS database. This led to a joint statement from 170 scientists opposing the British app’s design.

Privacy experts also warned about “mission creep”, where the original purpose of the data collection can change, and there were accusations from some media that the UK was sleepwalking into a data privacy crisis.

There was further controversy when commentators suggested that the UK Government’s entire Test and Trace programme was unlawful because the Department of Health and Social Care (DHSC) failed in its legal obligation to complete a Data Protection Impact Assessment (DPIA).

While the average person might not understand the complexities of data privacy law, these stories were covered by major UK media like the BBC, The Guardian and the Daily Mail. This kind of storm sends a strong signal to the public that “something isn’t right” and is likely to have significantly undermined public trust in an app that has so far been downloaded by only around 25% of the population of England and Wales.

Data privacy should be foundational – not an afterthought

Speaking as the co-founder of Pryv, a company that leads the way in personal data and privacy management software, I’m passionate about the importance of building digital health solutions on a solid data privacy foundation that creates trust between the software and the user.

Common reasons for software developers failing to meet privacy regulations in their applications are a desire to get to market before competitors and a lack of legal understanding.

In the case of Covid-19, the desire to publish the application as quickly as possible was, understandably, heightened due to the enormous health impact of the infection.

However, as we have seen, failing to embed solid data privacy foundations from the beginning can lead to scrutiny and criticism that undermines trust further down the road – and it really doesn’t have to slow things down at all either. Technology solutions like Pryv provide ready-to-go middleware on which to build digital health applications. This gives developers peace of mind that the tools they create will meet all data privacy regulations and means that software can be built on these foundations in record time. In our case we have even introduced an open source solution alongside our enterprise-class version, meaning this technology is more accessible than ever.

The issues with the NHS Covid-19 app could also have been caused by a failure to involve legal experts. Since the adoption of GDPR, the technology industry has expected its developers to embed privacy solutions into software as standard.

However, studies have shown that, understandably, this is not an area where developers feel confident. We don’t expect lawyers to be able to implement technical solutions, so how can we expect developers to be legal experts? In reality, the best results happen when these subject matter experts – lawyers and developers – work together to build digital solutions that are both legally compliant and technically feasible. That is how we approach our work at Pryv and we would expect NHSX and the DHSC to do the same.

A widespread issue in digital health

Although the NHS Covid-19 app is a very public example, it’s worth noting that, unfortunately, data privacy issues are common in the health application space in general. Only last year the British Medical Journal wrote that many health apps “pose unprecedented risk to consumers’ privacy” and accused developers of a lack of transparency when it comes to data sharing.

We believe that this is because many healthtech companies approach the issue of data privacy and consent from the wrong angle. While gaining user consent is a legal requirement for businesses that want to collect, use and share their customers’ data, treating it as an inconvenient necessity removes the chance for companies to use this touch point as a way to build trust.

To achieve this, there needs to be a clear Personal Data Lifecycle that meets regulatory requirements, handles users’ data requests and can provide proof that users’ rights and consent have been executed correctly.

Consent must be freely given, meaning customers should clearly understand the benefit of sharing their data and how it will be used and stored. Why should they give their data away without understanding the benefits – and the risks? Consent should also be dynamic, meaning that customers are free to revoke it at any time as part of a two-way conversation between users and data collectors.
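To illustrate the idea of dynamic consent described above – a minimal, hypothetical sketch, not Pryv’s actual software or any real consent API – a consent record can be modelled as an append-only history of grant and revoke events, so that the current status and the proof trail both fall out of the same data:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentEvent:
    """One entry in the consent audit trail."""
    action: str          # "granted" or "revoked"
    timestamp: datetime

@dataclass
class ConsentRecord:
    """Tracks one user's consent for one data-sharing purpose."""
    user_id: str
    purpose: str
    history: list = field(default_factory=list)

    def grant(self) -> None:
        self.history.append(ConsentEvent("granted", datetime.now(timezone.utc)))

    def revoke(self) -> None:
        # Revocation never deletes history: the trail remains as proof
        # that the user's rights were executed correctly.
        self.history.append(ConsentEvent("revoked", datetime.now(timezone.utc)))

    def is_active(self) -> bool:
        """Consent is valid only if the most recent event is a grant."""
        return bool(self.history) and self.history[-1].action == "granted"

# A user grants consent for a stated purpose, then later revokes it.
record = ConsentRecord(user_id="user-123", purpose="contact-tracing")
record.grant()
print(record.is_active())   # True
record.revoke()
print(record.is_active())   # False
```

The design choice worth noting is the append-only history: because revocation adds an event rather than deleting the grant, the same structure answers both “may we process this data now?” and “can we prove how consent was handled?”.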

A crucial time for the NHS Covid-19 app

As we head into autumn and winter there is a real risk that, with the inevitable surge of Covid-19 cases set to occur alongside increases in other seasonal conditions like flu, healthcare providers could be overwhelmed. The fact that this situation is set to arise just as experts warn of pandemic fatigue, with fewer people supportive of lockdown measures, and alongside big drops in profits for UK businesses that could cause some companies to go under, makes this an even more crucial time.

While South Korea has brought its number of new Covid-19 cases under control through measures including a contact tracing system that relies on surveillance of the infected person’s phone and bank statements, it’s clear that its approach would never be tolerated by European populations. In Europe, the direction of travel in the data privacy space is towards openness, and so the developers of the NHS Covid-19 app must bear this in mind if they are to win trust through an ethical, transparent and honest data privacy approach.

If the UK can increase trust in the NHS Covid-19 app and secure wide adoption across its population, then it has the potential to provide the protection that will enable the country to avoid national lockdowns, allow businesses to keep going and protect the capacity of the healthcare system. Those are the kinds of outcomes its population wants.

If it can prove that its approach to data privacy is ethical and transparent while also demonstrating that widespread use of the app could prevent some of the more draconian measures currently being explored, I think the UK public might just give the app the second chance it needs.


Evelina Georgieva is the co-founder of Pryv