Ethics and compliance programs have more data available to them than ever before. Capturing data on program activity and impact can be a powerful way to assess whether your ethics and compliance program works in practice. Many companies also find value in benchmarking their programs against industry norms and emerging risks. That said, because data is still a nascent tool in the ethics and compliance space, many of us are still working out what to do with all this data and how to interpret what it is telling us.
A standard practice from other global privacy laws has arrived in the US: privacy risk or data protection assessments (commonly known as privacy impact assessments, or PIAs) are now required for certain types of personal data processing under all of the new state privacy laws except Utah's. Beyond regulatory compliance, a comprehensive and integrated assessment program embeds privacy by design into your organization's data strategy and enables you to manage risk at scale.
Regulators worldwide are getting serious about children's privacy. From the "children's codes" established in jurisdictions like the UK and California, to the recent enforcement actions against Instagram and TikTok, it is becoming increasingly clear that the web will have to change so that children are better protected.
Much of the work of security professionals focuses on securing an organisation’s perimeter and keeping malicious actors out. But one of the most significant threats to privacy and security is accidental disclosure of data by employees.
Day to day, the work of privacy and security professionals looks quite different. But the two disciplines have a lot in common, and there are areas where working in silos does not best serve the interests of employees, organisations or even users.
We all know IT: the laptops, phones, applications, and cloud services we use at work and at home to manage information. The systems that run industrial operations – electric utilities, gas pipelines, water systems, and manufacturing plants – are Operational Technology (OT).
The role of Chief Privacy Officer (CPO) has existed since the early 1990s, but CPOs have become increasingly commonplace as companies use more personal data in increasingly innovative—and sometimes risky—ways.
Organisations find themselves working with an ever-larger network of third-party companies. And much of the work of managing these third parties falls to privacy professionals: from preventing data breaches to drawing up data processing agreements and facilitating international data transfers.
The Schrems II ruling affected many companies' ability to transfer personal data from the EU to the US. Meta has repeatedly stated that an order to stop transfers could force the company to stop offering Facebook and Instagram services in Europe.
Data subjects are becoming increasingly aware of their rights under the GDPR, from the right of access (Article 15) through to automated individual decision-making (Article 22). The challenge for data controllers is how to continue meeting the increasing volume of such requests.
Personal data can be an organization's most valuable, but also riskiest, type of data. It is governed by an ever-evolving regulatory landscape, as reflected in the complexity of managing cross-border data transfers: most recently, the Schrems II case highlighted the direct conflict between US surveillance law and EU data protection.
Data protection practitioners in the UK have had a rocky couple of years: from Dominic Cummings' 2018 comments on "binning" the "idiotic" GDPR, to the TIGRR Report, the DCMS consultation, and the now possibly binned Data Protection and Digital Information Bill.