The UK Online Safety Bill may have “implications for freedom of expression and privacy” because private messaging could fall within the scope of the regulatory framework, a digital rights group has warned.

On 15 December, the UK government published its full response to the Online Harms White Paper consultation ahead of the publication of the Online Safety Bill next year. The bill in its current form would represent a divergence from EU law, which prohibits general monitoring. The new bill aims to establish a “duty of care” for online platforms to remove illegal and harmful content.

The Bill has been criticised by privacy advocates for including interpersonal communications as well as closed social media groups within the scope of the Online Harms framework. Ofcom, which has been appointed as the regulator, will decide how companies can fulfil their “duty of care” in codes of practice.

Open Rights Group has criticised the inclusion of private messaging in the regulatory framework, arguing that it undermines users’ rights to end-to-end encryption and privacy.

“If encryption is used to protect private messaging, then companies may need to show that illegal content is nevertheless being dealt with,” says Open Rights Group. “Thus, encryption and privacy are set as a privilege dependent on wider corporate policies rather than something we are entitled to.”

In contrast, earlier this month the UK’s Children’s Commissioner warned in her new report that end-to-end encryption will put children at risk of grooming and abuse, and prevent the sharing of reports with law enforcement.

“End-to-end encryption makes it impossible for the platform itself to read the contents of messages, and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse,” Commissioner Anne Longfield said.

In its published response, the UK government says that “highly accurate” automated detection systems will be used to identify and remove illegal content.

It says: “…the new regulatory framework would increase the responsibility of online services in a way that is compatible with the European Union’s e-Commerce Directive, which limits their liability for illegal content until they have knowledge of its existence and have failed to remove it from their services in good time.”

Ofcom will also be given greater powers, as outlined in the White Paper: “The regulator will also be given an additional express power in legislation, to require a company to use that technology to identify and remove illegal terrorist content from their public services where this is the only effective, proportionate and necessary action accurate at identifying only illegal content to minimise the need for human review of legal content.”

The UK government also adds that due to the accuracy of the technology, it will be “unlikely to identify illegal content that does not constitute an offence relating to terrorism.”

In contrast, the EU’s own new law, the Digital Services Act, which aims to regulate the obligations of digital services that act as intermediaries, maintains the prohibition of general monitoring.