The Swedish Authority for Privacy Protection (IMY) has ordered the country’s police authority to pay a SEK2.5m ($297,000, €247,000) financial penalty for incorrect processing of personal data when using Clearview AI’s facial recognition technology to identify individuals.
Following an investigation prompted by media reports of the force’s use of the company’s facial recognition technology, IMY concluded that the police had not fulfilled their obligations as data controller in a number of cases concerning their use of Clearview AI.
The police failed to implement sufficient organisational measures to ensure, and be able to demonstrate, that the processing of personal data was carried out in compliance with the country’s Criminal Data Act.
“When using Clearview AI the police unlawfully processed biometric data for facial recognition as well as failed to conduct a data protection impact assessment, which this case of processing would require,” IMY said.
According to the police, some employees used Clearview’s application without prior authorisation, it added.
“There are clearly defined rules and regulations on how the police authority may process personal data, especially for law enforcement purposes,” said Elena Mazzotti Pallard, legal advisor to the DPA. “It is the responsibility of the police to ensure that employees are aware of those rules.”
In addition to the fine, IMY ordered the police to conduct further training and education of employees to avoid any future processing of personal information in breach of data protection rules and regulations.
The police must also inform data subjects whose personal data has been disclosed to Clearview AI, where confidentiality rules allow, and erase any personal data transferred to the company where possible.
Earlier this month, four of Canada’s privacy commissioners announced that their joint investigation had concluded the company’s scraping of billions of images of people from the internet amounted to mass surveillance and a clear violation of Canadians’ privacy rights.
New York-based Clearview AI says its facial recognition search engine is available only to law enforcement agencies and select security professionals for use as an investigative tool, and that the app contains only public information gathered from the open web.
The company claims its technology has tracked down hundreds of at-large criminals, including paedophiles, terrorists and sex traffickers, and says it has also helped exonerate the innocent and identify victims of crimes including child sex abuse and financial fraud.