Information Commissioner’s Office fines facial recognition company Clearview AI £7,552,800 and orders data be deleted
May 24, 2022
The UK Information Commissioner has imposed a significant fine of £7,552,800 on Clearview AI for illegally collecting personal data of UK residents. The facial images of UK residents were scraped from the internet and fed into Clearview’s database where, with the aid of artificial intelligence, the company could use that data to identify those people and monitor them.
Clearview AI continues to maintain that it has done nothing wrong, saying that its technology and intentions have been “misinterpreted”, and claims that it is not subject to the ICO’s jurisdiction.
Clearview has already been the subject of action by other regulators. In March 2022 the Italian data protection agency imposed a €20 million penalty on Clearview for breaches of EU law. In December last year France’s data watchdog, CNIL, found that Clearview had committed two breaches of the GDPR. Similarly, in February 2021 Canadian privacy commissioners stated that Clearview violated Canadian privacy laws. In the United States, Cook County (effectively Chicago) and Clearview entered into an agreement in settlement of a suit, under which Clearview agreed to stop providing its technology to most private clients and to stop doing business in Illinois.
The use of facial recognition technology by police is belatedly being scrutinised by regulators. In April 2021 the Swedish Privacy Protection Authority fined the Swedish police 2.5 million kronor for violating the Crime Data Act (Brottsdatalagen) by using the facial recognition application Clearview AI.
The ICO media release provides an excellent overview of the facts and the action taken. It relevantly provides:
The Information Commissioner’s Office (ICO) has fined Clearview AI Inc £7,552,800 for using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition.
The ICO has also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.
The ICO enforcement action comes after a joint investigation with the Office of the Australian Information Commissioner (OAIC), which focused on Clearview AI Inc’s use of people’s images, data scraping from the internet and the use of biometric data for facial recognition.
What did Clearview AI Inc do?
Clearview AI Inc has collected more than 20 billion images of people’s faces and data from publicly available information on the internet and social media platforms all over the world to create an online database. People were not informed that their images were being collected or used in this way.
The company provides a service that allows customers, including the police, to upload an image of a person to the company’s app, which is then checked for a match against all the images in the database.
The app then provides a list of images that have similar characteristics with the photo provided by the customer, with a link to the websites from where those images came from.
Given the high number of UK internet and social media users, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge.
Although Clearview AI Inc no longer offers its services to UK organisations, the company has customers in other countries, so the company is still using personal data of UK residents.
John Edwards, UK Information Commissioner, said:
“Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.
“People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.
“This international cooperation is essential to protect people’s privacy rights in 2022. That means working with regulators in other countries, as we did in this case with our Australian colleagues. And it means working with regulators in Europe, which is why I am meeting them in Brussels this week so we can collaborate to tackle global privacy harms.”
Details of the contraventions
The ICO found that Clearview AI Inc breached UK data protection laws by:
- failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
- failing to have a lawful reason for collecting people’s information;
- failing to have a process in place to stop the data being retained indefinitely;
- failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
- asking for additional personal information, including photos, when asked by members of the public if they are on their database. This may have acted as a disincentive to individuals who wish to object to their data being collected and used.
The fine announcement was covered in TechCrunch with UK fines Clearview just under $10M for privacy breaches, the Guardian with UK watchdog fines facial recognition firm £7.5m over image collection, The Verge with Clearview AI ordered to delete facial recognition data belonging to UK residents, the Times with Online facial images were ‘harvested’ and the Register with Clearview AI fined millions in the UK: No ‘lawful reason’ to collect Brits’ images. And there are other stories.