UK fines facial recognition firm Clearview AI more than £7.5m and orders it to delete data

Clearview has recently been used by Ukraine to identify deceased soldiers

The UK has fined facial recognition firm Clearview AI £7.5m for collecting people's data. (AP Photo / Seth Wenig, File)

The UK's information watchdog has fined facial recognition database company Clearview AI Inc more than £7.5m and ordered it to delete its data.

The Information Commissioner’s Office (ICO) fined the firm £7,552,800 on Monday for using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition.

The ICO has also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.

Recently, Ukraine has used the firm's technology to identify Russian soldiers killed in combat so that their families can be informed.

Ukraine's Ministry of Defence began using technology from the New York-based facial recognition provider in March.

The ICO enforcement action comes after a joint investigation with the Office of the Australian Information Commissioner (OAIC), which focused on Clearview AI Inc’s use of people’s images, data scraping from the internet and the use of biometric data for facial recognition.

Clearview AI Inc has collected more than 20 billion images of people’s faces and data from publicly available information on the internet and social media platforms all over the world to create an online database. People were not informed that their images were being collected or used in this way.

The company provides a service that allows customers, including the police, to upload an image of a person to the company’s app, which is then checked for a match against all the images in the database.

The app then provides a list of images with characteristics similar to the photo supplied by the customer, along with links to the websites from which those images came.

Given the high number of UK internet and social media users, Clearview AI Inc’s database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge.

Although Clearview AI Inc no longer offers its services to UK organisations, the company has customers in other countries and is still using personal data of UK residents.

“Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images," John Edwards, UK Information Commissioner, said.

"The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.

“People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.

“This international cooperation is essential to protect people’s privacy rights in 2022. That means working with regulators in other countries, as we did in this case with our Australian colleagues. And it means working with regulators in Europe, which is why I am meeting them in Brussels this week so we can collaborate to tackle global privacy harms.”

The ICO found that Clearview AI Inc breached UK data protection laws by failing to use the information of people in the UK in a way that is fair and transparent, failing to have a lawful reason for collecting people’s information and failing to have a process in place to stop the data being retained indefinitely.

It also found that the firm had asked people for further personal information, including photos, when they contacted the company to ask whether they were in its database.

Updated: May 23, 2022, 4:26 PM