The walls are closing in on Clearview AI

Controversial facial recognition company Clearview AI has been fined almost $10 million by the UK’s data protection watchdog for collecting the faces of UK citizens from the web and social media. The firm was also ordered to delete all of the data it holds on UK citizens.

The move by the UK’s Information Commissioner’s Office (ICO) is the latest in a string of high-profile fines against the company as data protection authorities around the world eye tougher restrictions on its practices.

The ICO found that Clearview AI had breached data protection laws by collecting personal data without people’s consent and by asking for additional information, such as photos, when people inquired whether they were in the database. It found that this may have “acted as a disincentive” for people who objected to their data being scraped.

“The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable,” said John Edwards, the UK’s information commissioner, in a statement. 

Clearview AI boasts one of the world’s largest databases of people’s faces, with 20 billion images scraped from publicly available sources on the internet, such as social media, without the subjects’ consent. Clients such as police departments pay for access to the database to look for matches.

Data protection authorities around the Western world have found this to be a clear violation of privacy and are now beginning to work together to clamp down. Edwards stressed that “international cooperation is essential to protect people’s privacy rights in 2022” and is due to meet with European regulators in Brussels this week. The UK’s investigation into Clearview AI was carried out jointly with the Australian information commissioner.

Earlier this year, Italian data protection authorities fined Clearview AI €20 million ($21 million) for breaching data protection rules. Authorities in Australia, Canada, France, and Germany have reached similar conclusions. 

Even in the US, which does not have a federal data protection law, Clearview AI is facing increasing scrutiny. Earlier this month the ACLU won a major settlement that restricts Clearview AI from selling access to its database to most businesses across the US. In the state of Illinois, which has a law on biometric data, Clearview AI cannot sell access to its database to anyone, even the police, for five years.

Silkie Carlo, director of the UK-based digital rights group Big Brother Watch, said on Twitter that the ICO’s decision “effectively stops Clearview from operating in the UK.”

Carlo added that the decision “should be a nail in the coffin for facial recognition” and called for UK lawmakers to ban facial recognition surveillance. 

Europe is working on an AI law that could ban the use of “real-time” remote biometric identification systems, such as facial recognition, in public places. The current draft of the text restricts the use of facial recognition by law enforcement to the fight against serious crimes, such as terrorism or kidnappings.

There is a possibility that the EU will go further. The EU’s influential data protection watchdogs have called for the bill to ban not only remote biometric identification in public, but the police use of web-scraped databases, such as Clearview AI’s. 

“Clearview AI is fast becoming so toxic that no credible law enforcement agency or public authority or other company will want to work with them,” says Ella Jakubowska, who works on facial recognition and biometrics for European Digital Rights, a digital rights group. 

Hoan Ton-That, Clearview AI’s CEO, said he is disappointed the ICO has “misinterpreted my technology and intentions.” 

“We collect only public data from the open internet and comply with all standards of privacy and law,” he said in a statement that was sent to MIT Technology Review. 

“I would welcome the opportunity to engage in conversation with leaders and lawmakers so the true value of this technology which has proven so essential to law enforcement can continue to make communities safe,” he added.