
Clearview AI agrees to block US commercial access to its facial recognition database

Clearview AI has agreed to a court settlement that will ban the biometrics company from selling its huge database of faceprints to private businesses or individuals anywhere in the U.S., a move the American Civil Liberties Union is calling a major victory for privacy advocates.

The ACLU announced the settlement with the company Monday. The advocacy group and several other plaintiffs had sued Clearview AI in May 2020, accusing it of violating the Illinois Biometric Information Privacy Act (BIPA), a privacy law that limits the use of individuals' face scans and other identifying data.

The settlement in Illinois state court also blocks sales to government agencies — including law enforcement — within the state for five years, but Clearview AI can still sell to all levels of government elsewhere. Earlier this year, CEO Hoan Ton-That said the company was going to focus on marketing services to federal agencies in 2022.

Clearview AI's products have alarmed privacy and security advocates for years, given that the company scrapes publicly available web pages — like social media profiles and photo-sharing accounts — for data it can feed to its algorithms. There are also concerns about racial biases within the algorithms. Governments in Canada, Australia and elsewhere have taken action to curtail its use.

The facial recognition database has more than 10 billion faceprints, the ACLU said.

“Clearview can no longer treat people’s unique biometric identifiers as an unrestricted source of profit,” said Nathan Freed Wessler, a deputy director of the ACLU Speech, Privacy, and Technology Project. “Other companies would be wise to take note, and other states should follow Illinois’ lead in enacting strong biometric privacy laws.” BIPA was enacted in 2008, but only a few other states have written similar laws since.

Floyd Abrams, a renowned First Amendment lawyer hired by Clearview AI to defend against the suit, told The New York Times that the company was “pleased to put this litigation behind it.”

The lawsuit argued that the technology presents serious security risks to vulnerable communities, including "survivors of domestic violence and sexual assault, undocumented immigrants, current and former sex workers" and others "uniquely harmed by face recognition surveillance," the ACLU said. Groups representing those populations in Illinois were part of the lawsuit.

The company was in the news in early April for donating its services to government agencies in Ukraine as that country defended itself against Russia's military invasion. Among the uses were identifying dead Russian soldiers through face scans.

Joe Warminsky is the news editor for Recorded Future News. He has more than 25 years of experience as an editor and writer in the Washington, D.C., area. Most recently he helped lead CyberScoop for more than five years. Prior to that, he was a digital editor at WAMU 88.5, the NPR affiliate in Washington, and he spent more than a decade editing coverage of Congress for CQ Roll Call.