May 9, 2022. Facial recognition company Clearview AI has agreed to stop its sales to private companies in the United States as part of a landmark settlement reining in a technology criticized as threatening Americans’ privacy rights.
The settlement, filed Monday in federal court in Illinois, marks the most significant court action yet against Clearview AI, a company known for downloading billions of people’s photos from social networks and other websites to build a face-search database sold to law enforcement.
The case highlights how a single state privacy law can have nationwide ramifications for Americans’ civil rights. The lawsuit, filed by the American Civil Liberties Union in 2020, accused Clearview of violating an Illinois law that bans companies from capturing or sharing people’s face photos, fingerprints and other biometric information without their consent.
Clearview, based in New York, has argued in court that the Illinois law restricted the company’s ability to collect and analyze public information — and, therefore, violated its First Amendment-protected freedom of speech.
Clearview chief executive Hoan Ton-That said in a statement that the company intends to sell its facial recognition algorithm to commercial customers, without its database of faces, in a “consent-based manner.”
Many such algorithms are already offered for sale and require clients to connect their own databases, such as when a company wants a face-scanning system to unlock secure doors only for its own employees. Clearview, however, has long promoted its database of faces as a distinctive feature, and the settlement could greatly limit its future prospects.
The Illinois law, adopted in 2008, has led to several major tech-privacy settlements, including a $650 million settlement from Facebook related to its facial recognition use.
The United States has no federal facial recognition law, even though the technology has been used by thousands of local, state and federal law enforcement agencies, including to charge Americans with crimes.
The United States “lacks a comprehensive privacy law, even one protecting these most sensitive, most immutable identifiers,” like people’s faces, said Nathan Wessler, a deputy director of the ACLU’s Speech, Privacy, and Technology Project. “Congress should act — and, as long as they’re not able to, more states should take up the mantle.”
As part of the settlement, which will become final when approved by the court, Clearview has agreed to stop selling or offering free access to its facial recognition database to most businesses and other private entities nationwide.
The company also agreed to stop working with all police or government agencies in Illinois for five years, and to continue trying to filter out photos that were taken in or uploaded from the state.
Clearview has created an opt-out form that Illinois residents can use to request that their photos not show up in its search results. The company said it would spend $50,000 to pay for online ads publicizing the form. It offers a similar request form for California residents covered by the California Consumer Privacy Act.
Clearview said it also would stop offering free trial accounts to police officers without their supervisors’ approval. Those accounts had allowed individual officers to run searches outside their agencies’ investigative protocols and chains of command, which Wessler called a “real recipe for abuse.”
The Government Accountability Office, a federal watchdog, said last year that 13 federal agencies did not know what facial recognition systems their own employees were using, meaning that the agencies had “therefore not fully assessed” the systems’ potential privacy and accuracy risks.
The ACLU sued Clearview on behalf of groups representing immigrants, sex workers and survivors of domestic violence, arguing that they faced extraordinary harms from the police identification tool.
The Biometric Information Privacy Act in Illinois offers the strictest protections in the country for people’s biometric data, and while Texas and Washington have biometric privacy laws of their own, the Illinois statute is the only one that allows individuals to sue companies directly over violations. The Health Insurance Portability and Accountability Act, or HIPAA, restricts how hospitals and other “covered entities” share people’s health-care information, but it does not cover the sharing of user data by tech companies.
Facebook agreed to pay $650 million in 2020 to settle a class-action lawsuit charging it with violating the Illinois law and, last year, said it would stop using its widely known facial recognition software and delete the facial data of more than a billion people, citing “growing concerns about the use of this technology as a whole.”
The settlement comes at a time when Clearview has been racing to woo investors and raise tens of millions of dollars to expand its business around the world. In an investor presentation from December first reported by The Washington Post, the company said it hoped to boost its sales to private companies in financial services, real estate, the “gig economy” and other industries, and that it was working to expand its facial database to 100 billion photos so that “almost everyone in the world will be identifiable.”
Clearview had for months offered its search tool to stores and other private companies, but it has since restricted it to police and government use with formal approval. The company has proposed spinning off other facial recognition products for private companies, including for the kinds of identity-verification systems used to unlock doors or access bank accounts, and said that tool would not intersect with its main law enforcement database.
Clearview’s database now includes more than 20 billion photos taken from across the Internet, and its search tool allows users to submit a photo and get links to the photo’s originating website or social media account.
The system has been used by police in the United States to identify protesters and criminal suspects, including rioters at the U.S. Capitol on Jan. 6, 2021. In Ukraine, officials have used it to identify dead Russian soldiers, find their social media accounts and contact their families.
Facebook, Google and other major tech companies have sent legal orders demanding that Clearview delete any images downloaded from their servers, but Clearview has refused. “I don’t think we want to live in a world where any big tech company can send a cease and desist, and then control, you know, the public square,” Ton-That, the company’s CEO, told The Post in a live interview last month.
“Clearview has built a product no other company has been willing to build, because of its dangerousness, and this settlement vindicates the decision” of Google, Amazon and other companies to shelve or end their plans to sell facial recognition systems for use by companies or police, Wessler said. “Other companies should take note. Violating people’s information privacy rights is not costless. They will eventually be held to account at great financial and reputational costs.”