Your face is not private. It never was — but for most of human history, strangers could not look you up in a database. That changed when a company called Clearview AI decided to scrape the entire internet.

## Clearview AI: The Company That Stole a Billion Faces

In January 2020, Kashmir Hill reported in The New York Times that a previously unknown startup called Clearview AI had built a facial recognition database of more than 3 billion images scraped from Facebook, YouTube, Venmo, and countless other websites — without the knowledge or consent of anyone in those photos. By 2024, that number had grown to 50 billion images, according to the company's own claims. The tool allows users to upload a photo of any face and receive matches from the database within seconds.

Clearview's client list included over 3,100 law enforcement agencies, federal bodies including ICE and the FBI, and — according to leaked documents — private entities ranging from the NBA to the nation of Saudi Arabia.

Facebook, Google, and YouTube all sent cease-and-desist letters. Clearview ignored them. European regulators fined the company over $33 million. Clearview called the fines "unlawful" and "unenforceable." The scraping continued.

## The Wrongful Arrests

Facial recognition is not a neutral technology. It makes mistakes. And those mistakes have consequences. The ACLU has documented at least 13 wrongful arrests caused by police reliance on facial recognition technology. The pattern is consistent:

- **Robert Williams (Detroit, 2020):** Arrested at his home in front of his wife and children for a watch theft he did not commit. Detained for 30 hours. The facial recognition match was based on grainy surveillance footage. Detroit Police later settled for $300,000.
- **Michael Oliver (Detroit, 2019):** Wrongfully arrested and charged. The algorithm confused him with someone else entirely.
- **Nijeer Parks (New Jersey, 2019):** Spent 11 days in jail and paid $5,000 in legal fees after a false facial recognition match tied him to a shoplifting incident. The real suspect was 15 years younger.
- **Porcha Woodruff (Detroit, 2023):** Eight months pregnant when police arrested her for robbery and carjacking based on a facial recognition match. She experienced contractions in custody and required hospitalization.

Nearly every documented victim of a facial recognition wrongful arrest is Black.

## The Racial Bias

The Gender Shades study, published in 2018 by Joy Buolamwini and Timnit Gebru at the MIT Media Lab, tested three commercial facial analysis systems from Microsoft, IBM, and Face++. The results were unambiguous:

- Error rates for darker-skinned women reached as high as 34.7%, compared with at most 0.8% for lighter-skinned men
- All three systems showed the highest error rates for darker-skinned women and the lowest for lighter-skinned men
- Microsoft's system misidentified darker-skinned females as male 20.8% of the time

The root cause is training data. Facial recognition systems are trained on datasets that disproportionately include lighter-skinned faces. When these systems are deployed against the general population, the people who look least like the training data — disproportionately Black and Brown people — suffer the consequences.

## Where Facial Recognition Is Used

The technology has spread far beyond law enforcement:

- **Retail stores:** Major retailers including Macy's and Lowe's have tested facial recognition for shoplifting prevention. Customers are scanned without notification.
- **Airports:** The TSA and Customs and Border Protection use facial recognition at airports. CBP has stated a goal of scanning 97% of departing passengers by 2026.
- **Schools:** Multiple school districts have deployed facial recognition for "security," including Lockport City School District in New York, which spent $3.8 million on a system that had a 5% false positive rate — meaning hundreds of students could be misidentified daily.
- **Housing:** Some landlords have attempted to use facial recognition for building access, prompting legal challenges.
- **Concerts and sports venues:** Madison Square Garden used facial recognition to identify and eject lawyers involved in litigation against the venue's owner.

## The Regulatory Vacuum

There is no federal law regulating facial recognition in the United States. No statute limits what companies can collect, how long they can store it, or whom they can sell it to. No federal standard requires accuracy testing before deployment.

A handful of cities have acted on their own:

- San Francisco (2019): the first U.S. city to ban government use of facial recognition
- Oakland, Berkeley, Somerville, Cambridge, and Portland followed with similar bans
- Virginia and Vermont enacted state-level restrictions on law enforcement use

But these local bans cover a tiny fraction of the population. In most of the country, police can use facial recognition on anyone, at any time, with no judicial oversight. Commercial use is essentially unregulated everywhere.

## How to Opt Out (You Largely Can't)

There is no comprehensive way to remove your face from facial recognition databases. Some limited options exist:

- Clearview AI offers an opt-out process that requires submitting a government ID and a photo — effectively giving the company more data to verify your identity
- Some states, including Illinois (under BIPA) and Texas, have biometric privacy laws that allow individuals to sue over unauthorized collection
- The EU's GDPR provides stronger protections for European residents, including the right to deletion

For most Americans, there is no opt-out. Your face was scraped, stored, and searchable before you ever heard the name Clearview AI.
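The Lockport figure above is an instance of the base-rate problem: even a small false positive rate, applied to thousands of daily scans of mostly innocent people, produces a flood of false alerts. A minimal sketch of the arithmetic — the 5% false positive rate is taken from the reporting above, while the scan volume, watchlist prevalence, and detection rate are purely hypothetical assumptions for illustration:

```python
# Base-rate sketch for face scanning at scale.
# Only the 5% false positive rate comes from the Lockport reporting;
# every other number is a hypothetical assumption.

def expected_false_positives(daily_scans: int, fpr: float) -> float:
    """Expected number of innocent people flagged per day."""
    return daily_scans * fpr

def alert_precision(prevalence: float, tpr: float, fpr: float) -> float:
    """P(a flagged person is actually on the watchlist), via Bayes' rule."""
    true_alerts = prevalence * tpr          # on the watchlist, correctly flagged
    false_alerts = (1 - prevalence) * fpr   # innocent, flagged anyway
    return true_alerts / (true_alerts + false_alerts)

# Hypothetical district: 4,000 face scans per school day, 5% false positive rate.
print(expected_false_positives(4_000, 0.05))  # about 200 innocent flags per day

# Hypothetical: 1 in 10,000 scanned faces is genuinely on a watchlist,
# and the system catches 90% of them.
print(alert_precision(1e-4, 0.90, 0.05))      # almost every alert is false
```

Under these assumptions, fewer than 2 in every 1,000 alerts point at a real match — which is why a "5% false positive rate" translates into hundreds of misidentified students rather than a minor inconvenience.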
## The Future Being Built Without Your Input

Facial recognition is becoming more accurate, more pervasive, and harder to avoid. Every new camera installed in a store, school, or street corner potentially feeds into a recognition system. The technology improves with every image added to the training data — and every image of you on the internet has already been collected.

The U.S. Commission on Civil Rights concluded in its September 2024 report that federal use of facial recognition poses serious civil rights risks and recommended immediate congressional action. Congress has taken no action.

They did not ask if we wanted to know. They mapped our faces first and asked questions never.

_- The Department_