Eleanor J. Bader, Truthout | Published June 30, 2019
Sports enthusiasts heading to Tokyo for the 2020 Summer Olympics, beware: Japan intends to install hundreds of thousands of facial recognition cameras to identify everyone in attendance. The software, initially used when Tokyo hosted the Paralympic games in 2018, is meant to weed out, in real time, people suspected of being potential terrorists — and anyone with a criminal record or questionable immigration status.
Japan is hardly alone in using this technology. Schoolchildren in Sweden are photographed to verify attendance. China, meanwhile, already has an estimated 176 million surveillance cameras — many equipped with facial recognition — and expects to have a total of 450 million installed by 2020. What’s more, many department stores throughout the U.S. rely on facial recognition cameras to tag — and sometimes ban — people convicted of shoplifting. Supermarkets also use the cameras to verify that shoppers are old enough to buy cigarettes and beer.
Yes, it sounds like a 21st-century incarnation of Big Brother, but facial recognition software has been in development since the 1960s and is now ubiquitous. The technology uses mathematical algorithms to map an individual’s facial features; the resulting templates, called “faceprints,” can be stored and later used to verify a person’s identity. Photos can be taken by drones as well as more conventional cameras from a distance of about 50 feet — something civil libertarians charge can have a chilling effect on public protest, assembly and speech, since the people being filmed typically have no idea that it is happening.
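To make the notion of a “faceprint” concrete, here is a minimal, illustrative Python sketch (not any vendor’s actual system). It models a faceprint as a numeric vector and verifies identity by checking whether two vectors are similar enough; the dimensions, vectors and threshold below are placeholder assumptions, since real systems derive such vectors from photos using trained neural networks.

```python
# Minimal sketch of faceprint verification. A "faceprint" is modeled as
# a numeric feature vector; identity is verified by comparing the cosine
# similarity of two vectors against a threshold. All values here are
# illustrative placeholders, not output from a real face model.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    # Declare a match if the probe faceprint is close enough to the
    # enrolled one; the threshold trades false matches against misses.
    return cosine_similarity(probe, enrolled) >= threshold

rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                           # stored faceprint
same_person = enrolled + rng.normal(scale=0.1, size=128)  # new photo, slight variation
stranger = rng.normal(size=128)                           # unrelated faceprint

print(verify(same_person, enrolled))  # True: small variation, high similarity
print(verify(stranger, enrolled))     # False: unrelated vector, low similarity
```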
The footage can also result in misidentification and, worse, false arrests. Indeed, opponents of biometrics charge that people of color are disproportionately misidentified. In one American Civil Liberties Union test of Amazon’s Rekognition software, 28 members of Congress — including Rep. John Lewis (D-Georgia) — were falsely matched with mugshots of people who had been arrested; people of color accounted for 39 percent of the false matches, though they make up only about 20 percent of Congress. Likewise, researchers from the Massachusetts Institute of Technology and the University of Toronto found that dark-skinned women were often erroneously classified as men.
Business Interests Prevail
This is business, however — big business. Take FaceFirst, a leader in retail surveillance. The company claims that it can match a face against a database of 25 million people in just under one second, a boon to stores eager to nab the shoplifters who cost them nearly $50 billion in 2016 alone.
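How might software compare one face against 25 million stored records in under a second? One common scheme is to hold all enrolled faceprints in a single matrix so that one vectorized operation scores every record at once. The sketch below is a simplification under that assumption, with a randomly generated gallery far smaller than 25 million entries; production systems add indexing structures and specialized hardware to reach that scale.

```python
# Illustrative 1:N identification: score one probe faceprint against an
# entire gallery with a single matrix-vector product. The gallery here
# is random placeholder data; real galleries hold faceprints derived
# from enrollment photos.
import numpy as np

rng = np.random.default_rng(1)

# 100,000 illustrative 128-dimensional faceprints, L2-normalized so a
# dot product equals cosine similarity.
gallery = rng.normal(size=(100_000, 128)).astype(np.float32)
gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)

probe = rng.normal(size=128).astype(np.float32)
probe = probe / np.linalg.norm(probe)

scores = gallery @ probe          # one vectorized pass over all records
best = int(np.argmax(scores))     # index of the closest enrolled faceprint
print(best, float(scores[best]))  # top candidate and its similarity score
```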
But where did they get photos of 25 million people? Good question.
One answer comes from Ever (EverAlbum.com), a Silicon Valley company that reportedly used millions of customers’ photos to train facial recognition technology it then marketed to the U.S. military and to law enforcement agencies across the country. The pictures had been uploaded by unsuspecting people who’d paid Ever to create photobooks for them — something that was billed as a way to share and commemorate meaningful family moments. None had consented to this use of their materials.
Not surprisingly, EverAlbum’s marketing sounds completely benign: “Ever is a company dedicated to helping you capture and rediscover your life’s memories,” the company’s website boasts. “Ever automatically creates books out of your memories, arranging them in beautiful, professionally designed layouts. It’s a snap to edit and order.” Business Insider was so enthusiastic about the company that it named it one of the top start-ups of 2016.
The 49-person company has an estimated annual revenue of $5.4 million. It has also raised $29 million from two primary investors since 2016.
Another U.S. company, Vigilant Solutions, reportedly sold access to millions of license plate scans to Immigration and Customs Enforcement — without permission from the drivers whose tags were included — cashing in for $6.1 million in 2018. All told, reports indicate that Vigilant has the capacity to collect between 150 and 200 million license scans per month.
More insidious still, some cellphone manufacturers market facial and touch recognition as security features, offering them as options for unlocking one’s portable device. (The iPhone X, for example, can be unlocked via facial recognition.) Meanwhile, ads that highlight facial recognition’s ability to find a child or elder who wanders off tug at our emotions, since no one wants their loved ones to end up in harm’s way.
But the promise that, with the touch of a screen or a photo of a face, we can all live lives of absolute, inviolable security comes with strings attached — strings that may compromise our civil rights and privacy. These concerns have prompted cities throughout the country to begin asking whether so-called safety and security should override human rights and the expectation of privacy.
Passing Local Restrictions
California activists have led the way in challenging facial recognition technology.
Tracy Rosenberg is executive director of the Media Alliance, an organization that aims to keep media accessible and accountable. She’s also an active member of Oakland Privacy, a coalition that monitors surveillance activities in that city. Rosenberg told Truthout that efforts to curtail surveillance in the Bay Area go back to 2013 and grew out of Occupy. Activists, she reports, mobilized when they learned that the Department of Homeland Security had funded the Domain Awareness Center (DAC) — a security hub meant to oversee activity at the Port of Oakland — and that the center’s surveillance had quickly expanded far beyond the port.
“The community got fairly upset when it learned how broad the surveillance of our communities was and they started to organize,” Rosenberg says. “By 2014, we got the surveillance scaled back to the port alone, but by 2015, the DAC basically died altogether. After this victory, we decided we needed to be more proactive and less reactive, and developed a transparency-based protocol to require oversight of all surveillance technology.”
As the activists gathered together, they discovered that the issue was even more serious than they’d thought: Private companies, like Vigilant, were feeding information to law enforcement.
Five years later, San Francisco Board of Supervisors member Aaron Peskin was instrumental in getting a Privacy First measure on the city’s 2018 ballot, seeking to incorporate privacy principles into the city charter. Less than a year later, he introduced the nation’s first outright ban on the use of facial recognition software by law enforcement and other city personnel.
Lee Hepner, a legislative aide to Peskin, says that his boss was moved to push for the Privacy First ballot measure after his office discovered a different type of privacy encroachment. “We learned that after someone unlocks a scooter on the street, the company is able to get that person’s credit report and then share the information it reveals with other agencies. This was done without the user’s knowledge,” Hepner says. “No one knew that when they went to unlock a scooter, they were unwittingly giving their info to a host of businesses. This showed us how broad the issue of privacy actually was. We had to act.”
Privacy First passed with more than 50 percent of the vote, Hepner reports.
San Francisco is not alone. By the time the measure became law, several other California cities, including Berkeley, Oakland, Palo Alto and Santa Clara (as well as Nashville, Tennessee; Seattle, Washington; and Cambridge, Lawrence and Somerville, Massachusetts), had passed similar surveillance-transparency ordinances, demands for disclosure about how and when technology can be used and data shared.
“San Francisco did this in May, but is the only city to date that has added a prohibition on government agencies using facial recognition technology,” Hepner says. “Our bill says that facial recognition software represents a line that government agencies cannot cross because the harms outweigh the benefits.”
Although San Francisco’s private citizens and businesses are not restricted from using facial recognition technology, the more than 50 San Francisco agencies governed by the regulation have until September 2019 to notify the city’s Board of Supervisors about the surveillance they use.
“We really don’t know the extent to which surveillance tech is being used by city departments,” Hepner says. “For example, we just learned that our homeless shelter system uses fingerprint biometrics. This can be a deterrent to people entering the shelters because people with an outstanding warrant may be afraid of repercussions if they enter a shelter building.”
Passing Surveillance Controls
The Electronic Frontier Foundation (EFF), a 29-year-old digital rights group, notes that there are two primary ways that communities grant residents protection from surveillance: passage of Community Control Over Police Surveillance (CCOPS) measures and Surveillance Equipment Regulation Ordinances (SERO). Both are meant to put a brake on monitoring.
“In some places, state laws preempt city councils from controlling police departments in that state,” Nathan Sheard, EFF’s grassroots advocacy organizer, explains, “but in places where CCOPS and SERO have passed, police agencies have to submit a use policy to a board of supervisors or city council.”
These bodies are then charged with conducting an analysis of surveillance and assessing its impact, especially when it comes to historically marginalized and already hyper-policed populations. They must also give community members time to review the proposal and allow public comment. After these steps are completed, the governing body decides whether law enforcement can acquire the requested surveillance equipment or continue using what it already owns.
Sheard says this process is a key transparency measure — but it doesn’t solve the whole problem. Even if everyone is equally surveilled and impacted (an unlikely prospect), the negatives of surveillance won’t disappear.
“The companies that create surveillance software are working to address the racial and gender disparities that exist, but even if this stuff had a 100 percent accuracy rate, would progressives want to support it?” Sheard asks. “I think the answer is no.”
Victim of Misidentification Sues Apple
Ousmane Bah, a 19-year-old college student from the Bronx, knows firsthand the trauma that misidentification can cause. Last year, he was mistakenly identified as the man who stole numerous Apple Pencils — styluses selling for between $99 and $125 apiece and used with the iPad Pro — from stores in Connecticut, Massachusetts, New Jersey and New York.
Bah believes that he landed on Apple’s radar after someone used his learner’s permit — which Bah says he had lost — as identification when that person was arrested for theft. Somehow, the name Ousmane Bah was then linked to the footage of the actual thief, or perhaps thieves, taken in the Apple stores.
Although most of the criminal charges against Bah were dismissed when police determined that the surveillance photos were of a completely different person, attorney Subhan Tariq says that Bah is suing Apple — a company that is worth more than 1 percent of the world’s gross domestic product — for $1 billion in damages since he continues to suffer from debilitating anxiety as a result of the arrest. “Punitive damages are supposed to serve as a deterrent,” Tariq told Truthout. “They are supposed to prevent the same type of conduct from recurring in the future.”
Apple has filed a motion to dismiss the lawsuit and a hearing will take place later this month.
Meanwhile, activists are continuing to mobilize against facial recognition and other biometric surveillance. As the ACLU of Southern California says on its webpage, “You have a right to expect that your life will not become an open book when you have not done, and are not suspected of doing, anything wrong.”