“I think it’s important to note what the use of facial recognition [in airports] means for American citizens,” Jeramie Scott, director of EPIC’s Domestic Surveillance Project, told BuzzFeed News in an interview. “It means the government, without consulting the public, a requirement by Congress, or consent from any individual, is using facial recognition to create a digital ID of millions of Americans.” – The US Government Will Be Scanning Your Face At 20 Top Airports, Documents Show
Facial recognition systems are headed to the airport, and the rollout is happening at a rapid pace, without public comment or guardrails around data privacy or data quality.
Consider this news alongside new reporting from ProPublica, which found that TSA's body scanning technology discriminates against black women, regularly flagging them as security threats and subjecting them to increased screening. Then add the New York Times' Privacy Project, published last week. One of its most impactful reports took footage from three cameras in Bryant Park and used it to build facial recognition tracking software for less than $100. With only a few days' work, the reporters used it to identify one of the people in the park.
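To make concrete just how low the barrier to entry is, here is a minimal sketch of the kind of face matching involved, using the open-source face_recognition Python library. To be clear, this is my illustration, not the Times' actual pipeline, and the file names are placeholders; but a hobbyist with a laptop and public camera footage could run something very much like it.

```python
# pip install face_recognition  (open-source wrapper around dlib)
# File names below are hypothetical placeholders, not real data.
import face_recognition

# Encode one reference photo of the person we want to find.
known_image = face_recognition.load_image_file("reference_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Scan a single frame captured from a public camera feed.
frame = face_recognition.load_image_file("park_frame.jpg")
for face_encoding in face_recognition.face_encodings(frame):
    # compare_faces returns True when two 128-dimensional encodings
    # fall within a distance tolerance (0.6 by default).
    if face_recognition.compare_faces([known_encoding], face_encoding)[0]:
        print("Possible match in this frame")
```

Run that loop over days of footage from a few fixed cameras and you have, in essence, a tracking system. That is the Times' point: the hard part isn't the software anymore.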
An AI dystopia, in which bias is encoded into the algorithms and marginalized communities are further marginalized, is hurtling toward us faster than the average person can keep up with.
The increased use of facial recognition in public spaces puts our society on track toward a system that's not entirely different from China's social credit system. From the BuzzFeed article quoted above:
The big takeaway is that the broad surveillance of people in airports amounts to a kind of “individualized control of citizenry” — not unlike what’s already happening with the social credit scoring system in China. “There are already people who aren’t allowed on, say, a high-speed train because their social credit scores are too low,” he said, pointing out that China’s program is significantly based in “identifying individual people and tracking their movements in public spaces through automated facial recognition.”
It all reminded me of a tweet I saw this week that captures my frustration at American journalists' continued reporting on China's social credit system while ignoring our own American AI nightmare that's coming full steam ahead:
*whispers* the us invests in mass surveillance and social credit systems the same way china does and yet some of us only ever point to china with outrage and it's getting tiring — a once blue haired enby from oakland | tired of it (@WellsLucasSanto), April 16, 2019
Consider this: just last month, landlords in NYC announced their interest in installing facial recognition technology in rent-subsidized apartments. Meanwhile in Beijing, 47 public housing projects were already using the technology last year.
The use of facial recognition technology isn't limited to the government. Companies are already doing a bang-up job deploying it in places you'd never suspect:
All of this makes me wonder: how do average people, those outside of tech, academia, and spheres of influence, push back against these technologies? Can you opt out of facial recognition tech at the airport? How do you know to opt out if you didn’t know it was being used to begin with? What happens when you opt out? Will you be subjected to more invasive searches? Will opting out delay your next flight? So many questions and sadly zero answers.