Beyond being abused, there are many ways for this technology to fail. Among the most pressing are misidentifications that can lead to false arrest and accusations. … Mistaken identity is more than an inconvenience and can lead to grave consequences.
Joy Buolamwini, “Coded Bias”: New Film Looks at Fight Against Racial Bias in Facial Recognition & AI Technology
tl;dr: Support students and alumni on March 2, a day of action against facial recognition on campus.
If facial recognition technology is being used on your campus, would you know about it? If it is being used, do you know how it’s impacting the communities you serve?
It’s easy to check out when you hear about facial recognition technology. The term still conjures up images of Minority Report or, more recently, thoughts of China. Facial recognition in everyday life feels kind of far off if you’re not working in AI or the security industry.
But I’m asking you to check in. Because facial recognition technology is being deployed rapidly here in the US with little to no oversight into how it’s used or who it impacts. Facial recognition technology is being used in school districts, in churches, in restaurants, in grocery and retail stores, in job interviews, in the workplace, in public housing, on flights and cruises, and by police departments.
Facial recognition might also be coming to your campus. It could come as part of a security system for the football stadium. Or as a facial recognition-enabled security robot. Or as an attendance tracking tool for the classroom. Or as a tool to detect emotions during a class lecture.
Facial recognition has a negative impact on the communities you serve
In a widely cited 2018 study, MIT Media Lab researcher Joy Buolamwini found that three leading facial recognition tools (from Microsoft, IBM, and Chinese firm Megvii) misidentified the gender of darker-skinned women as much as a third of the time, compared with an error rate of only 1 percent for white men.
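The scale of that gap is easier to feel with a quick back-of-the-envelope calculation. The error rates below come from the study above; the number of daily scans is a hypothetical figure, made up purely for illustration:

```python
# Error rates from the 2018 Gender Shades study cited above; the
# number of daily scans is a hypothetical figure for illustration.
error_rate_darker_skinned_women = 0.35  # up to ~1 in 3 misclassified
error_rate_white_men = 0.01             # ~1 in 100 misclassified

scans_per_day = 1_000  # hypothetical scans of each group per day

errors_women = scans_per_day * error_rate_darker_skinned_women
errors_men = scans_per_day * error_rate_white_men

print(f"Errors per 1,000 scans, darker-skinned women: {errors_women:.0f}")
print(f"Errors per 1,000 scans, white men: {errors_men:.0f}")
print(f"Disparity: {errors_women / errors_men:.0f}x")
```

Even a 1 percent error rate means ten people flagged incorrectly per thousand scans; at the higher rate, it is hundreds, and the burden falls on one group.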
How to avoid a dystopian future of facial recognition in law enforcement
Facial recognition technology is a surveillance tool designed to track and identify people, full stop. Advocates for facial recognition don’t talk about the technology in such blunt terms. They’ll talk about a tool that makes us safer or makes tasks more efficient. In university settings, they might talk about facial recognition as a tool to improve student outcomes.
What we hear less about is the impact facial recognition technology has on the populations it surveils. Facial recognition technology disproportionately impacts people of color, LGBTQ people, and gender nonconforming people. Whether it’s an algorithm’s inability to correctly identify black faces, a model that misgenders trans people, attempts at building systems that predict sexuality, or facial recognition tools used by police, this technology causes harm to marginalized communities at scale.
A 2019 federal study by the National Institute of Standards and Technology (NIST) found that facial recognition systems misidentified African-American and Asian faces 10 to 100 times more often than white faces.
“Coded Bias”: New Film Looks at Fight Against Racial Bias in Facial Recognition & AI Technology
The dangers of facial recognition technology to marginalized communities are very real. Mistaken identity could lead to arrest or worse. The inability to be identified by a facial recognition system could result in greater suspicion and even more surveillance. Algorithms built with binary gender classifications could lead to invasive searches (as is already happening to trans travelers) or people being outed against their will. Facial recognition tools that claim to read emotions (a practice with no scientific basis) could penalize students for having the “wrong” expression and result in lower grades. Facial recognition tools also make it easier to stalk women.
Higher education leadership may not understand the impact of facial recognition on campus
Many leaders and decision makers who are exploring facial recognition tools for campus may not understand the negative impact of surveillance tools. With cutting edge technology, it’s easy to understand the marketing promises of safety and efficiency. It’s much harder to understand how bias is baked into algorithms, especially if you don’t have a background in data science or if you’re not part of a marginalized community.
Your leadership may not be considering how these tools impact the communities they serve. They likely won’t create policies around what happens when a surveillance tool misidentifies a person, or try to develop an auditing system to ensure its accuracy. They may not think about opt-out policies or ensure vendors don’t share or sell student biometric data.
They may not even publicize that they’re implementing the technology. It’s often introduced without transparency or an understanding of the impact on the population being tracked. In fact, at Michigan’s Oakland Community College, administrators tried to shut down a discussion about the implementation of facial recognition on campus:
OCC’s administrators have also blocked attempts by the student government to pass non-binding resolutions that would ban the use of facial recognition on campus.
It’s time to pay attention and push back
“The thing that worries me about how these tools are being rolled out right now is not that they’re being rolled out but that by and large, we’re only finding out that they’re being deployed after the fact and most of the people who are going to be affected by them are not being consulted in the process of designing and deploying them.”
How Penn State uses WiFi tracking and what students can do in an age of increasing data collection
Is your school thinking about using facial recognition technology? If you don’t know, it’s easy to find out.
Fight for the Future, a national leader in the efforts to ban facial recognition, just released a campus scorecard to identify which schools might use facial recognition on campus.
Take a look to see where your campus stands on facial recognition.
If you see that your school might use facial recognition or if your school isn’t on the list, it’s time to get involved in efforts to push back.
Here’s how to learn more about facial recognition and advocate for those who will be impacted the most from this technology.
Understand the basics of facial recognition
You don’t need to be a machine learning engineer to understand facial recognition. The ACLU defines facial recognition as:
Facial recognition systems are built on computer programs that analyze images of human faces for the purpose of identifying them. Unlike many other biometric systems, facial recognition can be used for general surveillance in combination with public video cameras, and it can be used in a passive way that doesn’t require the knowledge, consent, or participation of the subject.
Here’s an easily accessible technical overview of how it works:
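To make the ACLU’s definition concrete: at a high level, a facial recognition system converts each face image into a list of numbers (an “embedding”) and then compares those lists to a database of enrolled faces. The sketch below illustrates only the matching step; the names, vectors, and threshold are all invented for illustration and don’t reflect any vendor’s actual system:

```python
import math

# Hypothetical 4-dimensional "embeddings". Real systems use a neural
# network to turn a face image into a vector of 128+ numbers; these
# values are made up for illustration.
enrolled = {
    "alice": [0.9, 0.1, 0.3, 0.5],
    "bob":   [0.2, 0.8, 0.6, 0.1],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(probe, threshold=0.95):
    # Compare the probe against every enrolled face and return the best
    # match, but only if it clears the similarity threshold. The choice
    # of threshold trades false matches against false non-matches.
    name, score = max(
        ((n, cosine_similarity(probe, e)) for n, e in enrolled.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None

probe = [0.88, 0.12, 0.28, 0.52]  # embedding from a new camera capture
print(identify(probe))            # matches "alice"
```

Notice that the system always produces a “closest” face; whether that counts as a match depends entirely on a threshold someone chose, which is exactly where biased error rates and misidentifications enter.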
Understand the impact of surveillance tools
Joy Buolamwini is a leading researcher on facial recognition tools. She has several talks that cover this subject. I’m sharing one here but I recommend following her on Twitter and watching her other talks.
Next, watch this overview on the impact of facial recognition technology in communities of color in Detroit as well as the work being done to push back against facial recognition. It features Tawana Petty, the Director of the Data Justice Program at the Detroit Community Technology Project.
Then listen to how facial recognition tools affect nonbinary and trans people.
Finally, read up on how surveillance affects a population overall and why it’s important to prioritize data privacy for all:
8 Things You Need to Know about Surveillance
More recommended reading to round out your understanding:
- Flawed Face Data by Georgetown Law’s Center on Privacy &amp; Technology
- AI Now Institute’s annual report (2019) covering the pushback against harmful AI (including facial recognition)
- The Data Justice work and community organizing at Detroit Community Technology Project
If podcasts are more your style, the Reset podcast has been reporting on facial recognition recently. Check out these episodes:
- Students Are Being Watched Closely (this is about high schools but imagine how you might feel if this were your campus)
- The Government Has Your Face
- A Startup, Billions of Images, and the End of Anonymity
Connect with students to understand their thoughts on facial recognition on campus
“We don’t really know what facial recognition technology could amount to, and we don’t know the psychological effects of that either,” Dewan says. “But America’s youth shouldn’t be the place to experiment on that, especially America’s youth who are going to try to educate themselves and expand their mind and learn things from multiple perspectives.”
‘This is a racial justice issue’: Students organize to stop facial recognition on campus
Students are starting to organize against facial recognition on campus.
Recently, students at UCLA fought back against plans to use facial recognition on campus:
The idea was to have the University of California Los Angeles use facial recognition as a way to gain access to buildings, to prove authenticity and to deny entry to people with restricted access to the campus, matching their faces against a database. Advocacy group Fight for the Future says UCLA was the first major university exploring using facial recognition to monitor students. The group had tested facial recognition software and found that “dozens” of student-athletes and professors were incorrectly matched with photos from a mug shot database, “and the overwhelming majority of those misidentified were people of color.”
UCLA drops controversial face recognition plan
So take time to see what students are thinking about on your campus. Talk to your student government, visit with student clubs, and ask your student newspaper about their thoughts on facial recognition on campus. Show your support for their work and ask how you can support them further.
Ask leadership about future plans to use facial recognition
If you aren’t sure whether the leadership on your campus is planning to use facial recognition, it’s time to ask. If the answer is yes, ask follow-up questions:
- Can students or staff opt out of facial recognition?
- Will students or staff be penalized for opting out?
- Who has access to biometric data collected on students?
- Has leadership consulted with anyone from communities of color on campus about efforts to implement facial recognition on campus?
- What are leadership’s plans to be transparent about the implementation of facial recognition?
- How is leadership listening to perspectives from communities affected most by surveillance tools?
- What happens when someone is incorrectly identified by the facial recognition system? What recourse will that person have?
- What auditing procedures are in place to ensure facial recognition performs accurately?
- Will facial recognition data be shared with law enforcement agencies?
- What are other solutions to the problem that do not use facial recognition technology? Why aren’t they being considered?
Ask follow-up questions that push for transparency, include diverse perspectives, and ultimately push back on the use of facial recognition on campus.
Join the fight: National Day of Action on March 2
Dozens of national civil liberty organizations have also called for administrators to ban the technology on the grounds that it would reduce the safety of students of color, expose biometric data to potential hackers, and create an environment undermining academic freedom and personal expression.
This College Banned Students From Even Discussing Facial Recognition
Fight for the Future is leading a national day of action in collaboration with student groups to stop facial recognition on campus. Be part of the movement to ban facial recognition on campus and show solidarity with students.
Get more information here. You can also help coordinate a day of action on your campus by emailing Fight for the Future at team@fightforthefuture.org.