
What If Our Clothes Could Disrupt Surveillance Cameras?
Episode 12 | 9m 23s | Video has Closed Captions
As mass surveillance reaches an all-time high, some fashion designers are taking a stand.
What do you get when you combine mass surveillance with A.I.? It’s a dystopia that’s already a reality in places across the world. Fashion designers are pushing back, crafting clothing and accessories that trick facial recognition software into mislabeling a person as something else, like…a giraffe or a zebra. How can you escape constant surveillance? Sinéad Bovell takes a closer look.
Funding for FAR OUT is provided by the National Science Foundation.

- Imagine you're on a sidewalk and you wanna cross the street, but the crosswalk is just a little too far away.
So you look left, you look right and you dart across the road.
A week later, you get a ticket in the mail for jaywalking.
The camera on the streetlight above you was equipped with facial recognition software and law enforcement matched your face with your license photo and found your address.
A few days later, you walk into a concert and security flags your expression as menacing or dangerous.
You're escorted out.
It's a dystopian future that's actually already a reality in places across the world.
So here's the question.
Is constant mass surveillance like this inevitable here in the U.S.?
- My fear is that if we don't win in this moment, where the technology is still new enough to be shocking, where the outrages and abuses are still salient and it hasn't yet become completely normalized as a facet of life, I worry that if we don't win this fight now, it will just become accepted the way a lot of problematic invasions of our privacy already have.
- I'm Sinéad Bovell, and this is Far Out.
[upbeat music] Most of us recognize and even shrug at the fact that every move we make online is being tracked.
But what about the real world?
Whatever that means.
How much do we really know about how we're being tracked in public?
Here in the U.S., the number of surveillance cameras grew nearly 50% between 2015 and 2018, ballooning from 47 million to 70 million. Those cameras can all be equipped with facial recognition systems.
And in the last decade, these systems have gotten way better and cheaper, mostly because of, yep, you guessed it, AI.
- We see CCTV everywhere.
We see both government owned systems and private systems.
But increasingly there isn't much of a distinction between the two.
As you know, local police departments, federal agencies, and everything in between will contract with individual companies and service providers to transform those private surveillance systems into a public policing tool.
It's an opportunity to add all of these layers of surveillance software on top of that, building an ever more intricate and invasive web of tracking that can, at this point, look at nearly every moment of how we spend the day.
- So you can't exactly avoid surveillance technology unless you never leave the house, but you can wear something like this, or this, or this.
These are all examples of what's called adversarial fashion, clothing and accessories designed to trick facial recognition software into mislabeling a person as something else like a giraffe or a zebra.
Activists sometimes use umbrellas or all black clothing to remain anonymous during a protest.
But these designs specifically target the software that powers facial recognition technology.
To understand how adversarial fashion works, we first need to understand how facial recognition works.
The way humans see is actually a decent metaphor for how machines see.
You see someone, your brain interprets their features and then you recall their name from your memory.
Machines do something similar.
The camera captures an image.
That image is fed into an artificial neural network, where it passes through layers that pick out features like edges, textures, and shapes.
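To make that "layers" idea concrete, here's a minimal sketch in PyTorch. It's an illustration, not any production face-recognition system; the layer sizes and image dimensions are made up.

```python
# A minimal sketch of how an image passes through stacked convolutional
# layers, each picking out progressively more abstract features.
import torch
import torch.nn as nn

feature_extractor = nn.Sequential(
    # Early layers respond to simple patterns like edges.
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # Middle layers combine edges into textures.
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # Later layers combine textures into shapes (eyes, noses, ears...).
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
)

image = torch.rand(1, 3, 128, 128)   # a stand-in for one camera frame
features = feature_extractor(image)  # -> torch.Size([1, 64, 16, 16])
print(features.shape)
```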
We've trained neural networks to find patterns by giving them massive data sets with labeled images.
And this is creepy: it's likely all of our faces are in those data sets.
The ethics of that could be an episode all on its own.
Yes, academics might train their systems using publicly available images, but private companies are scraping the data we've all put on social media.
The network compares all that data to the original photo and comes up with a match.
And that's a very basic explanation of how machine learning works.
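For the curious, here's a hedged sketch of that matching step, assuming the network has already condensed each face into an embedding vector. The gallery names, threshold, and 128-dimensional vectors are all made up for illustration.

```python
# A toy version of face matching: compare embedding vectors and pick
# the closest known identity, if it's close enough.
import torch
import torch.nn.functional as F

def best_match(probe, gallery, threshold=0.6):
    """Compare one face embedding against a labeled gallery using
    cosine similarity; return the closest identity above a threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = F.cosine_similarity(probe, embedding, dim=0).item()
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else "no match"

# Random 128-dimensional vectors standing in for real network outputs.
gallery = {"alice": torch.randn(128), "bob": torch.randn(128)}
probe = gallery["alice"] + 0.05 * torch.randn(128)  # a noisy photo of "alice"
print(best_match(probe, gallery))  # almost certainly prints "alice"
```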
And scientists actually don't understand the finer details, for example, why the algorithms that power these systems make certain choices.
Like how did it know that this nose matches that nose?
That's where our knowledge ends.
We don't know how the algorithm knows about noses.
We just gave them this massive data set and a goal: match these faces. How they do it is still a mystery.
Neural networks were inspired by the human brain: layers of artificial neurons working together to transform inputs into information we can use.
But at this point in human history, both human brains and artificial brains are still mystifying to us.
That uncertainty is very unsettling especially when AI is used to conduct surveillance which brings us all the way back to adversarial fashion.
Adversarial clothing confuses the part of the neural network that categorizes objects.
So these exaggerated ears and noses make it harder for the machine to say with confidence, "Yep, that's a human."
Researchers have also designed attacks on neural networks that make photos unrecognizable.
This one looks like static to us, but when you layer it over a panda, the system tags it as a gibbon, which is a type of small ape, if, like me, you didn't know.
But it's not easy to make these images.
You often need access to the neural network itself in order to design an attack on it.
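Here's a minimal sketch of the fast gradient sign method (FGSM), the best-known attack of this kind and the one behind the famous panda-to-gibbon demo. It assumes exactly that kind of white-box access: you need the model's gradients. The model choice and epsilon value are illustrative, not from the original research.

```python
# A sketch of FGSM: nudge every pixel a tiny step in the direction that
# most increases the classifier's error. The change looks like faint static.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18(weights="IMAGENET1K_V1").eval()

def fgsm_attack(image, true_label, epsilon=0.01):
    image = image.clone().requires_grad_(True)
    loss = F.cross_entropy(model(image), torch.tensor([true_label]))
    loss.backward()  # white-box access: we can read the gradients
    # Move each pixel by +/- epsilon along the sign of its gradient.
    return (image + epsilon * image.grad.sign()).detach().clamp(0, 1)

clean = torch.rand(1, 3, 224, 224)  # a stand-in for a photo of a panda
adversarial = fgsm_attack(clean, true_label=388)  # 388 = ImageNet "giant panda"
print(model(adversarial).argmax().item())  # often no longer 388
```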
And they may not work on every system.
It's also not a solution that scales well.
If everyone is wearing adversarial fashion, then the networks will eventually get better at detecting them.
- The real solutions are policy solutions: enacting a privacy law, enacting limits on law enforcement, enacting laws like Illinois' BIPA, the biometric privacy law in Illinois, and other laws to just stop, you know, people from being able to use this stuff.
I mean, it's interesting.
I think we've reached a moment in our society where we actually don't think law could ever be on our side.
And I understand, as somebody who works really hard to change policy and law, why some people might feel this, you know, privacy nihilism, but individual solutions are not gonna get us there.
- So what will get us there?
And do we all agree that this kind of surveillance is a problem?
How you feel about surveillance might depend on how you feel about the people being surveilled.
For example, the FBI used facial recognition to identify January 6th insurrectionists, but it's also been used to surveil and track down protestors after Black Lives Matter protests.
- Lots of people say, well, if you're doing nothing wrong then why are you worried?
But I've yet to meet a person who actually stands by that argument in every context.
You know, maybe they think that's no big deal if the police have that information.
But what happens when the IRS has it?
We may trust different institutions to wield this power, but none of us trust every institution that's wielding it to do so unchecked.
- We know from research on U.S. surveillance of Muslims after 9/11 that constant surveillance changes people's behavior.
They're less likely to participate in free speech activities like protests.
In fact, they're less likely to participate in any social activity like going to a mosque, or seeing friends.
So it's easy to imagine how these tools can become oppressive over time.
- We built these systems on the argument that we will never have any bad rulers who will use it for bad purposes.
I think that's a bad bet.
They're especially problematic for the people who've always been over-surveilled in our society.
People from marginalized groups or people who are trying to make change in the law or in our world.
- Some facial recognition systems are far more likely to misidentify racial minorities, which may be because their training data sets contained too many white faces.
The technology is improving all the time, but that's cold comfort to privacy advocates, who say we don't have any evidence showing that surveillance makes us safer.
A 2020 meta-analysis of surveillance research found that the presence of CCTV cameras led to a 13% reduction in theft, but had no effect on violent crime.
And that study focused just on crime prevention.
We don't know if facial recognition technology is helping solve more crimes because it hasn't been rigorously studied.
- The police really don't have any regular requirement that they figure out whether what they're doing is working, and that's a good place to start.
Things like transparency ordinances and accountability measures that we can agree on for questions like, you know, did putting cameras up all over our public spaces actually help us reduce crime in the way that we would like to?
- Just because the technology is available it doesn't always mean we should use it.
I don't think constant mass surveillance by the private or public sector is a society we wanna move towards.
The European Union has created some specific guidelines for facial recognition technology that I think are the right approach.
For example, it can only be used as a tool to investigate serious crimes and not to constantly surveil the public.
But the U.S. still lacks a clear federal policy that guides how we use this technology.
And until that changes, there will likely be a hodgepodge of rules that vary depending on where you live.
It might feel like mass surveillance isn't slowing down, but how we respond to it is still an open question.
[suspenseful music] ♪