Episode 005 of EFF’s How to Fix the Internet
Abi Hassen joins EFF hosts Cindy Cohn and Danny O’Brien as they discuss the rise of facial recognition technology, how this increasingly powerful identification tool is ending up in the hands of law enforcement, and what that means for the future of public protest and the right to assemble and associate in public places.
In this episode you’ll learn about:
- The Black Movement Law Project, which Abi co-founded, and how it has evolved over time to meet the needs of protesters;
- Why the presumption that people don’t have any right to privacy in public spaces is challenged by increasingly powerful identification technologies;
- Why we may need to think big when it comes to updating the U.S. law to protect privacy;
- How face recognition technology can have a chilling effect on public participation, even when the technology isn’t accurate;
- How face recognition technology is already leading to the wrongful arrest of innocent people, as seen in a recent case of a man in Detroit;
- How gang laws and anti-terrorism laws have been the foundation of legal tools that can now be deployed against political activists;
- Understanding face recognition technology within the context of a range of powerful surveillance tools in the hands of law enforcement;
- How we can start to fix the problems caused by facial recognition through increased transparency, community control, and hard limits on law enforcement use of face recognition technology; and
- Why Abi sees the further goal as moving beyond restricting or regulating specific technologies to a world where public protests are not so necessary, as part of reimagining the role of law enforcement.
Abi is a political philosophy student, attorney, technologist, and co-founder of the Black Movement-Law Project, a legal support rapid response group that grew out of the uprisings in Ferguson, Baltimore, and elsewhere. He is also a partner (currently on leave) at O’Neill and Hassen LLP, a law practice focused on indigent criminal defense. Prior to his current positions, he was the Mass Defense Coordinator at the National Lawyers Guild. Abi has also worked as a political campaign manager and strategist, union organizer, and community organizer. He conducts trainings, speaks, and writes on topics of race, technology, (in)justice, and the law. Abi is particularly interested in exploring the dynamic nature of institutions, political movements, and their interactions from the perspective of complex systems theory. You can find Abi on Twitter at @AbiHassen, and his website is https://AbiHassen.com
Please subscribe to How to Fix the Internet via RSS, Stitcher, TuneIn, Apple Podcasts, Google Podcasts, Spotify or your podcast player of choice. You can also find the MP3 of this episode on the Internet Archive. If you have any feedback on this episode, please email podcast@eff.org.
Below, you’ll find legal resources – including links to important cases, books, and briefs discussed in the podcast – as well as a full transcript of the audio.
Resources
Current State of Surveillance
- How the U.S. Military Buys Location Data from Ordinary Apps (Vice)
- Under Surveillance: Examining Facebook’s Spiral of Silence Effects in the Wake of NSA Internet Monitoring (Journalism & Mass Communication Quarterly)
- The Secretive Company That Might End Privacy as We Know It (NYT)
- Amazon’s Ring is Perfect Storm of Privacy Threats (EFF)
- Forensics Gone Wrong: When DNA Snares the Innocent (Science Magazine)
- Strengthening Forensic Science in the United States: A Path Forward (National Research Council)
- The Crypto Wars: Governments Working to Undermine Encryption (EFF)
European Regulation of Data and Privacy
- The European Union’s General Data Protection Regulation (GDPR)
- European Parliament’s Personal Data Protection Fact Sheet
State Use and Mis-Use of Surveillance
- Wrongfully Accused by an Algorithm (NYT)
- Facial Recognition Linked to a Second Wrongful Arrest by Detroit Police (Engadget)
- Cops in Miami, NYC Arrest Protesters from Facial Recognition Matches (Ars Technica)
- One Month, 500,000 Face Scans: How China is Using A.I. to Profile a Minority (NYT)
- NSA Spying (EFF)
Flaws and Consequences of Surveillance
- Many Facial-Recognition Systems Are Biased, Says U.S. Study (NYT)
- The Chilling Effect of Government Surveillance Programs on the Use of the Internet by Muslim-Americans (University of Maryland Law Journal of Race, Religion, Gender and Class)
- Ringing Alarm Bells: A Study of Implicit Bias in Consumer Surveillance Device Use in San Francisco (Media Alliance)
Protecting Oneself from Surveillance
- Tech & Activism: How to Protect Your Privacy Rights (EFF)
- You Have a First Amendment Right to Record the Police (EFF)
- Digital Security Advice for Journalists Covering the Protests Against Police Violence (EFF)
- A Quick and Dirty Guide to Cell Phone Surveillance at Protests (EFF)
- How to Identify Visible (and Invisible) Surveillance at Protests (EFF)
- EFF’s Surveillance Self-Defense Guide: Attending a Protest (EFF)
Lawsuits Against Facial Recognition and Surveillance
- Williams v. San Francisco (EFF and ACLU Northern California’s Ongoing Case Against Police Surveillance During Black-Led Protests)
- EFF’s Amicus Brief on Article 19 and Privacy International re. Doe 1 v. Cisco
Surveillance and Black-Led Movements
- Abi Hassen
- Black Movement-Law Project
- Black Lives Matter, Online and in the Streets: Statement from EFF in the Wake of the Police Killings of Breonna Taylor and George Floyd (EFF)
- Digital Rights and the Black-Led Movement Against Police Violence (EFF)
Activism Against Surveillance
- ACLU Calls on Lawmakers to Immediately Stop Law Enforcement Use of Face Recognition Technology (ACLU)
- California Coalition Calls for Moratorium on State Gang Database (EFF)
- Join the Movement for Community Control over Police Surveillance (EFF)
- You Should Have the Right to Sue Companies that Violate Your Privacy (EFF)
- About Face (Coalition Commitment to Ending Government Use of Face Surveillance)
Transcript of Episode 005: From Your Face to Their Database
Danny O’Brien:
Welcome to How to Fix the Internet with the Electronic Frontier Foundation, a podcast that explores some of the biggest problems we face online right now, problems whose source and solution are often buried in the obscure twists of technological development, societal change, and the subtle details of Internet law.
Cindy Cohn:
Hi everyone. I'm Cindy Cohn, and I'm the Executive Director of the Electronic Frontier Foundation. And like a lot of us here, I'm a lawyer.
Danny O'Brien:
And I'm Danny O'Brien. I work at EFF too, but I'm not a lawyer.
Cindy Cohn:
Then what are you, Danny?
Danny O'Brien:
I’ve spent so long with lawyers, I've kind of forgotten what I am. It's a bit like if you're raised by wolves.
Cindy Cohn:
Well, this week, we're tackling facial recognition, which will tell us whether you've turned into a wolf, Danny. In the last few years, face recognition has gone from a high-tech party trick to a serious threat to civil liberties. Companies are already touting the ability to turn any photo into a name and identity based on pictures taken from private records and also the public Internet. Cities and police forces are being sold equipment that can identify and track citizens as they go about their business in real time, and then permanently record that information for later investigations or, as we've seen, misuse.
Danny O'Brien:
I think most people have yet to realize just how good facial recognition has gotten recently. I think it's reached the point where it's a perfectly reasonable thing to expect the software to do, that you can take a photograph of a demonstration or live video, and the facial recognition software will be able to pick out the faces from a crowd. All of the faces, or as many as it can, and then correlate those to a database. A database that could contain everybody who's put their faces up on the Internet in a photograph or even a profile picture. That's a reasonable thing to expect modern facial recognition software to do. And that's the pitch that's being given to law enforcement by commercial companies selling this technology, and at a pretty cheap price as well. This is getting to the point of being off-the-shelf software rather than an expensive service that maybe only the NSA can use or large companies can fund.
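To give a sense of just how off-the-shelf this has become, here is a minimal sketch of the pipeline Danny describes: detect every face in a crowd photo, compute a numeric "faceprint" for each, and compare them all against a database of known faces. It uses the open-source face_recognition Python library; the image files and the tiny two-person "database" here are hypothetical placeholders, and a real deployment would work at vastly larger scale.

```python
# A minimal sketch of crowd-photo face matching, using the open-source
# face_recognition library (https://github.com/ageitgey/face_recognition).
# The image files and the enrolled "database" are hypothetical placeholders.
import face_recognition

# Enroll a small database of known faces: one 128-number encoding per photo.
# (Assumes each enrollment photo contains exactly one detectable face.)
known_names = ["person_a", "person_b"]
known_encodings = [
    face_recognition.face_encodings(
        face_recognition.load_image_file(f"{name}.jpg"))[0]
    for name in known_names
]

# Load one crowd photo, find every face in it, and encode each one.
crowd = face_recognition.load_image_file("protest_crowd.jpg")
locations = face_recognition.face_locations(crowd)
encodings = face_recognition.face_encodings(crowd, known_face_locations=locations)

# Compare every detected face against the whole database.
for loc, enc in zip(locations, encodings):
    matches = face_recognition.compare_faces(known_encodings, enc, tolerance=0.6)
    for name, is_match in zip(known_names, matches):
        if is_match:
            print(f"Possible match: {name} at location {loc}")
```

The same loop, run over live video frames against a database of millions of scraped profile photos, is essentially the product being pitched to police departments.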
Cindy Cohn:
At the same time that facial recognition is getting really good in some ways, it's also still really bad in some others, and quite dangerous. The results are often terribly biased. These systems fail to identify nonwhite people and People of Color correctly far more often than they fail with white people. The technology is often embedded in systems and structures and policies that are racist as well. AI and machine learning inferences can only guess about the future based upon the data you've fed them about the past, so if the training data is slanted or racist, the guesses about the future will be, too. In addition, there's a growing body of research and a budding set of tools being developed to fool facial recognition. In COVID times, we're seeing that masks are causing failures as well. We've already seen the first couple of false arrests based on bad uses of facial recognition, with more on the way. And it's only a matter of time before this spills into political protests.
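One way to see Cindy's point about biased error rates: even a system with a single, seemingly neutral match threshold will produce unequal false-match rates if its training data taught it to separate some groups' faces less well than others. Here is a toy illustration with entirely made-up numbers:

```python
# A toy illustration (all numbers are made up) of how one global match
# threshold can yield very different false-match rates across groups.
import numpy as np

rng = np.random.default_rng(0)
threshold = 0.6  # face-pair distances below this count as a "match"

# Hypothetical distance scores for pairs of DIFFERENT people. A system
# trained mostly on group A separates group B's faces less well, so
# group B's non-matching pairs land closer together on average.
nonmatch_scores = {
    "group_a": rng.normal(0.9, 0.1, 10_000),
    "group_b": rng.normal(0.75, 0.1, 10_000),
}

for group, scores in nonmatch_scores.items():
    false_match_rate = (scores < threshold).mean()
    print(f"{group}: false match rate = {false_match_rate:.2%}")
```

With these invented distributions, group B's false-match rate comes out dozens of times higher than group A's at the very same threshold, which is the shape of the disparity the U.S. government study linked above found in real systems.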
Danny O'Brien:
This is a sort of paradox that we see a lot at EFF in emerging technologies. If the technology really worked as well as it's being hyped, it's maybe terrifying for civil liberties. But it's still bad, even when it doesn't live up to those promises because it fails in ways that the authorities refuse to acknowledge or mitigate against.
Cindy Cohn:
Yeah, so we're damned if it works, and we're damned if it doesn't.
Danny O'Brien:
Joining us today is Abi Hassen, co-founder of the Black Movement Law Project, who has been watching just how facial recognition can be misused to silence dissent and track legitimate protest. He's been a key figure in the campaign to place limits on the use of this anything-but-benign technology, which is increasingly showing up on the streets during protests. Welcome, Abi.
Abi Hassen:
Thank you so much, Danny and Cindy, for having me.
Cindy Cohn:
For our purposes, of course, Abi is a lawyer as well. So how did you get involved in all of this? The Black Movement Law Project is near and dear to our hearts here at EFF because our colleague Nash was also one of the co-founders. But how did you get involved?
Abi Hassen:
So, I started my legal career through several kinds of circuitous routes around the law. First, as a labor organizer and community activist and community organizer, kind of a mixed role, but heavily involved in labor. And then I ended up transitioning to a more protest-focused job where I got hired at the National Lawyers Guild, and coincidentally, I got hired there about six months before Occupy Wall Street started. So I cut my teeth in the world of protest law and protest activity at a pretty opportune time for seeing some action, so to speak. And then as that Occupy moment changed, and Trayvon Martin, and then Ferguson, and then Baltimore started happening, I ended up going down to Baltimore to help with doing some legal support.
Abi Hassen:
I met some other folks, and we started the Black Movement Law Project out of the recognition that the protest legal space needed some explicit Black leadership, and to help try to develop that, and to work in places that didn't have existing legal infrastructure: those mid-sized cities where we were seeing a lot of insurrection and protests, but that weren't like the Bay Area or Los Angeles or New York, where there are long-established legal support communities. So that's how I got involved. And then, the surveillance and protest stuff just naturally came out of working with activists. Working with activists in this post-Snowden moment, and then working with activists... I mean, the dynamics obviously changed even more during the Trump moment. But it just became clearer and clearer that digital security and digital self-defense were more and more on people's minds.
Cindy Cohn:
Can you give us just a thumbnail sketch of what Black Movement Law Project does?
Abi Hassen:
It’s been transitioning as the political moments have changed. For those first few years, in 2015, 2016, into 2017, it was a lot of on-the-ground work: helping develop jail support, helping people do basic protest training, which is know-your-rights and how to set up your own legal support, how to help guide people through the system, and then the volunteer management and stuff that goes along with that. And our model was really based on helping people locally figure out how to do it themselves, not being in charge of it and not trying to run it.
Abi Hassen:
But, I mean, obviously that went back and forth into lots of different places. And then, like I said, it became clearer and clearer that there was a need for digital security training. We saw more and more police just overtly using social media. And all this stuff about Stingrays started coming out. And this is all, again, post-Snowden, so there was already a level of concern. So we actually partnered with EFF pretty early on in that time and started a series of digital security trainings across the country. And then I ended up working with EFF also on training tools and stuff like that. And then, as the moment has changed yet again, with Trump, and then COVID, I've been doing some remote trainings, and I've been working on some research projects.
Abi Hassen:
I think that the BMLP as what it was in those early days is not functioning in that way, because we're all doing other things primarily, and doing the BMLP stuff as a second job, if you will. And so we're trying to take the knowledge we have and turn it into tools, turn it into trainings. I'm actually studying now; I'm doing a degree in political theory. So, trying to bring that experience also into more theoretical spaces. It's kind of a mixed bag, but it's really still developing tools for social change, for political movements, out of our experience and collective knowledge.
Cindy Cohn:
I mean, I think that an organization that can move with the needs of the community is really a healthy one. I want to ask a little more personal question. What fuels you about this? Why are you passionate about this? Where does it come from inside you?
Abi Hassen:
I know that your question is not just about technology. So, just a little background: my undergraduate education was in philosophy and computer science. And so, I've always been interested in these deeper questions of how the world works. And I think that learning about technology, and learning about the interaction of technology and society, is a helpful way to gain insight into how the world works and how politics works. But I've also always had a strong kind of, I don't know if it's a moral intuition or what, but I feel like: what is the thing that I can do at this moment that can help what I consider to be the problems in the world?
Abi Hassen:
And so, that's led me on this journey from labor to protest to politics. Just trying to figure out how these things work on the one hand, and how to best intervene on the other. That was probably the crazy mistake that led me into law school, like the rest of us. It took me a long time to realize that the law is not what I thought it was. But it is also a venue for creating change, maybe not in the way I thought it was before going to law school, if that makes sense.
Cindy Cohn:
I think that's a realization a lot of us make in law school. I mean, you're describing a journey that feels very much like mine, I have to say, in terms of just having this inner feeling like your job is to figure out where you can help and where your tools and skills will help the best. And that often does lead you on a, I wouldn't say circuitous, but it's a dynamic conversation that you're tending to have with your skills and the world, and what the world needs.
Danny O'Brien:
And as the non-lawyer, I feel like a sort of anthropologist here. It's actually amazing, that passage from studying computer science, or having the vague outlines and intuitions about technology, to realizing that that's an important perspective in law, too. Especially in times of rapid technological change, where the technology changes how civil liberties need to be interpreted in the world.
Cindy Cohn:
How would you talk about the risks of face recognition being used by law enforcement during protests, or other public political activities?
Abi Hassen:
If we play out where the technology is going, and think about what capacities that could mean for law enforcement, especially coupled with an increasingly aggressive and, I think it's fair to say, anti-democratic attitude in law enforcement generally, the implications for just plain chilling of speech are tremendous. I'll talk about it a little bit personally, about my family. My father came here from Ethiopia, and a lot of my family is from there. For example, in the post-9/11 moment, just seeing family members tell their children, "No, don't take pictures of anything. We're not allowed to take photos, because we're Muslim." Right?
Abi Hassen:
Or when Trump was elected, just seeing cousins and other relatives, children, teenagers, just terrified. They were born in this country. They have no reason, legally, to be scared. So, if you think about it, I mean, that's a specific community, maybe that's a specific context. But that has real effects on people's willingness to participate in what is considered a right of an American, which is to be an active political participant in society. And that's just a spectrum; maybe that's on one end of the spectrum.
Abi Hassen:
But at every level, there is that increased capacity of law enforcement to know exactly where you are and what you're doing at all moments in your life, coupled with a political system and legal institutions that are at times just overtly anti-immigrant, anti-democratic, anti-left. So, I think that it's one more element in a pretty scary trajectory for the kind of active public participation that I think we would want to see, in a world that is clearly not in a stable and healthy place.
Cindy Cohn:
Yeah. I think that's right. It becomes more important for people to be able to have a little zone of privacy around these activities. And so, one of the issues that I think we're struggling with around facial recognition technology and the law is that historically, the law and the courts have held that when you're out on the streets, you're in public and you don't have a right to privacy. And this is something that we're struggling with a little bit. And I wondered how you think about that, and how we might need to push the law a bit, or address that problem in protecting people against face recognition.
Abi Hassen:
Historically, our constitution has been amended basically every generation. I think the last time was sometime in the eighties, probably, is that right? But it's been quite a long time since anything like that has happened. And there are a lot of reasons for that; I don't want to just throw that out there lightly. But if we think about the invention of the Internet and its ramifications on society, we've done nothing at all, nothing commensurate, legally, with the massive social change and economic change, frankly, that that creation has wrought. And so I think we need to be thinking big about this, right?
Abi Hassen:
I think that the European counterpart, the GDPR, is something that is at least putting a stake in at a slightly higher level. But I think we need to think big because, something you mentioned just a moment ago, the zone of privacy, is something that we really should engage with and take seriously. Because truly, if we think about what the contours of privacy were, even at the time of the Constitution, it's different in different arenas, right? Maybe your neighbor was more likely to spy on you walking out or whatever. But certainly vis-a-vis the state, the zone of privacy has massively shrunk.
Danny O'Brien:
I find that take really interesting, because you mentioned the GDPR, and I think some of what makes the GDPR so powerful comes from almost an accident of history in the European Union. The European Union is in some ways a young country too, although it's made up of far older nations. And just by an accident of timing, its main constitutional document, the Treaty of Lisbon, was crafted in 2007. Because of that timing, it managed to embed into the fabric of the European Union a relatively new right: the right of data protection, which is seen as separate from a privacy right. So I do think there's an importance to creating something that has the feeling of a constitutional-level decision, but is about the modern world.
Cindy Cohn:
The other thing I think about with regard to this question of how you get privacy in public is that the specific context we're talking about here is core First Amendment context, right? The right to free speech, the right to petition the government for grievances. And to me, face recognition applied to people when they're out engaging in public protests ought to be protected by the very strong framework of the First Amendment, rather than the relatively weak one of the Fourth. We still have a ways to go -- the courts haven't bought that argument. I keep trying to tee it up so we can raise it. But I think there are several ways to get at this. One of them is thinking big about where we want to go in the future. And the other is maybe thinking a little harder about the tools that are already in our Constitution.
Abi Hassen:
Yeah, I'm right with you on the first statement. I don't know the current status of success, but one of the things that I feel is on the table is that First Amendment assembly is just very underdeveloped, right? What does assembly actually mean today?
Cindy Cohn:
Yeah, I agree. Assembly, association, all of the things that I think we're getting at when we ask why people come together in the streets to try to demand change. There are actually words in the Constitution that reference that, but we haven't given them the kind of life that I think we could.
Danny O'Brien:
And one thing that we were talking about earlier, and I think you brought out, is this: face recognition doesn't need to work to have a chilling effect on protest. And I think it's often very hard because people know that this invisible technology is around, and they're worried about what they can and can't do. And that actually prevents them from protesting, unless they're absolutely desperate. When you're doing security trainings and explaining the capabilities of law enforcement, how do you balance that line between explaining what's possible and not scaring people so much that they don’t actually exercise their constitutional rights?
Abi Hassen:
That's always a struggle, and that's a dynamic that just... What I generally do is encourage people to, one, put all their cards on the table in terms of thinking about their risks and being very aware of them. But I also think it's important to understand the institutional prerogatives of law enforcement and try to take that into account. Because especially after Snowden, everyone was convinced that the NSA was spying on them. And you have to have conversations where it's like: look, the NSA has X number of analysts, right? They're not looking at you. Right? Or, these are who their priorities are. Right?
Abi Hassen:
Like the people who are getting their laptops taken… And obviously that “you” is contextual, because maybe they are looking at you, right? That's why it's important to understand your positionality: to understand your positionality and the ideology and priorities of law enforcement. So that's an important conversation to have. But the scary thing is, the more it becomes turnkey, the more it becomes these AI systems that are just spitting out suspects to law enforcement, the more integrated those systems become, the less actual work law enforcement has to do, which means the more everyone actually is at risk, regardless of the institutional capacities.
Cindy Cohn:
Another issue that EFF spends a lot of time talking about is this: Who's selling this to government? Who's trying to make this as turnkey as possible? We're seeing increasing public-private partnerships around surveillance generally, and I just don't think face recognition is very far behind. Right now I'm thinking specifically of Amazon Ring, which partners with law enforcement to promote its cameras to homeowners, and then suggests that people share their feeds with law enforcement. Or the case that EFF is litigating now, the Williams case, where a rich guy in San Francisco bought a bunch of cameras and gave them to local business districts, and then said, no, the cops won't have access to this. And then we discovered that during the Black Lives Matter protests earlier this year, and even the pride parade, the cops did get access in real time to those cameras. EFF and the ACLU are suing over that one. But how do you talk to likely impacted communities about these kinds of things?
Abi Hassen:
These companies are profit-seeking enterprises, and in a depression, with advertising revenue falling, where are they going to go? It seems pretty clear that it's law enforcement, because we're not cutting a lot of law enforcement budgets, and government contracting is always a good source of income. I mean, how do we talk about those things? I think you have to have those conversations about: the bigger the threat, the bigger the coalition needs to be to counter it. In some sense, it's an opportunity to say these aren't siloed conversations. Black Lives Matter, or anything that's viewed as anti-police, or trying to counter police violence, can't be separated from, basically, the political economy of surveillance.
Abi Hassen:
These actually are completely overlapping and intersecting problems. The threat is merging, and the response has to merge also. And it's an opportunity to say: look, I don't have the answer, and we can't picket Amazon to any effect right now, but we can support the workers at Amazon who are trying to organize a union. We can support our congressperson who is trying to do a report on monopolistic practices in Silicon Valley. We can try to build a kind of political consciousness about anti-monopoly. We can try to create coalitions where tech people who are focused on the technical threats or the civil liberties threats are learning from and cross-pollinating with people who are working on other issues, to expand our coalitions.
Danny O'Brien:
Yeah, and I think you have to spell out these links, too, and that's one of the big challenges. Because for a long time, there was this sort of strange and arbitrary division between people worrying about government surveillance and people worrying about corporate surveillance, and of course the last few years have shown that those are the same problem. And one of the challenges we face, and I think this is true in a lot of spaces, is that first of all, geeks who are our base love new technology, so they're the first adopters of things like Ring and surveillance cameras, and also, people worried about their safety and who don't trust law enforcement, or aren't being served by law enforcement, also invest in these surveillance programs.
Danny O'Brien:
We have this thing where we're actually talking to people who should be the most knowledgeable, who should be the people most concerned about these alliances. And they're actually being drawn into being complicit with their own neighborhood surveillance. And I have to say, I was sort of worried about this, but was really impressed by how quickly everyone gets it once you paint that picture. I think we saw that in San Francisco with this sort of public safety surveillance program, and I think the activism around Ring is also going that way.
Abi Hassen:
Yeah, I mean, part of why I've taken this last career trajectory, doing political philosophy or political theory, is that I found myself teaching technology to activists, and teaching politics to technologists, and teaching both to lawyers. And I feel like that kind of integration is what we need to do. Never do a technical demonstration without some kind of hook into the broader political frameworks. Use all of those things as opportunities to do more and to expand.
Cindy Cohn:
I really like that. What I hear you saying is that in some ways it is all connected: our standing up and helping the folks inside these companies who want to organize, whether that's bringing in a union or otherwise having a bigger voice in what's going on, is part of how we help protect protesters from these technologies. Because if we empower folks to have a bigger voice in this, they can begin to have a bigger say in what kinds of tools are being developed and who they're being sold to. The thing that is especially troubling to me about both Ring and our San Francisco case is that the technology is the bleeding edge of the surveillance.
Cindy Cohn:
In our Williams case, the guy just handed out the cameras. That wasn't actually a business proposition. And with Ring, again, the cameras are not that expensive, and they're making it really easy for people to get them. I'm a little worried that cool technology is the foot in the door to a really awful future. And that's not the first time we've seen it, but I really feel that right now.
Danny O'Brien:
Is this primarily a theoretical threat at the moment, that people are worried about facial recognition being misused?
Cindy Cohn:
We’ve already seen at least one example of face recognition technology being used to arrest the wrong person. Abi, you want to talk a little bit about that case?
Abi Hassen:
Yeah, you're referring to Robert Williams’ case, out of Michigan. It's one of those things that I think paints a potential future that is quite bleak. It's a case where a Black man in Detroit was arrested because the computer got it wrong. It was a facial recognition algorithm, and the quote from the cops the next day -- the ACLU is suing the police there on behalf of Robert -- was, “the computer must have got it wrong.”
Abi Hassen:
And I think, one, it's telling that he's from Detroit and is Black. And the story basically is: the computer got it wrong, the police did absolutely no police work to verify anything, they just went up and arrested him because a computer spit out his name. In some sense, if we don't fight these things, if we don't change these things, this is the kind of future we're living in. Yes, we already have cases where people are put in prison or falsely accused for all kinds of reasons: eyewitness testimony, or DNA, whatever. What's truly bleak is the concept of the police now just getting a name from a computer and arresting someone. And that's the beginning and end of it.
Cindy Cohn:
The fact that we do this with other things really shouldn't be an excuse for just doing more of it. Some of the argument I hear so far is: well, humans misidentify people too. And yeah, but the answer to that is to make the police do more work than that, not to give them yet another way to avoid doing the full work.
Cindy Cohn:
And this story is especially scary because they showed up and handcuffed him in front of his two little daughters, who were two and five years old. And his wife had to go to his work and say, "Look, please don't fire him. He's been wrongly arrested."
Cindy Cohn:
And now his DNA and all the other stuff are going to go into the databases. Some of this is also that the machinery, once you get arrested, is really, really damaging to people in the long run. And so we need to be a lot more careful on the front end, not less.
Danny O'Brien:
And I think it also points to people's treatment of new technology, too: the idea that a computer can be faultless, and that therefore you can just obey what it suggests, is something that I don't think technologists -- and I include myself, all of us -- do enough to disabuse people of. These systems aren't perfect in the way that you would want them to be perfect. They're very good in certain directions, but those directions don't necessarily point in the way of justice.
Abi Hassen:
Someone like Trump can come along and say Antifa, which is just a loose concept. But its looseness is exactly its purpose, because it allows law enforcement, or it allows a section within law enforcement, to enact a political agenda.
Abi Hassen:
And it's a moment. Our law has built up that capacity through terrorism and gang laws, primarily aimed at minority communities, to the point that now, with Antifa, it can be done to a political community that is no longer a racial minority. So, that's a capacity that has been built, and it's only augmented by the technical capacity of creating those networks and spitting out a list.
Cindy Cohn:
Yeah, and so much surveillance works like this. You create an other, a bad guy. After 9/11, it was Muslims. It's no surprise to me that Black Lives Matter activists have been suggested as being on the terrorism watch list for a long time. And now of course, you're right, it's Antifa, which is even less of a thing in terms of a cohesive movement.
Cindy Cohn:
But you create this category of people who can be subject to intense surveillance, intense tracking, intense watching. And then of course, the political pressures, the political influences, are going to start using that category for whoever they don't like, or to shore up their base by creating a hated other.
Cindy Cohn:
So, surveillance is just one of the tools that gets used in one of these systems where we stop thinking of everyone as having equal rights, but we start creating classes of people who, by virtue of being a member of something, or alleged member of something, just don't have rights. And I think you're totally right that this grows out of the way that law enforcement has achieved the ability to treat gangs as if they're not citizens.
Danny O'Brien:
And what, I think, ties this together is that what we're seeing is a process of law that isn't about what you've done, but about who you are and who you associate with. And that's an error. It's an obvious error in justice and due process, but it's the sort of error that can be really exploited if you're selling facial recognition systems and algorithms. Because those algorithms are designed to say, "Here is a cluster of people, and here is the evidence that connects them together."
Danny O'Brien:
And we shouldn't be basing judicial decisions on the clustering of people, because there's a right of association, and the clustering algorithms of these systems aren't designed to deal with the subtleties of mapping out those associations.
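Danny's point about clustering is easy to make concrete. Given nothing but "seen together" records, off-the-shelf graph code will happily assemble people into "networks," with no notion of whether the connections mean anything. A toy sketch with made-up data, using the networkx Python library:

```python
# A toy illustration of guilt-by-association clustering. The "sightings"
# data is entirely made up: pairs of people co-identified (say, by face
# recognition) at the same place and time.
import networkx as nx

sightings = [
    ("alice", "bob"), ("bob", "carol"),
    ("carol", "dave"), ("dave", "erin"),
]

G = nx.Graph()
G.add_edges_from(sightings)

# One pass of connected components turns incidental contact into a "cluster":
# alice and erin land in the same "network" despite never having met.
for cluster in nx.connected_components(G):
    print("Cluster:", sorted(cluster))
```

The algorithm is doing exactly what it was designed to do; the error is treating its output as evidence about people rather than about the data.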
Cindy Cohn:
This really reminds me of the work that we've done around Cisco and the Great Firewall in China. Cisco sold a specific module to... at least the allegations are... Well, it's not even allegations, we have evidence. We have PowerPoint slides showing that Cisco sold a module to the Chinese government that it touted as being really great at identifying who's a member of the Falun Gong, based on who they talk to, where they go, and what they're looking at online. Because of course, the Great Firewall watches all of these things.
Cindy Cohn:
And essentially, what adding face recognition into the government's arsenal here does is, it helps make that kind of a system much more powerful and much more ready to be deployed against people who are engaging in political protests. Just like it could be deployed against identifying, in the context of China, a religious group. So it's very dangerous.
Abi Hassen:
Conversely, right, if it's working, it identifies them. And even if it's not working, it chills them.
Cindy Cohn:
Yeah. And that gets to the whole other side of things that we don't have time to talk about: What does transparency look like? What does accountability look like in these systems? We'd like to ban face recognition, but as you mentioned before we even got on this call, face recognition is just one of a suite of pretty dangerous tools that law enforcement now has at its disposal.
Cindy Cohn:
Making sure that we can have access to those tools, that we can unleash our techs to figure out how it actually works -- what is it looking at, and what was it trained on? All of those kinds of things are an essential part of how we need to think about law enforcement technologies before they get adopted, not trying to paste it on afterwards.
Danny O'Brien:
I mean, how do we fix this? It's the easiest question to get, but what is the way forward here? Because I think what you've described is a long-term problem that's embedded in the very direction that law and law enforcement has been, unfortunately, stumbling towards for decades, augmented by a technology that is rapidly improving. How do we navigate this? What would be on your list of things that you would want to achieve to stop such bad consequences from expanding?
Abi Hassen:
We have to keep doing what we're doing at the local level, at the state level, at the federal level, at the international level. We have to keep fighting things as they come up. We have to hold things at bay while we can. But in some sense, I worry that that's not enough, right? Because it's not just facial recognition. It's institutions. It's the structure of society in a lot of ways. And facial recognition is just another tool in that existing structure. And so, I think that we need to fight the fights that we have now so as to be able to fight the bigger fights later.
Danny O'Brien:
So this is a question of sort of holding things back. And this is the benefit of putting in transparency, putting in actual bans of facial recognition. This is to sort of hold this anti-democratic technology at bay so we can fix the bigger problems that allow it to be used.
Abi Hassen:
We need to be able to build ourselves space to build, right? I think that so much of the dynamic is just fighting the repression that keeps us from being able to figure out how to get out of it. So, I do think a lot of the law enforcement tactics and tools and ideologies need to be... I mean, we can't just ban them, because they're institutions. But if they're asking for a new toy, we can try to ban that before it becomes completely integrated into the institution.
Danny O'Brien:
Thinking about what can be done, not just on a personal level but as part of a community of technologists, I think so much of this technological adoption is far away from any kind of evidence-based adoption, right? That it is literally sold on the glossy brochures and snake oil of surveillance manufacturers. I mean, we have someone on EFF staff who goes to these conferences where these are sold and comes back with catalogs, which is half spooky and scary for us to look through, and half us just going, "They're just lying, right? It can't do what they're saying it could do."
Danny O'Brien:
So, I think that one of the useful tasks that technologists can take on here is calling out the bullshit when they see it. And you can do that in your own community, in your own neighborhood, because often surveillance is sold as a good; it's not hidden. It's like Ring: it's sold to a community as something that will improve their safety. And when it's out in the open like that, I think there are some real benefits in challenging its capabilities and showing that its results aren't what people would ever want.
Abi Hassen:
I'm so glad you said that, because I think that that's really key at the social level. Think about the history here: you're probably familiar with the 2009 National Research Council report on forensic science, looking at things like bite mark evidence and burn pattern experts. It's full-on phrenology for law enforcement, right? And engaging... COVID canceled it, but I was going to do a panel a few months ago with a biologist, trying to build a connection, because this is someone who actually knows how DNA works and actually knows population genetics. Bringing in some of those kinds of crosscurrents -- we aren't engaging the real scientific community enough, I don't think. And maybe they're not engaging us enough, as well. That's part of what I was saying before about de-legitimizing: we have to delegitimize bad science.
Abi Hassen:
We can't give that up, but to do that, we need to engage with the real science. We need to engage with the technologists who know what they're talking about, and who have an inkling toward this kind of understanding, to show why these things are illegitimate, or why they're not doing what they say they're doing. That's something I'm very interested in: figuring out how to build some of those bridges. Because these investigatory projects of law enforcement are not actually investigations in the scientific sense. They are largely just a way of justifying something that you've already decided; they're quite the opposite. As a defense lawyer, I can say that, but it doesn't have the weight that a scientist might have, for example.
Cindy Cohn:
Let's also talk a little more about what it would look like if we got it right. What I'm hearing is: you're unhappy with a decision that the government makes, you can go out and protest, and that doesn't go into your permanent record. What other things do you think about? Let's assume a world in which we get all of this right. What does it look like for somebody like you, or somebody who you're advising, Abi?
Abi Hassen:
Well, that's a very hard question. But in the realm of protest, I would want to see a world where people are much freer to organize together, to build new institutions without fear, to create new ways of working together. As we talked about before, this kind of freedom of association -- I would like to see that be real. I'd like to see that be real beyond just marching in the street; real in a way where we are organizing in the world to make our lives actually materially better. We're using political mechanisms, we're using legal mechanisms, we're using protest mechanisms, we're doing all kinds of things -- because I don't want to say, "Oh, we're just going to solve all the problems." There's no end state to history. But I don't think that the police should have a place in that. I don't think that policing as we know it is a thing that we should have, frankly. But it especially shouldn't be the institution that stops people from making their lives better.
Cindy Cohn:
I wonder what it would be like if, when we went out to protest, we had a social service agency, the protest protection society, that came out and handled things like putting up the barriers, making sure that traffic gets redirected, and making sure that if somebody is misbehaving, they're ejected. If we had something more like bouncers and social service people who attend the protests, rather than people who are engaged in trying to stop, and create accountability for, crime -- because protesting isn't criminal.
Abi Hassen:
Yeah. I guess what I'm saying is, we can get to the criminal part later, because that's a whole other thing -- maybe we don't have time, frankly. But in some instances, I don't want to be protesting; I want to be organizing, I want to be building something.
Cindy Cohn:
I love it. Your future is even better than mine. Go on, go on.
Abi Hassen:
Well, what I'm saying is, a hundred years ago, the so-called protests were sit-down strikes: shutting down the economy to stop the robber barons from paying you a penny while you lived in a company town or whatever. Our country has an often forgotten, or not even taught, pretty brutal labor history. And that wasn't protesting, that was organizing. That was saying, "It's not right, and we're going to take what's ours. We're going to build an institution to have power, to make our lives better." And the police and the Pinkertons and what have you were the pioneers of today's surveillance technology.
Abi Hassen:
The Pinkertons literally created the first database, the rogues' gallery, the first facial recognition system. And it was used primarily, or at least largely, to fight union organizing. What I want to see is us, one, fighting these things so that we have even the groundwork necessary to build something better, because right now, they're nipping it in the bud. I don't want to paint a utopia, because I don't know if that exists, but I do want to say that we need to be able to change things and build new things, and if we keep going on this path, they're not going to let us build anything.
Cindy Cohn:
I really love this, because I think it puts the street protests we're seeing in their place: they're one of the few remaining tools in a society that is desperately headed off the rails, but that is also shutting down all the ways in which people can make it better short of that. By the time you're protesting in the street, things have gone terribly, terribly wrong. And what I hear you saying is: let's get to a future where we don't even have to get to that place, because we've actually set the balance right at a much higher level, with a real ability to organize and make change that doesn't require us to take to the streets.
Danny O'Brien:
I often find bans to be a clumsy way of dealing with technological development, partly because it's not always clear what you should be banning ahead of time, and there are ways of implementing the same thing that evade the ban, but also because that technology is always going to be around. In these futures, what do you see the role of technologies like facial recognition being? And how do you think they should be controlled in a democratic society?
Abi Hassen:
Honestly, I feel like you've answered your question, just with the question, because you said what we need is a democratic society... I don't have a better answer than democracy, having an actual say in how our society is constructed. Part of the reason we're protesting on the street is because we don't have a say in actually changing things. We're told, "Oh, go and vote every four years." Sure, we should do that, but that's not enough. And so I think that there's a lot of frustration and people are saying, "Hey, things are getting worse, what can I even do but just yell in the street?"
Abi Hassen:
And so what we need is democracy. What we need is strong institutions that have power and are democratically controlled by the people who are part of them. And right now, when we're talking about the use of these technologies, when we're talking about anything... I guess the simplest way to say it is: we need to reestablish, or establish, a commons -- the space of the people online, or the space of the people in the world as it relates to digital technology. That needs to be ours, and we need to have control of it. It shouldn't be that Amazon and the cops control how we live in public space. We should have control of that. So yeah, the answer I would say is yes, democracy.
Cindy Cohn:
I think of transparency: making sure that law enforcement has to tell us when they're looking at this, that we have a say in whether they get to use it, and that we get to see what's really going on. Transparency reports after the fact are not really what we're talking about here; we're talking about pre-purchase transparency. And that has to involve both law enforcement and the companies who are providing the information, and then community control and input at every level, and then accountability in the courts.
Cindy Cohn:
So when things go wrong, and when the upfront transparency and control aren't working, you have the after-the-fact ability to create accountability and set things right. I think of things like private rights of action, and strengthening the Fourth Amendment and First Amendment rights to be able to declare something improper or throw out evidence. I'm all about strategies, so I think of those as the things we're doing to try to set the table so that we can have democratic control.
Danny O'Brien:
Abi, thank you very much. It's great talking to you.
Cindy Cohn:
Thanks, this has been great fun.
Abi Hassen:
Yeah, it's been really fun. Thank you so much.
Danny O'Brien:
Wow, that was one of those conversations that I didn't want to end, because there was so much to unpack. One of the things that immediately stuck out for me, as a bit of an Internet utopian, an optimist, is this very dystopian idea of face recognition as almost the opposite of what we want from technology. I always saw the Internet as a way to help organize people and create new institutions and ways of cooperating together. And Abi makes this point that facial recognition is an anti-organization technology. It actually dissuades people from collectively acting.
Cindy Cohn:
Yeah, it's a really good point. I also liked how he ties the whole thing together with the broader movement for social justice and brings in his labor background, and really is talking about protests, not because we care about protests, but because we care about the bigger work of trying to make society better. So he really forced us to broaden the conversation from just the narrow topic of face surveillance at protests to the bigger “why we care about this.”
Danny O'Brien:
Yeah. And I think Abi pulled me out of the shortsighted way I view facial recognition as being particularly pertinent to protest, because he made this point that protest itself is a democratic failure mode -- that no one immediately thinks of protest as the first step they should take. It's only when other ways of speaking out, or of being able to change your environment, have failed that you go out in the streets. So if we're going to think about civil liberties more broadly, and digital civil liberties, we have to think about what these technologies are doing to everything else.
Cindy Cohn:
Yeah. The other thing I really like about Abi is that he comes at this as a criminal defense lawyer. And Danny, you made this really great point that ultimately, if you're thinking about associations and assembly, these prosecutions are about who you are and who you associate with, rather than what you did, and that's particularly dangerous. It reminds me of why we care about metadata, and some of the fights that we have, not so much against the local cops as against the National Security Agency, though against local cops as well. That's what metadata does. It may not give the content of what you're doing, but it says who you are, who you're talking to, and who you associate and assemble with.
Danny O'Brien:
It lets police or the intelligence services construct a case from that association, when association should never be a crime in itself. The other thing Abi reminded me about was when he talked about evidence-based reform, and the need for a coalition of academics and computer scientists who can speak out about when these facial recognition systems are just snake oil and what their failings are. And that really reminded me of the successful coalition we had in the war on encryption in the '90s and early 2000s, where trying to break encryption was presented in the same way, and the danger of all our communications being encrypted was played up as this huge threat. What we were able to do collectively was bring in the computer scientists and the academics to highlight where the hype was, what was actually practical, and what could actually achieve change.
Danny O'Brien:
And the more I think about it, the more I think about how powerful that is in all kinds of police reform. That ultimately, I think when people are talking about changing the police or moving away from policing, what they want is something that's more effective in achieving what people want from law and order. And I can well believe that if we really started applying that reasonably to this new digital space, the institution could become unrecognizable from what we have now.
Cindy Cohn:
Yeah. I think we have a lot of police work where we see the police doing the same things over and over and over again, without the intervention of asking: is this actually working? Is it serving the community, or is it not serving the community? And what would serve the community better? Because nobody's arguing for an unsafe community, but I think there's a lot that could be gained from applying the same kind of scientific study to police tactics that, I think you're right, we tried to bring to the encryption debate.
Cindy Cohn:
I think there are reasons to be hopeful about this. We have pretty quickly been able to convince a wide swath of Americans that face recognition in the hands of police during protests is really a problem. We're helped a lot by the fact that it's pretty creepy, so it's a pretty easy lift. But we're already seeing places across the country beginning to enact bans on it, buying us the kind of time that Abi talked about to sort out what we want to do and bring in all the other kinds of fixes that we talked about, like transparency and community control. It would have been better if we could have gotten in on the ground floor, before police departments had this technology; as we know from the horrible story out of Detroit, the police are already using some of this kind of stuff. But it's not as late as it's been with some other technologies, and so there's room for us to begin to really fix this.
Danny O'Brien:
It's always going to be difficult to stop an attractive piece of technology like this from falling into the hands of law enforcement, but we've made a good start. And with folks like Abi fighting for this, I think there's a real chance that we can fix this. Well, thanks for listening and see you next time.
Danny O’Brien:
Thanks again for joining us. If you'd like to support the Electronic Frontier Foundation, here are three things you can do today. One, you can hit subscribe in your podcast player of choice. And if you have time, please leave a review; it helps more people find us. Two, please share on social media and with your friends and family. Three, please visit eff.org/podcast, where you will find more episodes, learn about these issues, donate and become a member, and lots more.
Danny O’Brien:
Members are the only reason we can do this work, plus you can get cool stuff like an EFF hat or an EFF hoodie, or even a camera cover for your laptop. Thanks once again for joining us. If you have any feedback on this episode, please email podcast@eff.org. We do read every email. This podcast is produced by the Electronic Frontier Foundation with help from Stuga Studios. Music by Nat Keefe of BeatMower.