Interview with
Jonathan Stribling-Uss
Media Democracy Fund Technology Fellow, New York Civil Liberties Union
🅙 In particular, you know, the framers of the Constitution rejected the idea of general warrants, where you could collect data from a whole neighborhood, or raid everyone's houses in a whole neighborhood and search through everything, in order to find an individual. They wanted individualized suspicion: you should only be able to look for the person you think may have committed a crime or may have done something wrong to society. You shouldn't just be able to arrest everyone and then try to pick through them to figure out who the bad person is. That was rejected by the framers of the Constitution, and that's represented in both the First and Fourth Amendments of our Constitution. So that's something we should take very seriously. And facial recognition is a form of a general warrant, because it's looking at everyone all the time. It's basically a perpetual way of putting people into a lineup. Right? Who here would agree to be in a police lineup if the cops said, "Hey, come down, stand in a lineup"? Most people in society would not participate willingly, or would participate only very reluctantly, in a police lineup. But facial recognition creates a perpetual lineup, where your photos are being used as part of a lineup whether you agreed to it or not.
Interview with
Emmanuel Mauleon
Police and Technology Fellow, Policing Project, New York University School of Law
🅔 I think we trade a lot of privacy for convenience. You upload your picture to Facebook, and what does it do? It puts the little box around your head and asks, "Do you want to tag this person?" What you're doing is training a facial recognition algorithm. Every time it says, "I recognize you," how does it recognize you? Because it has your data points memorized, and then that data is being sold to a company that's training a facial recognition algorithm. It's not about whether you should use this thing or shouldn't use this thing, but you should know what you're using. That's the big thing. Make your own decision about what you want.
Interview with
Albert Fox Cahn
Founder and Executive Director, Surveillance Technology Oversight Project (S.T.O.P.)
CUP: Is it possible to create a balance between surveillance technology and privacy?
🅐 Yeah, it's completely possible. Right now, if you look at the surveillance technology in Beijing, China; New York City; and Stockholm, Sweden, it's like we're living in three different decades. In China you have a perpetual surveillance web that's creating a social credit score, where if you even jaywalk there will be a facial recognition system that potentially identifies you as jaywalking, tracks your identity, connects it to your WeChat account (the equivalent of WhatsApp), and automatically deducts a fine from your digital wallet. All in real time, without due process, without a chance for you to even appeal the fine. Then in New York we have thousands of people who are tracked using facial recognition and all these new forms of surveillance. We have issues with our emerging congestion pricing system, but it's not nearly as bad. Then in Stockholm, Sweden, you have incredibly limited surveillance. Let's take their automated license plate reader program. That program is used to toll someone when they go into their equivalent of midtown. The data is kept for 30 days. They take a photo just of the license plate; they won't even show the car, who's driving it, or any of the bystanders. If law enforcement comes and says, "We need to know whose license plate was tagged at what time," they'll tell them to buzz off. So you have incredibly limited surveillance. In each of those countries the same tech is available, the same tools can be purchased, but they have different norms and different laws. To me, that shows that none of this is inevitable. The technology doesn't determine where we end up; it's the legal discussion, it's the political components.
Interview with
Vanessa Gibson
New York City Council Member, District 16
STUDENT: Do you think the use of facial recognition technology by the NYPD should be banned outright? Why or why not?
🅥 No, I don't think it should be banned outright. I think that if we do approve it and allow it to happen, it should be, again, with safeguards and measures in place. We are a city that is constantly evolving. We have 8.6 million people in New York City, and there have been constant efforts to terrorize New York City. There have been a number of assaults that have been thwarted because of the great work of the NYPD, and I recognize that; I recognize that we are under constant threat. We will never forget the terrorist attack on 9/11, the thousands of lives that were lost, and the individuals living today who are still suffering. That's a real thing that happened to us. We don't want that to ever be repeated in history, and we also want to make sure that we protect people. And I believe we can do both. They're not mutually exclusive. We can protect New Yorkers, and we can respect their individual civil rights. We can do both, and we should do both. So if we allow facial recognition for the NYPD, I think it should have an impact policy that would be available on the NYPD's website. I think it should be incorporated into the NYPD's patrol guide, just like body camera footage is in their patrol guide. I think it should be a part of our procedures and everything that we do as a city. And that goes for any other department that would want to use facial recognition technology.