Kade Crockford, the director of the Technology for Liberty Program at the ACLU of Massachusetts, discusses technology, surveillance, and civil liberties with the Science, Technology, and Public Policy program (STPP). October 2022.
Transcript:
0:00:00.9 Ben Green: Okay. Good afternoon, everybody. Let's get started. I'm so excited to welcome you all here for today's session. My name is Ben Green. I'm an Assistant Professor in the Ford School, a post-doctoral scholar in the Michigan Society of Fellows, and an affiliate of the Science, Technology and Public Policy Program, or STPP. STPP is an interdisciplinary university-wide program dedicated to training students, conducting cutting-edge research, and informing the public and policy makers on issues at the intersection of technology, science, equity, society and public policy. If you'd like to learn more about STPP, you can do so at our website stpp.fordschool.umich.edu. Before I introduce today's event, I wanna make one quick announcement. For students interested in our graduate certificate program, there will be an information session on October 19th at 4:00 PM. If you're interested, please sign up on our website. The next deadline to apply for the certificate is November 1st.
0:01:06.6 BG: And now for today's event. I'm so thrilled to have Kade Crockford here as our guest speaker. Kade is the Director of the Technology for Liberty program at the ACLU of Massachusetts. This program aims to use unprecedented access to information and communications technology to protect and enrich open society and individual rights by implementing basic reforms. In this role, Kade recently led the ACLU of Massachusetts's Press Pause on Face Surveillance campaign, which has thus far won the passage of a state law regulating police use of facial recognition and eight municipal bans on government use of face surveillance technology. In other endeavors, Kade's blog, Privacy Matters, discusses the latest news regarding policing, surveillance, and privacy, and the terror and drug wars' impact on liberty both abroad and at home. Kade is also a co-founder and manager of the ACLU of Massachusetts's Data for Justice project. I've known Kade for several years and have always been inspired by their commitment to justice, tenacity in facing entrenched power structures, and vision for a better world.
0:02:17.9 BG: Kade will be in conversation with Molly Kleinman, STPP's Managing Director. Molly oversees the day-to-day operations and provides strategic direction for STPP, including our Community Partnerships Initiative, which works with community organizations to provide research and other support that helps them engage in technical and policy advocacy. In addition to her role at STPP, Molly is involved in local advocacy of her own. She serves as Chair of the Ann Arbor Transportation Commission, is an elected Trustee of the Ann Arbor District Library, and is a member of the Coalition for Re-Envisioning Our Safety, a group that is organizing for unarmed, non-police crisis response in the city of Ann Arbor.
0:03:01.3 BG: This event is hosted by the Science, Technology, and Public Policy program and the Ford School of Public Policy. It is co-sponsored by the Center for Ethics, Society, and Computing, or ESC, the Civil Rights Litigation Clearinghouse, the Arab and Muslim American Studies program, and the Science, Technology, and Society program. I also wanna thank our STPP staff who made this event possible, Kristin Burgard, Mariam Negaran, and Annabella Vidrio. Kade and Molly will talk for 40 minutes, and then we will open it up to the audience for questions. For those of you with us in person, you got a note card and pencil when you came in. You can write questions on your card, and Annabella will come around to collect them. If you need more note cards, you can just flag her down. For those of you watching online, you can send your questions to [email protected], and Mariam will pass them on to me. And without further ado, let's welcome Kade and get started.
[applause]
0:04:04.5 Molly Kleinman: Awesome. Thanks, Ben. Thanks so much for being here, Kade. I have so many questions for you, and I think a lot of them end up centering around questions of how to regulate and control the impact of surveillance technology, how to successfully advocate for the kinds of regulations that we think we need, you think we need, and then also how to use technology to advance positive agendas. But to start, I would love to hear... Just tell us a little bit about the Technology for Liberty project, and also maybe the Data for Justice project, which it sounds like is maybe the flipside of that coin.
0:04:37.3 Kade Crockford: Sure. So first of all, thank you for having me. It's really nice to be here. I really like Ann Arbor. And I just wanna thank everyone involved, especially Kristin, who's done a tremendous amount of work to make sure that I'm comfortable and happy everywhere I go, so thank you. So we founded the Technology for Liberty program at the ACLU of Massachusetts about 10 years ago. And the initial goal of the program was to ensure that the law keeps pace with technology wherever technology rubs up against civil rights and civil liberties issues. So in the beginning, that was really mostly around issues related to privacy and surveillance. We had been seeing for many years that, on both the government side and the corporate side, the decreasing cost of data storage and the increasing power of computing were combining, along with a failure of lawmakers at the state and federal level to really do anything in response to those two pretty profound and dramatic changes. All of that was leading to a situation in which power asymmetries were growing really, really fast: between ordinary people and the government, and between ordinary people like those of us in this room and powerful tech companies like Google, Facebook, Amazon, et cetera.
0:06:03.9 KC: And so we realized that somebody needed to step into this void to find out what was going on behind the scenes at police departments and other government agencies with respect to how these institutions were using new technologies to collect information about people, how they were using that information in new ways, and try to figure out what we think the response should be among lawmakers. And then, similarly, to try to understand what's going on behind the scenes at big tech companies, and how what some people have called data capitalism, some have called surveillance capitalism, and I just call the business model of the Internet is impacting our rights, our choices, maybe constraining our choices in ways that we don't totally understand, showing us a picture of the world that is not neutral at all and may not be the one that we think we're accessing when we open up Google to look for information or search for jobs or whatever.
0:07:09.2 KC: So we realized that lawmakers need to step into this area and pass some laws to ensure that basic rights keep pace with technology. And in the beginning, again, that was really simple stuff, like: we all carry these devices around in our pockets, and we were doing that in 2013 when we started the program. But at that time, police nationwide did not need a warrant to ask our cell phone companies where we were three weeks ago, six months ago, a year ago, based on information called cell site location data that our cell phones communicate to cell phone towers every second of every day, and that cell phone companies retain for long periods. There was no law on the books in Massachusetts regulating how police could use facial recognition or any of these new technologies that, for a long time, seemed more like the purview of shows like CSI, or the military, or the CIA, but increasingly are being, and were being, used by local police. So basically, we wanted to make sure that the law kept pace with technology.
0:08:18.0 KC: And then the Data for Justice project is really about using data science to advocate for some other important law-reform goals that we have at the ACLU, primarily in the area of police practices. We look at, for example, how to use government data sets like stop-and-frisk data or drug prosecution information to show people what's actually going on in our criminal legal system, so that people can see that the vast majority of people being arrested for marijuana possession are Black, right? Or that you're three times as likely to be arrested for marijuana possession if you're Black as if you're white. Using data science to paint a picture of what's going on in a really clear way to try to convince lawmakers and the general public that we ought to change the law in ways we think are appropriate.
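As a rough illustration of the arithmetic behind a claim like that, here is a minimal Python sketch, using entirely made-up counts rather than actual Massachusetts data, of how a disparity ratio is derived from arrest and population figures:

```python
# Hypothetical illustrative counts -- not real arrest or census data.
arrests = {"Black": 900, "white": 1800}
population = {"Black": 100_000, "white": 600_000}

# Arrest rate per 100,000 residents for each group.
rates = {g: arrests[g] / population[g] * 100_000 for g in arrests}

# The headline disparity ratio: the Black arrest rate relative to the white rate.
disparity = rates["Black"] / rates["white"]

print(f"Black: {rates['Black']:.0f} per 100k; white: {rates['white']:.0f} per 100k")
print(f"Disparity ratio: {disparity:.1f}x")  # 3.0x with these made-up numbers
```

The same computation scales to real stop-and-frisk or prosecution data sets; the invented numbers here are chosen only so the ratio comes out to 3.0.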
0:09:10.7 MK: Yeah. So that all sounds really exciting, I think, for a lot of the students in this room today. One thing that we talked about earlier during lunch is a story you were telling us specifically about facial recognition, which you just mentioned, and the opportunity that you saw to jump on this technology before it had become so entrenched. And I'm wondering if you could just tell us a little bit more about that story, that experience.
0:09:35.0 KC: Yeah, sure. So for a long time, those of us who have worked on issues at the intersection of technology and the law talked about facial recognition as a future problem, one we always kind of thought someone else would solve. It's like, "Yes, in a few years, facial recognition's gonna be very powerful, and it's gonna be very dangerous." And at one point, I was like, "I don't think we can just say that anymore. We're the ones who have to do something about it. We should figure out what needs to be done, right?" And so I got together with some colleagues from Northern California and Seattle who also have programs similar to this at the ACLU affiliates there, and we spent the day thinking through, "Well, what do we think this solution ought to be? If we could snap our fingers and pass a law right now, what would it be?" And we came up with the idea, this was in 2017, that we should ask governments to ban government use of facial recognition technology.
0:10:33.5 KC: And I remember when we started saying that publicly, people laughing at us. People would make jokes: "Oh, you're so ridiculous. You people are so stupid. This technology is here. It's going to be used, no matter what you do, in every conceivable context by every conceivable actor." This sort of technological determinism that I think infects a lot of people in our society. And we proved them wrong. Obviously, we haven't solved the problem for the whole country everywhere, but we have made some important progress. And what was so key, I think, about that moment is that there were a number of things happening. One is that governments were starting to use facial recognition technology. One of the problems in this work is that sometimes we'll go to lawmakers and say, "Here's an ascendant technology. We really think you need to regulate it." And they'll say, "Well, it doesn't seem like it's really ripe. Not a lot of people are using it. It's not really a problem. Come back when it's a problem." And then, if we come back and we say, "Okay, this is it. A lot of police departments are using this technology. We really need to regulate," they're like, "It's too late. The horse is out of the barn." I'm like, "Are you kidding me?"
0:11:47.8 KC: So anyway, we found it was just kind of the right time, where some police departments were using it but not really advertising that they were doing that. It was not in widespread use in government, really, but it was starting to creep in. And so there were some examples we could point to as warnings, basically. The other thing that was really important is that Joy Buolamwini and Timnit Gebru at the MIT Media Lab had just published their groundbreaking research showing that a bunch of commercially available facial recognition algorithms were exhibiting pretty serious race and gender bias, misclassifying Black women's faces specifically at alarmingly high rates, like one in three, whereas they were classifying white male faces correctly almost 100% of the time. And so we knew. We're not stupid. We knew the technology was gonna get better, definitely gonna get better in terms of addressing some of those race and gender disparities, but it wasn't better at the time. And it was creeping into government use. And there was this nationwide, ascendant Black Lives Matter movement that was successful, I think, particularly at the time, at getting people to think a little differently and more critically about police, police power, maybe police technology. And so we thought, "Now is a really good time. We should do this now." So we did.
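The key analytical move in the research Kade describes was disaggregating an error metric by demographic subgroup rather than reporting a single overall accuracy number. Here is a minimal Python sketch of that idea, with invented classification results, not the actual Gender Shades data:

```python
from collections import defaultdict

# Invented (subgroup, was_classified_correctly) pairs -- illustrative only,
# not the Gender Shades dataset.
results = [
    ("darker-skinned women", False), ("darker-skinned women", True),
    ("darker-skinned women", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True), ("lighter-skinned men", True),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, correct in results:
    totals[group] += 1
    if not correct:
        errors[group] += 1

# A single aggregate accuracy number would hide what this breakdown reveals.
for group, n in totals.items():
    print(f"{group}: error rate {errors[group] / n:.0%} ({errors[group]}/{n})")
```

With these invented records, the breakdown reports a one-in-three error rate for one subgroup and zero for the other, exactly the kind of gap an aggregate accuracy figure would conceal.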
0:13:17.6 MK: Awesome. So are there things that you see as ripe now? You mentioned that there was a moment: facial recognition was at this perfect moment to really go after it. Are there technologies that you are seeing in that space right now, or was that just sort of a one-time deal where the stars aligned?
0:13:36.5 KC: I think we're still in that moment with facial recognition technology. And yeah, I guess I would just say that part of my job is thinking about new technologies, how they're interacting with civil society, how companies and governments are using them. Then the next part is thinking through what the law ought to say, what kinds of law reforms we ought to be fighting for. Part of it, though, is also trying to figure out what might succeed based on what's going on outside of the realm of tech policy and technology generally. And so I think right now, what's going on is a nationwide attack on fundamental rights like reproductive rights. And people are starting to understand, in a way that they have not previously, how absolutely fundamental privacy is to basic human dignity, to the ability to control your own body. And so that, I think, provides us with a real opportunity to make headway in the law in states where people and lawmakers care about reproductive autonomy and reproductive justice.
0:14:51.1 KC: We can now say to lawmakers in Massachusetts, for example, "You might hear from the location data industry that there are all these good things that can be done with cellphone and mobility data, and that you shouldn't regulate that market or cut it off in any way. But don't you think it's a little concerning that the police in Texas, or even some person who wants to sue someone in a civil case in Texas, could just, with their credit card, buy data on every single person who travels to a Planned Parenthood in Massachusetts? Don't you think that's really troubling? Don't you think we ought to stop that from happening?" So there are political opportunities everywhere. And sometimes, frankly, they're in really awful situations like the one that we're in now.
0:15:37.8 MK: Yeah, right. And I think that there's a lot that I'm seeing too, about the way people's eyes are opening, both to the idea of privacy in general and to specific technologies where they might not have seen the concerns before. But now it's like, "Oh, so you know everywhere a car travels at all times, and when it leaves the state." And they're starting to understand in a way they maybe didn't a little earlier.
0:16:02.9 KC: Yeah, exactly. Like people, for a long time, have said, "Well, I don't really care about my privacy. What do I have to hide?"
0:16:08.6 MK: Right.
0:16:08.8 KC: Or, "I'm not doing anything wrong. Why should I care if the government is doing X, Y, Z thing?" And I've got a lot of arguments to deal with those claims. But people aren't really saying that anymore. I didn't hear that after the Dobbs decision. At least half of the country was like, "Oh, yeah, privacy really matters." It's fundamental.
0:16:32.4 MK: Yeah. Yeah, so let's talk about regulation and oversight. On the local level, we're seeing this approach of community control, especially over police surveillance technology, right? So they're creating oversight boards or requiring transparency in purchasing and use. And I would love to hear your thoughts on what's working in that local control space and what is not. So here, recently, Detroit passed a policy that a lot of advocates are actually pretty unhappy with. Advocates were fighting pretty hard for some oversight, and what ended up passing felt so toothless that advocates ended up saying, "No, we don't want it." Whereas I know you were involved in passing a policy in Boston that I think has some teeth, and I'm curious about both what goes into that policy and also what goes into the advocacy to get a policy like that. You know, the small questions.
0:17:24.0 KC: Okay, so I forget who it was who said this, but I just read this great book by Astra Taylor about democracy called Democracy May Not Exist, but We'll Miss It When It's Gone. And there's a great line in that book that's something like, "Life is a series of meetings." And this is the most real comment I've ever heard about what it actually means to be an active, engaged citizen in the society that you live in. You have to go to a million meetings. That's your life, but that's it.
0:18:00.0 MK: It's really true.
0:18:00.5 KC: And it's super annoying, and it's super boring sometimes, and people are annoying, and they drive you crazy, but that's what it is. And so the answer to the question is like, we've passed these laws. We passed three of them in Massachusetts, one of them most recently in Boston, that say it's not up to the police department anymore to decide unilaterally what kinds of surveillance technologies they're going to buy or how they're going to be used, what policies will dictate their use, et cetera.
0:18:27.0 KC: It is now the city council's decision, and the city council will make those decisions informed by the public. Well, that sounds good. The trick is that people have to be involved or it doesn't work, and so it's a lot of work. We're in the process right now of dealing with the first round of implementation in Boston. And there was a point, when I first got the 1,200 pages of documentation that the administration sent over to the city council documenting all the surveillance technologies and all the policies and stuff, when I was like, "What have I done? Why did I do this?" But that's democracy. It's really hard work. So the answer is, it's better than the previous system, where the police made all the decisions by themselves, with no public input, in secrecy. But it really only works if people are engaged, and it's hard work. So I think the jury's out.
0:19:22.4 MK: Yeah.
0:19:23.5 KC: We'll see.
0:19:23.7 MK: Yeah, that seems like kind of a big weakness when you're demanding so much time and energy and engagement of the people who just wanna move safely through their communities or... You know.
0:19:34.5 KC: Yeah, I mean it's true that everybody should, right? You said you work on transportation. It's true on transportation too. The people who are in charge aren't necessarily gonna make the best decisions on behalf of all of us, and so we need weirdos like you and me to be obsessed with something so that we can kind of like fret over it maniacally, and get other people involved if we can, but we do what we can.
0:19:55.2 MK: Right, right. Yes. Yeah, and it is. It's so much time. It's so much time, and...
0:20:00.8 KC: Yes. It's a lot of meetings.
0:20:01.6 MK: And meetings, so many meetings. Alright, so then expanding out a little bit. We've got the local control, but then there are also attempts to do some national-level governance, or, like in Europe, governance at the whole-continent level with the General Data Protection Regulation, the GDPR. And we don't have something like that in the US right now. We don't have protection for everyone so that they can ignore all of this and just move through their lives and still have privacy. And I'm curious about your thoughts on the prospects. We've got this thing in Congress right now, moving through, the American... I wrote it down. The American Data Privacy and Protection Act. Are you hopeful about that? Are you skeptical? What are you thinking?
0:20:50.0 KC: A lot of mixed feelings about that bill. I'll say that the national ACLU's official position on that bill is about 30 pages long. It's neither in favor nor against. There are a lot of things that we like about it. There are a lot of things that we think ought to be strengthened and improved. I'll just say two things. One is that there is preemption language in that legislation that would essentially block all states from passing laws to protect people's privacy in a way that goes above and beyond what Congress does. That's called ceiling preemption. In other words, the law that Congress passes would be the ceiling for all consumer privacy law in the country. And we have very serious problems with that at the ACLU of Massachusetts, for a number of reasons, one of which is that technology changes. As we know, Congress is not very responsive or agile in terms of responding to developments in the tech space, or really any human developments. And so I don't think it's a great idea to say that there's no room here for states to play in terms of developing and advancing consumer privacy law, and that it's strictly gonna be the purview of Congress and the federal government.
0:22:14.9 KC: The second reason is that there's some stuff in the bill that's really not that great. It would be one thing if this were the best possible consumer privacy law, with a very, very strong private right of action allowing individuals to sue to enforce the law and protect their own rights. There is a private right of action, but it's kind of weak. We think it ought to be strengthened a lot. So the combination of those two things makes it very difficult. That said, there are a lot of people in the advocacy community who support the bill and think that it's important for Congress to do something, because there are a lot of states that are never gonna pass any kind of consumer privacy law, or if they do, it'll be a law that's even more pro-industry than the one that Congress has worked out. I don't think it's gonna pass this session. It's a pretty safe heuristic, generally, to just assume that nothing will come out of Congress. But on this issue in particular, Nancy Pelosi, obviously, is a very powerful person in the House, the most powerful person, and she has heard from state lawmakers in California, where she comes from, that they would be very upset if this bill passed with its strong preemption language. So I think it's probably dead for this session.
0:23:44.0 KC: The real problem, as we know from so many other issues, is that big tech companies have way too much power in DC, and lawmakers are making decisions that do not reflect the public interest and instead reflect the needs and demands of a tiny, tiny fraction of people in Silicon Valley. So that's a problem.
0:24:08.2 MK: Yeah. So then Europe is a little bit farther from Silicon Valley. They've had the GDPR in place for a few years now. How is that working out? Does that feel like a model that we would want to emulate? Are there lessons that we can learn from it to do better, maybe?
0:24:24.4 KC: Yeah. So I have a lot of complicated feelings about the GDPR. The GDPR basically uses a consent model. You've probably seen this when you log on to websites: there's a little thing at the bottom that says "accept," or maybe there's another button that says "deny all cookie tracking" or "only accept required cookies" or whatever. So the idea is that the GDPR puts the onus on each internet user. They would describe it as giving people control over their own information; I kind of describe it as putting the responsibility for dealing with all that stuff on individual people. And I think at the beginning, when the GDPR was first being worked out, a lot of us felt like, "Yeah, this is right. This is great. Privacy is about control. This is really effective." The GDPR also gives people the ability to request their data from big tech companies, and to ask that it be deleted.
0:25:23.4 KC: The problem there, obviously, is that the vast majority of people are never gonna do that. And the vast majority of people just click whatever to get to the web page they're trying to access, because the vast majority of people don't really know anything about this stuff and don't care, not because it doesn't affect them but just because they've got other pressing stuff, or because they're, like, 17. You know what I mean? And most 17-year-olds are not super concerned about data privacy. I certainly wasn't when I was 17. So I think that's a real shortcoming of the GDPR and consent-model legislation. I think an area that is ripe for exploration in the United States is following Illinois' model, which is to put in place really rigid and strongly enforceable privacy protections around particularly sensitive types of data.
0:26:21.6 KC: So Illinois has one of the best privacy laws in the world. It's called the Biometric Information Privacy Act, and it requires that private companies, Facebook, Walmart, whatever, get your affirmative opt-in consent. So that's not opt-out. That's not clicking a button to use the service. It's actual, meaningful consent before collecting your biometric data: your voice print, your face print, whatever. And these models, I think, are gonna be more effective, because it's not so much click-to-use as it is meaningful consent. The other model I like is just banning certain practices. Instead of saying that a company has to get your permission, by clicking something, to collect and sell your cell phone location data, we should just ban the selling of cell phone location data, because I don't think that's acceptable, really, under any circumstances. So yeah, thinking about ways of treating particularly sensitive types of data differently, that, to me, is low-hanging fruit. There are a lot of other really complicated problems that we could spend a lot more time talking through, but that is one that I think we should all be taking a lot more seriously and working on. I'm going to do that.
0:27:40.9 MK: Awesome. To get to bans, it seems like we really... This is where we start to be thinking about advocacy and the role that regular people have in pushing for changes in the way we govern these things. So I know we've got some folks listening today. I don't know if anyone's in the room today, but I know we have people listening who are in the midst of organizing resistance against automated license plate readers, specifically in Ypsilanti Township, which is our neighboring township. Ypsilanti Township fully surrounds Ypsilanti City. Ypsi Township is generally wealthier and whiter. Ypsi City is generally poorer and Blacker. And the idea would be that there would be cameras at every entrance and exit to the township, which would mean fully surrounding Ypsi City with cameras managed by the sheriff's department.
0:28:30.0 KC: Whoa.
0:28:30.8 MK: This is a thing that's happening right now, that they're talking about doing. And so license plate readers have been around for a long time, actually. There was an ACLU report from like 10 years ago. But they're becoming much more popular. I think it's partly that there's a new way of monetizing them that's out there. And with what you were talking about earlier, data storage getting bigger and everything else getting smaller, it's much easier to do. So I thought that these might be a good example for us to start talking about how local advocacy around some of these issues can work. We've been talking about privacy and data, but when we get into specific technologies, it can sort of feel like Whac-A-Mole. But I know you've been successfully involved in some fights against stuff like this, and so I'd love to hear your thoughts about what works.
0:29:21.6 KC: Well, I think it won't always work to just say we wanna work with the elected officials we have right now and do something that is disfavored by the police or another influential, powerful actor in local government, right? It's a prerequisite to getting good things done in local government to have good people elected in local government. So I would say that if that condition hasn't been met, then there needs to be some homework done, or prerequisite courses taken, in fixing that problem and getting some good people elected to represent you in local government. And once you have that, it's a lot easier. It is very difficult, very similar to banging your head against a wall, to try to organize to convince people who do not represent you, who represent a different constituency, right?
0:30:27.5 KC: Boston politics is a good example of how things have changed quite a bit in this way over the years. 15, 20 years ago, the Boston City Council was almost entirely white. Well, it was entirely white, I think. It was mostly somewhat conservative people who were very pro-police, who would never do anything the police didn't want done. And Ayanna Pressley ran maybe 15 years ago; she's now a congresswoman. She became the first woman of color ever elected to the Boston City Council. And now the Boston City Council, in a very short time period, is like half women of color, and it's a lot of people who are very willing to challenge the power of the police. So maybe that's not the answer that you're looking for, but I really would not waste my time trying to organize in a community where it's a bunch of elected officials who do not care what you think. You can organize as many people as you want to show up to community meetings and council meetings, but unless you get someone to run against them, or you can mobilize an effective coalition that actually scares them, you're probably not gonna convince them to vote your way. So yeah, people need to get involved.
0:31:40.1 MK: Yeah, at every level. And this is where things like serving on commissions and volunteering for boards can make a difference in the long run because you start to learn about the inner workings of the city, which seem very arcane and boring and involve lots of meetings. But that's where these decisions get made, right?
0:31:56.2 KC: That's right, yeah.
0:31:57.5 MK: Yeah.
0:31:58.4 KC: Yeah. But then once you have that, there are a lot of things, yeah.
0:32:00.1 MK: Yeah, yeah. Okay, so let's imagine. Pretend that we have those people.
0:32:03.0 KC: So if you have a decent decision-making body, you find out what's going on to the degree possible using public records requests, partnering with journalists. There are a lot of great investigative journalists who care about these issues and will do their own investigation with you. Try to find allies in strange places. Sometimes there are. Surveillance is an interesting issue because it's one where... Typically, people on the left and the right are suspicious of police powers, of government powers, and are more likely to want to impose some rules or take certain tools away from the police. It's really people in the middle who tend to be the ones who are the most friendly to state power or police power generally. So strange bedfellows are always good. It's always good, in a coalition, to be able to say we have people from across the political spectrum as a part of this effort.
0:33:05.4 KC: It's, I think, important to have a multi-generational coalition. That's another thing I would say: working with young people is really important, and it's also really important to work with older people. It's not a great idea if your meetings are full of people who all look the same, and that's not just race; it's also age, and different parts of the city or the community or whatever. Not only is that helpful because people bring different ideas to spaces, but it's helpful because you are all representing different political constituencies that politicians care about. And then figure out your talking points. There's some basic campaign 101 stuff that needs to be done. Figuring out the most effective message is, I think, a really, really important piece of organizing, and so is finding some people who can convey that message, finding people who wanna be a part of the movement who also have a compelling story to tell about why this issue impacts them.
0:34:09.4 KC: People are typically not really that moved by super-academic arguments about why something is problematic, but they are much more likely to care about something like facial recognition, for example, if the story is being told by Robert Williams, a man from Detroit who was wrongfully arrested because the police used facial recognition in a very sloppy way. It's extremely helpful to have people like Robert involved in movements, and at the table determining the strategy and the messaging and things like that. So yeah, I mean, there's a lot more to say, but those are some thoughts.
0:34:42.9 MK: Those are all great. That's fantastic. Alright, so we've been talking a lot about fighting things. But a lot of the students in the STPP program are interested in going to work in tech, are engineers of various flavors, and are thinking about how to build things, things that are going to move us towards justice. And so I'm curious to hear a little bit about the role you see for technology in advancing positive agendas, like the Data for Justice project or other similar projects.
0:35:15.4 KC: Yeah. I think information, when it's publicly available and not controlled by profit-seeking entities that wanna hoard it and use it in ways that could hurt people, can be really powerful in positive ways, and there's a growing movement of organizations and institutions like the Ford Foundation that want to fund public interest technology. I didn't know this until I got involved with the Ford program, but in the 1960s, the Ford Foundation funded similar fellowship programs for law students and early-career lawyers, and it essentially established the public interest legal profession. I wasn't aware of this. I always thought that it was just a naturally occurring thing, but it wasn't. It was purposefully created by Ford. The idea was that people coming out of law school at that time thought, "There are really only two career paths for me. I can go into big law, or I can work for the government." And the Ford Foundation thought, "You know who could really benefit from lawyers? Poverty organizations, housing groups, education rights groups, groups that are working in the public interest, non-profit organizations."
0:36:35.3 KC: But it was hard for them to convince these really cash-strapped organizations that they should hire lawyers. "Well, what the hell do we need a lawyer for? They don't know anything about poor people or whatever." And so the Ford Foundation funded this fellowship program placing law students and early-career lawyers with non-profit organizations. And now we live in a world where it is a very well-established career path for lawyers to go into public interest legal work. And so they're essentially trying to do the same thing with technologists. And that's actually part of the reason why we started the Data for Justice project. We had a technologist who was a fellow paid for by the Ford Foundation, and it was really great. She worked on litigation with our legal department and was instrumental in helping us develop a very strong data-based case for legalizing marijuana recreationally in Massachusetts in 2016.
0:37:30.1 KC: And so, yeah, we were convinced. And so we've had a data scientist on staff at the ACLU ever since. The national ACLU also has a whole analytics team at this point, like, maybe a dozen different people working on data science across the organization, and I think increasingly other groups like ours are realizing how central it is to have some nerds on our side, you know? Not just on the other side. So, I would look into that. If you're interested in the kind of stuff that I've been talking about and in other civil rights issues, there are organizations that are looking for people who have really technical skills to join the fight.
0:38:10.8 MK: Awesome. I think this is a great moment to move it to audience questions. So as a reminder, for people who are watching at home, you can send your questions to [email protected], and for folks in the room, you can write them down on the little note cards and Annabella is gonna come around and start bringing them up to Ben. And I think you probably have some questions already from the registration form that can get us going. Yeah?
[background conversation]
0:38:44.3 BG: Cool. Well, this has been a really fascinating discussion. A lot of questions already from some of the registrants, and of course from myself, coming in. So, one question that comes up, and you've talked about this legislation that's been passed, is to really think about what happens after the legislation gets passed. And two directions come up there. One, how do you know that government agencies are actually following through on those laws and not carrying out operations in secret to avoid being caught by them? How do you ensure that the laws are being followed? And then, in a similar sense, one of the recent sets of stories, especially from the past six months, has been certain cities like San Francisco passing retrenchments of their facial recognition bans. So how do you think about... There's this moment of getting a policy passed, but there's also the continued advocacy to ensure that it's not chipped away at as that political moment or political incentives change?
0:39:49.1 KC: Well, like I said, life is a series of meetings. [laughter] So, to the first question... Wait, what was the first question again?
0:40:00.3 BG: First question was about, how do you know that the government is actually following the laws?
0:40:01.6 KC: Right. You don't. Yeah, you don't. The police in Boston could very well have a whole wing of secret surveillance technology that they're not reporting to the City Council. The reality is, in this country, we don't have any effective oversight of the police, period. There's none. The courts don't really do it; they do it very sporadically. I feel like the defense bar is the closest thing we have to actual police oversight in the United States, because the courtroom seems to be the one place where police feel some obligation to provide information. They still try to resist it a lot, but they are sometimes required by courts to provide information that they really don't wanna provide. And through criminal cases, stuff often leaks out that they wish hadn't leaked out.
0:40:57.7 KC: So a great example of that is the use of stingrays. These are devices that trick cell phones into communicating directly with the police spy technology, called a Stingray or IMSI catcher, instead of with your cell phone company's cell towers. For a long time, the FBI was helping local police departments acquire and use Stingray devices, and the FBI even had a confidentiality agreement that it would require local police and prosecutors to sign, which said, believe it or not, "If a defense attorney asks about whether a Stingray was used in this case, drop the prosecution," because they would rather the existence of this surveillance technology not be mentioned in court than actually convict this murderer or accused bank robber or whatever. So I think that speaks exactly to the question that you're asking, which is that we don't have any way of knowing, and it's likely that they are lying about some things, and we'll find out at another time, and then people will be mad about it. I don't know, [laughter] we do the best we can. [laughter] Yeah. At least then we'll be able to say, "You violated the law," and sue them, so... Yeah. And then the second question... Sorry, remind me again.
0:42:23.3 BG: Retrenchment of facial recognition bans.
0:42:25.5 KC: Right, retrenchment. I saw what happened in New Orleans, which was a bit of retrenchment. And in San Francisco, there's been a very, I might even say hysterical, campaign to convince voters that all of their problems have to do with the former DA, Chesa Boudin, who was a lefty prosecutor, and that giving the police access to people's private surveillance cameras in their homes is going to magically make homeless people disappear. Those are political problems that don't really have anything to do with surveillance or technology, and are much more about, I think, how the failure of our society to adequately provide for people's basic needs is leading to a real crisis where wealthy people in San Francisco don't wanna look at poor people. And that's kinda what all this is about. It's not about technology. So it's a bigger problem. [laughter] Yeah.
0:43:33.8 BG: Absolutely. And so one thing that discussion connects to is the broader ubiquity of some of these technologies, like facial recognition and cameras, in ways that are often targeted not at the government but exactly at these types of, say, wealthy people in San Francisco that you're talking about, whether that's facial recognition on your phone or the Amazon Ring cameras on front doors. How do you think about not just the police use of technology, but how this technology infiltrates all aspects of daily life? And then what should an individual's relationship with that tech be, as a consumer who maybe finds it really helpful or comforting to have those technologies in your house or on your devices?
0:44:20.7 KC: Well, I'm gonna tell a story, because I think you'll remember this and probably tell other people. Anybody follow basketball? Big fan of the Boston Celtics here. We almost won the whole thing last year. It's really sad that we didn't. Speaking of San Francisco. But anyway, our former head coach Ime Udoka, incredible coach, just got in a lot of trouble. Last week, it was revealed that he has essentially been fired because apparently he was sleeping with the wife of a vice president of the Celtics organization, and they did not look kindly upon this. She actually worked for the organization as well, so it was like a workplace harassment situation. And it was revealed last week by a sports reporter that the woman's husband, who is the Vice President of Operations or whatever at the Celtics, first became aware that Ime was sleeping with his wife when he heard them having a conversation. She was on the front porch, and through the Ring doorbell camera he heard this lovey-dovey conversation that she was having on her way into the house. So I would just say that you're bringing some danger into your life if you willingly subject yourself to surveillance through something like Amazon's Ring doorbell system. And there's convenience, certainly, there's...
0:46:04.8 KC: People think it's funny that they can see whatever's walking down the street when they're not home, and it may give you a sense of safety, but it could really change your life in ways that are not positive. So... Yeah, be careful, I guess. [laughter] Be careful what you wish for. I don't think that there's anything we can do about that as a society. If people want to spy on themselves and open up their lives to the rapacious desire of Amazon to collect every word that's ever been spoken and every image that's ever been available to human eyeballs, then they're gonna do that, and we can't ban it under the law. The First Amendment pretty broadly protects people's ability to create video, and I think it's kind of a social thing that we ought to try to work on. So I try to very gently encourage people not to adopt those technologies at home, but mostly people don't listen, so [laughter] now I'm just going to tell them the Ime Udoka story [laughter] and see if that changes their minds. We'll see. Although probably a lot of you will be like, "Oh, I wanna know if my wife's cheating on me. I'm getting all those things." [laughter] Not the moral of the story. [laughter]
0:47:31.5 BG: You walked into that one. So yeah, I really appreciated your real talk about the mundane aspects that go into your job. From the outside, it sounds really glamorous. I've seen you interview Celtics player Jaylen Brown, there have been articles about you in The New York Times and other places, but also...
0:47:56.1 KC: That was a highlight though. The Jaylen Brown thing was a real highlight. It doesn't happen every day.
0:48:01.4 BG: Yeah, that was very cool. But alright, so from the outside, it's this flashy, exciting life with these great wins, but the day-to-day is just a lot of meetings, a lot of conversations that are often very frustrating. And so I'm curious: what was your path into this work? You said you weren't excited about technology or privacy when you were young, so how did you get into this work, and how do you think about what makes this job exciting for you, despite or because of all of these meetings that you have to do? That sort of balance between the longer-term things you're fighting for and the day-to-day challenges.
0:48:38.4 KC: Meetings can be fun. They can be really difficult, but I actually do like talking to people. So if you don't like talking to people, don't get involved with politics, because it's a lot of talking. [laughter] It's mostly talking. But I came to this work, like you said, not through studying technology. I did do some STS work in college, and I was interested in the history of technology, but I studied history and actually came to this work through thinking about power, really, and thinking about, like I said, asymmetries in power. I've always been interested in social movements and in democracy generally, and I got a very part-time job working at the ACLU of Massachusetts in late 2009, analyzing thousands of pages of documents that the prior legal director had obtained by suing something called the Commonwealth Fusion Center, which was a spy center essentially established in the wake of 9/11 by the federal government at the Massachusetts State Police.
0:49:45.9 KC: And the purpose of this... There's one in Michigan; I'm sure you might even have two. They're all over the country, run by state and major metropolitan police departments. And the purpose of these places is to facilitate the sharing of information, intelligence, they would call it, among state, local, and federal law enforcement. It was the response to one of the recommendations in the 9/11 Commission Report after the 9/11 attacks. Congress had this big, long process, they gathered a lot of information, they wrote a big report, and one of the findings in the report was that there was a failure to connect the dots between state, local, and federal law enforcement, and that this was one of the things that led to 9/11. That's not true; it really had nothing to do with local police at all. The failure to communicate was between the FBI and the CIA about the hijackers that were in Los Angeles. But whatever, it doesn't matter that it's not true; it developed a life of its own. It was part of the reason why the Department of Homeland Security was created, and then DHS spent billions of dollars funding the creation of these fusion centers all across the country, and also funding state and local police to acquire and use new technologies like license plate readers and drones and electronic fingerprint readers. That's basically what has paid for all of the surveillance camera networks in every metropolitan area in this country: Homeland Security money.
0:51:05.7 KC: So anyway, I started looking into this in a very part-time way, and I was fascinated, because none of this was in the newspaper. None of it, at all. There was no place you could find information about fusion centers in the press, even though they had existed at that point for six or seven years in some places. And so I just got really interested in how the police were using new technology, how they were accessing database systems created by corporations like LexisNexis. We don't often think about police when we think about legal databases like Lexis, but the police are using those big data broker systems to access information about everyone, not just people who are suspected of crimes. And that really got me thinking about, like I said at the beginning, these power asymmetries that I'm now gonna explore for the rest of my life. [chuckle] Yeah.
0:52:01.6 BG: So thinking about the audience here, both in terms of their relationship to technology and for the students at all levels, their paths both within the University of Michigan and beyond, what are the key things that you recommend for individuals in terms of how they can protect their privacy or how they should think about what privacy means to them or in their communities? And for students who are interested in having some more knowledge or impact in these areas, what do you recommend in terms of courses or majors or types of ways to get involved in these issues?
0:52:42.1 KC: Great questions. Well, what can people do individually? There are privacy-preserving technologies that everybody should be using. Probably the best one is Signal. So who in here has heard of Signal? Anybody? Great, self-selected group. [laughter] For those of you who don't use Signal, you should absolutely download it and use it. It's an encrypted messaging and communications app that works on Androids and iPhones. It's totally free. The only thing about Signal is that it only works if the person you're communicating with has also downloaded the app, so just get all your family that's on WhatsApp to switch to Signal. [chuckle] No problem, right? I pretty much only communicate with people on Signal, my friends and family and co-workers, and it's even better than iMessage, actually, 'cause you can use way more emojis to react, just so you know.
[laughter]
0:53:38.3 MK: It's so much better.
0:53:38.6 KC: So you should get with Signal. [chuckle] It's really fun. And to protect your online communications and stuff, you can use encrypted email. But I would just recommend that you consider email to be public. I think it's just a better strategy that, if you really need something to be private, you use a communications platform like Signal, recognizing that the only way anybody else could read those messages is if they had your physical phone or the physical device of the person you're communicating with. Also, turn on disappearing messages, 'cause that way the messages probably won't be there for anybody to find if they do get your phone or the phone of the person you're communicating with. But yeah. So Signal... And what was the other thing I was just saying? Sorry, I got lost.
0:54:26.7 MK: Email.
0:54:27.8 KC: Oh, yeah. Email, again, I would consider public. We have a rule in my office, which I think is really smart, which is, "Don't write anything in an email unless you wanna read it on the cover of The Boston Globe." So we try to adhere to that. [chuckle] For some of you who work for this institution, your emails probably are public, actually, because they're subject to the Public Records Law. But there are encrypted email services like ProtonMail, if you really need something like that. And if you're into technology, you could actually run your own encrypted mail, but... And then... Sorry, repeat the second question again.
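For readers curious what the end-to-end property Kade describes looks like in code, here is a toy sketch using the PyNaCl library, an assumption chosen purely for illustration; Signal's actual protocol is far more elaborate, with key ratcheting and forward secrecy. The point is only that a server relaying the ciphertext cannot read it; decryption requires a private key held on the endpoint devices:

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; in the scenario being modeled,
# the private key would never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with her private key and his public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at 7")

# Any server relaying `ciphertext` sees only opaque bytes. Only Bob's
# device, which holds his private key, can decrypt it.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at 7"
```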
0:55:14.6 BG: The second question was about career paths and recommendations for students who wanna get more involved or knowledgeable about this space.
0:55:23.4 KC: Oh, yeah. Yeah, I'm not aware of all the classes that exist in this enormous institution, but I'm guessing that there are a lot of them. I think that the purpose of undergraduate education is to teach you how to think and to teach you how to contextualize your own existence. And so in order to do that, I think you need to learn some history. So I would recommend that you take some history classes. One of the benefits of learning history is that you'll realize that things have always been horrible for human beings.
[laughter]
0:56:06.4 KC: And it helps to put some perspective on the world that we live in today. Even just in this country, 60 years ago, we lived in a formal apartheid society, right? Actually, we had an ACLU dinner a couple of weeks ago, and Andrew Young, the former ambassador, who is a very important civil rights figure in this country, spoke at our dinner. It was really amazing to hear from him, because one of the things that he said was, "You know, I know that you guys are really worried about what's going on in this country, but have you ever had the shit beaten out of you by the Ku Klux Klan?" You know? Things have been bad in this country for a long time, and you're gonna be okay. [chuckle] Like, get up and fight. And so I think learning about history is really important, because it can help you understand why we're here, how we got here, how the tech industry became what it is, right? And then, yeah, I guess that's all I really have to say about that. Yeah, I'm sorry I don't have any better advice. [laughter]
0:57:13.1 BG: No, I think that's really wonderful advice and a great note to end on. Unfortunately, we're out of time. The audience and I have many more questions we'd love to ask you, but that brings us to the hour. So thank you so much for being here. Molly, thank you for being in conversation and raising such fascinating questions, and thank you to everyone here in person and online.
0:57:36.8 KC: Thank you.
0:57:37.0 MK: Thanks so much. Thank you, Kade.
[applause]