
Signal and Noise: The New Administration, Privacy, and Our Digital Rights with Cindy Cohn of Electronic Frontier Foundation
Digital rights, privacy, and government policies have been a hot topic over the past month as the Trump Administration comes on board. But the truth is, data protection and safeguarding our freedoms are not partisan issues. Regardless of what party is in power, we need to be vigilant about our digital rights and never give up the fight to protect them.
Cindy Cohn, Executive Director of The Electronic Frontier Foundation, discusses the evolving landscape of digital rights, privacy, and government policies impacting technology. With a career dedicated to defending civil liberties in the digital age, Cindy shares insights on encryption, AI governance, surveillance capitalism, and the role of regulatory frameworks in shaping the future of the internet.
Arlo Gilbert: [00:00:00] Hi everybody, this is Arlo Gilbert, co-founder and CEO of Osano, a leading data privacy management platform, and you are listening to the Privacy Insider Podcast. This show explores the past, present, and future of data privacy for privacy and business leaders alike, as well as anyone who wants to keep privacy top of mind.
Welcome to the Privacy Insider Podcast. My name is Arlo Gilbert. I'm the CEO and co-founder at Osano, and today I'm your host. With the inauguration a few weeks ago, we've already seen a few tech-related developments. Trump issued a new AI executive order that places great importance on business and innovation.
We've already seen some interesting announcements and alliances in big tech, including from Meta, Apple, and OpenAI. And we've seen renewed concern over Chinese companies, from TikTok to DeepSeek, and how they [00:01:00] use our data. What does it all mean? What's real? What's noise? And what's most important to us as privacy pros and as citizens?
As you know, I don't like to make big predictions, but I love to get insight from people who know more than I do. And that's why Cindy Cohn is here today. I don't think anyone knows more about our digital rights and how they relate to government, global conditions, and big tech. Cindy is the Executive Director of the Electronic Frontier Foundation, also known as the EFF.
She's long been at the forefront of defending civil liberties in the digital world, including the right to data privacy. She and the EFF have taken on the NSA, the Patriot Act, and have long opposed the practice of surveillance capitalism. And now, Cindy is working on a book that will come out next year.
She's here to put it all into context for us. [00:02:00] Cindy, it is great to have you here. Welcome to the show. We'll go ahead and kick things off. It's an exciting time in the world of data privacy and surveillance capitalism. A new administration has come in, but before we start talking about the nuts and bolts of politics and the world, it would be really interesting to understand a little bit more about you, the story behind Cindy, and how you got into data privacy and, uh, fighting for people's rights to keep their data encrypted. How did you get here?
Cindy Cohn: Well, I kind of fell into it, if you want the truth. You know, I'm one of those people who always had the idea that I was kind of put here on this earth to help. And, you know, some of that's the family that I grew up in.
I grew up in a very small town in Iowa. [00:03:00] We were some of the only Jewish folks in town, and that kind of gave me an outsider's perspective, thinking about, you know, how you live in a society that is open to everybody. The great thing about America is that you don't have to be just like everyone else.
But we also know that the experience of not being in the majority requires a little protection. It requires a little thinking about it. So I started thinking about, you know, how I could help other folks. I was also adopted. I had lots of historical reasons why thinking about what it's like to not fit in very well is kind of deep in my core. And, you know, I got interested in human rights when I was in college, and I actually worked at the United Nations on human rights when I finished law school. And then I moved to the Bay [00:04:00] Area and I met a bunch of crazy hackers. This is in the early nineties, so this is, you know, before we had a World Wide Web.
But these people were doing the things that I think most of us take for granted now, like living on one coast and working for an organization based on another coast, stringing Ethernet wire, and having long digital chats with each other. Distance didn't really make a difference in terms of your access to people and information.
And I became fascinated with the possibilities for our world when everybody got to use this technology that my friends were using. One of the people I met early on was John Gilmore, who is one of the founders of EFF, and he was already thinking about what we would need to have privacy in the digital age.
There was a group called the Cypherpunks. If you're kind of old school, you might've heard of them. And they were thinking about, you know, [00:05:00] not just the tools, the way that everybody could enjoy the benefits of this world, but what are the things we were going to need to put in place to protect people's rights and civil liberties?
And they asked me to do a lawsuit about protecting encryption technologies, because at the time encryption technologies were controlled by the U.S. State Department. Software with the capability of maintaining secrecy was on the list of weapons, along with, like, surface-to-air missiles and tanks.
And, you know, there are reasons why the government thought of cryptography that way, but we knew, John knew and convinced me and many other people, that this was gonna be an important technology to protect people's privacy in the digital age. So I signed on to help, and, you know, we ultimately won, right?
The reason that people have cryptography available pretty easily to protect our data, whether it's at rest or [00:06:00] in transit, is in part because of the work that I and other people did in the 90s. So again, I kind of fell into this. I was an English major. But I've always been interested in how we build, you know, a society that both empowers us and protects us.
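The "at rest or in transit" distinction Cohn mentions is worth a concrete illustration. What follows is a minimal sketch, not anything discussed in the episode: it uses the open-source Python cryptography package to protect data at rest, the kind of freely available primitive that was once export-controlled as a munition. Data in transit is typically protected separately, by TLS.

```python
# A minimal sketch of protecting data "at rest" with the open-source
# Python `cryptography` package (pip install cryptography). Fernet wraps
# AES encryption plus integrity checking behind a simple API.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # symmetric key; keep it somewhere safe
fernet = Fernet(key)

# Encrypt before writing to disk: the stored token is useless without the key.
token = fernet.encrypt(b"membership list for the organizing committee")
plaintext = fernet.decrypt(token)

assert plaintext == b"membership list for the organizing committee"
```

That a few freely installable lines of code now do what the State Department once classed alongside surface-to-air missiles is, in large part, the legacy of the 1990s crypto cases.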
Arlo Gilbert: I love that. I mean, nobody ever seems to get into these freedom-fighting roles on purpose. It's always something that people kind of accidentally stumble into. So thank you. What does the EFF do? Maybe you can tell us a little bit about the EFF, because it's a lot more than data privacy, right?
I mean, how does it intersect with what the data privacy pros are doing? Because a lot of people do think of you and the EFF as being encryption-focused; those are the fights that you got the most notoriety for. [00:07:00] But tell us a little bit about the EFF, its origins, and how it intersects.
Cindy Cohn: Yeah, at this point EFF is 125 people. We're the biggest and the oldest digital rights organization. And in part, that's because people like John Gilmore, Mitch Kapor, and John Perry Barlow, the three kind-of founders, and Steve Wozniak was involved early on as well, really saw the need to have an organization that was going to stand up for people's rights in this new digital world that was coming.
And so they founded EFF in order to do that. People often say, oh, the early internet people were really starry-eyed about how good things were going to be. And I always point out that, like, you don't found a civil liberties organization if you think everything's magically going to be okay.
It's a bit of a misreading of the history to project it backwards and think that nobody thought there might be fights for people's rights, or unjust things [00:08:00] happening online. That's not true. In some ways that kind of erases me and EFF from history. And part of the reason I wrote a book was to kind of correct the record a little bit, because this seems to be a cultural story we're telling ourselves about the early internet.
And I'm like, well, hey, I was there, and that's not actually how it was. Again, you don't create an organization like EFF if you think everything's magically going to be okay. So what are we? The biggest piece of our team is lawyers: impact lawyers who go to court or advocate in other arenas for users of technology more broadly.
So we have a bunch of lawyers. We also have a bunch of, kind of, lobbyists might not be the right word, but people who focus on the congressional branch of government. Some of them are lawyers, some of them are not. But we also do something that no other group does, which is we have [00:09:00] a very large group of technologists who advise us, who make sure that we're right, that we're really grounded in how the tech actually works. Sometimes in policy spaces, people get a little divorced from how the tech works, and we are grounded, in terms of what we think about it, but also in the people that we have. And then we have activists, people who write our blog and make sure that things get out on social media and run campaigns.
So if we're fighting something in Congress, we're not just in the halls of Congress advocating, but we're out there in public trying to talk to people. These three branches, law, advocacy, and technology, are kind of the three pillars of what we do. And I think what makes us really strong is that we bring all three of those things.
Arlo Gilbert: So, you know, you opened Pandora's box here. You mentioned that you're writing a book. Tell us about this book. [00:10:00] I mean, I know I'm gonna read it, and I'm sure a lot of the folks listening today would enjoy it.
Cindy Cohn: Well, I mean, again, there's a couple of reasons for the book. One of them is to correct the record.
I feel like sometimes the early internet, especially the first 20 years, is kind of only talked about in terms of guys and their companies. You know, we think about the early internet and people think about Steve Jobs and Bill Gates, as if we were looking at history and the only people we talked about were the railroad barons, right?
As if that's the only thing we talked about in early America. It isn't that it's untrue, but it's not the whole story. So I felt like, you know, I was kind of privately complaining to my friends that I was there in the 90s, and those guys were around, but they weren't the whole story.
And, you know, why is the story all told about guys and their companies? And I was like, well, maybe I should put my [00:11:00] keyboard where my mouth is and tell the story of what it looked like from where I sat. So that's one reason. And, you know, hopefully to try to inspire more people to think that this is something you can do, that if you care about technology and you care about rights, going into the corporate side isn't the only path. There is a huge digital rights movement now, which is, for me, actually kind of stunning at this point, since we were the only ones when things started. There are lots of people who are able to devote their time and energy to this.
So I was hoping that my story might inspire more people. I'm pretty much always recruiting people into the idea of rights as a career you can pursue, something you should put your talents into, or volunteer for. So, you know, I wanted to correct the record. I wanted to inspire a few people.
And there are some stories that I think need to be told about how important privacy is, and how this fight over [00:12:00] protecting people's rights is really as old as the web, even older. So it's a professional memoir. It tells the stories of three of the big privacy cases that EFF's been involved in and that I've had the, you know, pleasure of leading.
The first one is the cryptography fight in the 1990s, the crypto wars. The second one is the NSA's mass spying programs. And the third one is a fight around national security letters, which are these secret subpoenas that I suspect privacy professionals in your community are more aware of than a lot of other people.
The government, in the Patriot Act, gave itself the power to issue secret subpoenas to tech companies and telecommunications companies, demanding information about their customers and gagging them forever, so that they could never tell anyone that this happened. And this has profound implications, not only for the privacy of the people involved, but because, you know, Congress periodically is supposed to look at these programs and decide whether they're good or not.
And it [00:13:00] really gagged the people who had direct experience from being able to participate in the political process and tell Congress about the problems, and that directly happened to some of our clients. So those three stories are the three that I tell. And then I weave in some of my personal history as well, because I think if you're gonna convince people this is a good way to spend your life, and you're just talking at the level of analysis or policy, it's not engaging at the level that I wanna engage at. I think there are a lot of people who analyze privacy and think about privacy, and some of them are really amazing, but I spent a lot of time actually doing it, and I think that's an important piece of the story.
Arlo Gilbert: You know, it's funny you mentioned the telecommunications piece. I actually owned a phone company in the mid-2000s. We were a voice-over-IP provider, but we were also a CLEC. I remember [00:14:00] getting some of those notices from the government where we would have to turn over subscriber information, and, yeah, we were absolutely banned from talking about it.
And it was very weird. So, all right, let's talk a little bit more about the kinds of policies that you guys are focused on. For example, you wrote a white paper about privacy and civil liberties in 2023 that really aligns with our ideals and guidance on the subject.
And, you know, I won't bore everybody with all the details of it, but the thesis is that privacy-first regulations and practices are the best way to prevent most of our online harms from happening, whether that's exploitation of children, AI misuse, or surveillance capitalism. 2025 has already been off to a pretty wild start.
So is there anything that you would add or amend to what you wrote in the 2023 paper? I mean, [00:15:00] the world's already changed.
Cindy Cohn: Yeah, it's changed a lot. I would say, you know, we wrote the paper because, across the set of issues that EFF works on, we kept seeing really bad ideas floated to get at problems that we could solve if we thought about privacy first.
And we saw this kind of across the board. My intellectual property team saw it in the context of IP fights around AI. My straight-ahead consumer privacy team saw it. My team that was working on trying to protect free speech saw it in the context of lots of proposals to weaken Section 230, or basically make platforms liable for what people do on them.
And we were all talking, and we're like, you know, if we actually passed really good privacy protections that got at the heart of this problem, all these problems might not go away, but they would [00:16:00] be very different. They would be very reduced, because all of them are basically fueled by surveillance capitalism and the surveillance business model.
So yeah, we wrote the paper to try to pull all these disparate strands together. So what's happened since? Well, the TikTok ban is a good example. We've seen worries about the Chinese government having access to all our data. But banning one little app is not actually going to protect people against Chinese spying.
Why do we know this? Because we know about the data broker industry, and we know that there are many, many other apps and many, many other ways. It's like Swiss cheese: it's like plugging one tiny little hole in the Swiss cheese, missing all the other holes, and then pouring the milk through and thinking that nothing would happen.
That's not my best metaphor, but it seemed to us completely ridiculous. Even if you're really worried about [00:17:00] Chinese spying, TikTok isn't where you'd start. I don't even think it'd be on, like, the third page of things that you would worry about. General privacy protection against the data broker industry and the data collection, for all of the companies that are involved in spying on us, both the direct companies and the back-end data broker industry, will give people protection against spying.
Much better than banning one little app, or two little apps, or three little apps. And yeah, TikTok's not little in the context of this, but it's little in terms of how much of our data is being collected by how many companies at this point. So this is another one, if I was writing the paper today, that I would add: if you care about national security, privacy first, right?
Give people ways to stop all this massive data collection at the source, and empower them. That's the way to deal with the national security problems that people are raising around data collection and data privacy. It's kind [00:18:00] of funny, right? We've been talking about these problems with data privacy and the need to protect data privacy for a very long time.
And suddenly the U.S. Supreme Court and Congress are like, oh, we really need to care about data privacy, so here's this tiny little stupid thing we're going to do rather than the big thing. And we're like, wait, go over there. There's a big thing you could do, and it would actually help. That's one I would definitely add. But we're still seeing age-gating of the internet; those other proposals are not going away in 2025 either.
Arlo Gilbert: So talking about 2025, it would be impossible for us not to talk about Trump, right? We have a new administration in place, and no matter how anybody feels about the election, [00:19:00] he's undoubtedly the new president, and there's going to be impact on lots of different kinds of policy.
So far he's doing exactly what we expected: he's championing AI and innovation, he's talking about deregulation. Have you seen anything that's unexpected regarding digital policy, or anything that perhaps is throwing up red flags?
Cindy Cohn: Well, I think that, in general, what we're seeing is really not what I think of as deregulation in the kind of libertarian sense, right, of shrinking the government and creating more space for all of us to compete and to live outside of government.
What we're actually seeing, I think, is a shifting of the reins of power from one set of people to another. And I think that's important to pay attention to in the frame that you're looking [00:20:00] through, because lots of people are talking about government efficiency or shrinking the size of government. And it may actually shrink a little bit.
But what we're really seeing is one set of people grabbing the reins of power for themselves and their team, right? You know, the old Who song: meet the new boss, same as the old boss. We're just seeing bosses change right now, for most of what we're seeing.
And I think that's an important frame, because especially for a lot of technical people who deal with government regulatory structures, it can feel frustrating, and you could wish there were fewer hoops. But we need to look at, like, are there actually fewer hoops, and the right fewer hoops, or is it just that you have to play up to a different leader now rather than the bureaucrats who were in charge?
So I think that's one thing we're seeing, and we're starting to see it really play out: this was a campaign that was based on reducing government bureaucracy, [00:21:00] but it's actually just installing a new set of people who are going to be in charge of most of those reins of power.
Arlo Gilbert: Speaking of the people in charge, this might be a good point to diverge into some of the interactions we've seen between big technology and Trump. I mean, they've all donated to the inauguration. We saw almost instantaneous post-election behavioral changes from all of the big online tech companies, and we're watching Elon and Trump. I don't know if they're camping out together in the White House and making s'mores at night, but it feels like it's that intimate. I guess I would kind of lump Elon in with big tech as his own thing; you know, he is big tech.
[00:22:00] What's your take on the interactions there and the relationships?
Cindy Cohn: Well, I think one thing that we're seeing, and I think it's really sad for a lot of the people who work in tech and care about tech, is a big cleaving between the people who work in tech and build the tools, and the CEOs, the people who lead these companies. It used to be the case, I think, that you could join a company and feel pretty strongly that the people at the head of the company shared your basic values and looked at the world the same way you do. And I think a lot of tech workers are very sadly recognizing that that's not true anymore for the people who lead, whether it was true or not before.
I think there are going to be consequences from that cleaving in terms of talent, honestly. It's unfortunate that some of this is happening, and I think actually predictable, as we've seen the very competitive [00:23:00] landscape for technology companies that existed when I first got involved in this, in the early 2000s, shrink into five big companies.
Arlo Gilbert: Yeah. It's really amazing how fast it's all consolidated in the last 10 years.
Cindy Cohn: And I think it's that consolidation that makes it possible for the CEOs to chart such a different course from their workers. But I think there still will be consequences in the long run.
They've really attracted the best and the brightest thinkers into their companies. I think they're going to have a hard time keeping them as the tech company leadership really shifts from thinking about the public interest to currying favor, in kind of a less democratic, less due-process kind of focus. And I think people still believe that, you know, it's better to not have dictators than to have dictators. And as the CEOs [00:24:00] shift into the way you function when you're working under a dictatorship, I think they're going to lose some talent.
And we're going to see maybe European companies getting the best and the brightest, or, you know, other people choosing other things. So I think it's problematic for tech in the long run to shift to this dictatorship strategy. I do human rights work; I've done human rights work alongside this work for a long time. I know what happens when you get a strong leader. I'm not calling Trump a dictator at this point, but I think that there is a strategy for dealing with one kind of leadership that's different than the strategy when you're dealing with a leadership that is committed to rule of law and due process.
And we're starting to see those kinds of behaviors from the CEOs, and it's getting noticed by the tech world. I also think it's [00:25:00] going to be short-sighted. One of the things that I've been involved with in the last couple of weeks: many, many of the people who build the tools of privacy and anonymity and free speech, those tools are used by people around the world, especially people who live under actual dictatorships, under repressive governments.
The State Department has a little program, very small by State Department standards, that supports a lot of these technologies and does the kind of plumbing work, right? Keeping up tools like Tor, or tools that help create VPNs for people who are living in China or North Korea or other places, so they can do organizing and protect themselves.
They pulled the plug on the funding for a lot of those little groups as part of the broader executive order affecting the State Department. And so again, I think they are [00:26:00] also taking kind of a bulldozer to a lot of things. There are a lot of babies in the bathwater, let's just say. Even if we assume that there's a lot of bathwater out there, they're not paying attention to the babies in the bathwater, in a way that, again, I think is short-sighted for technology and short-sighted for freedom.
Even if you're prioritizing certain kinds of freedoms over other kinds of freedoms, if you're prioritizing free speech and anti-censorship and religious freedom over some of the other things that are in the Universal Declaration, these tools help everybody. They help those people too.
And so I'm sad about this blunderbuss approach, because people like, you know, Elon Musk should know better than that. But they get a lot of kind of general applause for being bulldozers and not caring about the consequences, and that's what we saw when Musk took over Twitter as well.
Again, I just think it's short-sighted. [00:27:00] Making sure that the marginalized people of the world, and by that I include all sorts of people, all the way back to my childhood in Iowa, have tools to protect their privacy: anonymity is one of the values that we care about for privacy.
It's central for human dignity, but it's also central for organizing and making change. Unplugging all of those systems is gonna hurt. It's gonna hurt people in the short run, obviously, but in the long run, I think it's an anti-innovation strategy.
Arlo Gilbert: Interesting. So talking about innovation: innovation right now, at least in the news cycles, is almost entirely dominated by AI. And for those of you listening, there are lots of kinds of AI.
The kind that we talk about the most lately is large [00:28:00] language models, but there are many other kinds of machine learning and AI. And I'm really curious, do you have any concerns around AI and this particular administration? Is there any particular focus on civil rights, or on over-reliance on AI?
I could see Elon trying to replace an entire department with, you know, Grok, or whatever they're calling it now, xAI. What's your take on that?
Cindy Cohn: Yeah, I think that's really problematic, right? We know that with great power comes great responsibility, and these technologies are very, very powerful. And EFF has been involved for many years in trying to make sure that government use of AI is responsible. Again, you're right: people are focused on large language models, and we certainly have thoughts about that. But we're also talking about stuff that, honestly, I don't know even qualifies as AI. Some machine learning, and simpler [00:29:00] government decision-making systems that decide: do you get bail?
Do you get out of jail? Are you getting called in for a lineup because a facial recognition system identified you? Are you going to get your kids taken away because the model that social services is using to decide if you're a good parent or not is running on an AI?
Those kinds of systems. And I think about separating them in some ways, too. The systems that are trying to predict future human behavior are really where AI fails, and it especially fails people who are marginalized. I don't think you can be a professional at this point and not realize that these AI systems are really bad and misidentify women and people of color at a much higher rate than white guys, right?
Why? Because they're replicating some of the biases in the things that they're being trained on. You know, garbage in, garbage out. If you're going to train a system on a biased system, you're not only going to get bias, you're actually going to get doubled-down bias.
You're going to get more bias. Our friends at the Human Rights Data Analysis Group, which is my husband's data analysis group, discovered that if you've got biased information about what the cops do going into an AI system, at like 50 to 60 percent bias, the actual machine learning algorithm they're using will come out with 70 to 80 percent bias on the other side.
So it's not just that AI replicates bias in society. It's actually looking for patterns, and when it sees them, it doubles down on them. So you've got patterns of bias in the training data. Now, people are getting better at adjusting for that, but that's kind of going backwards, right? Like, create a biased system, then try to figure out how to adjust the weights not to be biased anymore.
What a crazy way to think about the world, right? The baseline should be unbiased to start with. So we have been working at this in the context of, again, police, this thing called predictive policing that has been named a few times, things [00:31:00] around housing decisions, around governmental decision-making in law enforcement and in regular kind of social services, health and human services things.
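To make the amplification Cohn describes concrete, here is a minimal sketch with invented numbers; it is not HRDAG's actual model. The assumption doing the work is the exponent: any allocation rule that responds superlinearly to past arrest density behaves this way, while a purely proportional rule would leave the 60/40 split unchanged.

```python
# A minimal sketch (illustrative numbers, not HRDAG's study) of a predictive
# policing feedback loop. Underlying offense rates in districts A and B are
# identical, but the historical arrest record is skewed 60/40 because of
# where police looked. A model that over-weights past arrest density
# (exponent > 1 stands in for any superlinear response) amplifies the skew
# on every round, since new arrests are only observed where patrols go.
def reallocate(shares: dict[str, float], exponent: float = 2.0) -> dict[str, float]:
    weights = {d: s ** exponent for d, s in shares.items()}
    total = sum(weights.values())
    return {d: w / total for d, w in weights.items()}

shares = {"A": 0.60, "B": 0.40}  # the bias already baked into the training data
for rnd in range(1, 4):
    shares = reallocate(shares)
    print(f"round {rnd}: district A gets {shares['A']:.0%} of patrols")

# round 1: district A gets 69% of patrols
# round 2: district A gets 84% of patrols
# round 3: district A gets 96% of patrols
```

One pass through the loop takes the hypothetical 60 percent skew to roughly 70 percent, the same direction as the 50-to-60-in, 70-to-80-out pattern described above, and each further round of retraining on the system's own outputs pushes it toward an extreme.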
And, you know, one of the things that was especially troubling in the first Trump administration was an effort to float a rule that said that if the machine made a decision about you, it wasn't appealable. There was nothing you could do about it, because the machine's decision was final.
I mean, that is exactly backwards, right? Like, that's like...
Arlo Gilbert: Minority Report. I mean, the precogs have decided that you have broken the law or will break the law.
Cindy Cohn: Yeah. I think this is an area where people who work very closely with these systems know better. We know there is a cognitive thing where, if a machine makes a decision, people tend to believe it.
But I think people who work closely with these machines know that's exactly backwards, right? That's just a way in which our brain is tricking us, and we have to fight that. So that's something we've been doing a lot of. We know that these systems are [00:32:00] bad at predicting human behavior. I mean, they're not bad at predicting whether you want shoes or not.
But I mean the kind of human behavior that can get you arrested, or that decides whether you get a mortgage, or whether you're a terrorist. Those systems are especially bad because of the training data; there just aren't very many examples to train up a model on, and they look very different.
So these kinds of systems are the ones that we should be very skeptical of. And I do worry that this is being lost in the rush to, you know, apply a little AI.
Arlo Gilbert: It's really fascinating, because you really have a tale of two cities right now. I mean, in some regions like Europe, and in some of the states in the United States, there has been really progressive legislation that says you can't use machines to do automated decision-making about, you know, health care, or whether or not somebody [00:33:00] gets a mortgage.
But it feels like in the United States, the pull from technology and regulators is going to be much more about automated decision-making and increased reliance. And it just feels like it's at such odds with other countries and other people's views. Are we going into a world where there's, like, the U.S. internet and then there's everybody else's internet, because we can't play nice?
Cindy Cohn: I don't know. I'm not sure if that's going to play out that way, but I think there's a good chance of it. You know, splintering the internet is definitely one way that people deal with bad decision-making on a government-to-government level. I think it would be such a loss, right? The great benefit of the internet was that people could really build connections despite physical distance.
And I think if we unplug this main benefit, we're all going to be the poorer for it. But I do worry about that. And one of the things that the Trump administration did was repeal the Biden administration's executive order on AI safety, which I think was really a good attempt to set up the guardrails and frame the thinking about where AI is appropriate and where it isn't.
I think those are going to end up being important if we don't want people squashed. You know, you can go too far with government; I don't think government has all the answers either. But I do think government can be useful for setting kind of the outer boundaries, right?
The fence lines.
Arlo Gilbert: Which we need right now. We are in the middle of developing essentially, you know, AGI or close to it. It would probably be good to have some rules in place.
Cindy Cohn: Yeah. And again, you could quibble with whether the European approach, which is very rule-based and regulatory, is a good idea. We have some concerns about that too, because [00:35:00] I think it's just going to empower the tech giants of today. We are also a big fan of open source for models and AI training.
I think the DeepSeek model is better and more efficient because it relies on the open-source models. And if you care about innovation, you shouldn't want to recreate these big walled gardens where only Sam Altman gets to decide what uses we make of technology, or Musk, or any of the current oligarchs. So we're a fan of open-source models. I mean, I think DeepSeek is problematic for some other reasons, perhaps, but in this race of who's better at AI faster, open is winning. And that's really important if you want to build a better thing.
But what Trump did by [00:36:00] pulling the executive order was really remove governance entirely from a lot of these things, in ways that I think are going to be problematic. So you want competition, but you don't want open season, right?
Arlo Gilbert: So I have two directions; you talked about two different things here. One, you talked about the AI executive order being repealed. And speaking of the executive branch: in 2020, you gave the Biden administration a transition memo, kind of outlining the key policy areas the EFF really thought would need focus.
So I guess the first question is, did you do something similar with Trump? Are you guys delivering a similar memo to Trump?
Cindy Cohn: We already did. You can find it on our website, and it's a pretty comprehensive view of how EFF thinks about the policy space.
Obviously, I've got lawyers doing [00:37:00] constitutional law, and I've got technologists building technologies. But in terms of the policy space for the executive branch and Congress, it's a pretty comprehensive look at our priorities. And, yeah, the spoiler alert is it didn't change very much between them, not because the administrations aren't different, but because we're not different.
Our policy positions, our philosophy, our view of what's best for the digital world doesn't really change. The fights look a little different depending on who's in power, but we end up fighting either way, because fighting for the principles of freedom and justice and innovation online, like, those principles don't change depending on who's in power.
Got it.
Arlo Gilbert: So if you had to grade the Biden administration: you gave them this memo in 2020, 2024 comes around, and the teacher gets to look at [00:38:00] their students and say, how did they do? What do you think?
Cindy Cohn: B-minus to C, I guess. I mean, it's hard because it's an overall grade, right? I think we agree with the Biden administration's work on network neutrality.
We think that's important for building a competitive landscape. We were supportive of the executive order on AI; we thought that was a pretty wise way to look at it, and it kind of set the blueprint for what AI regulation ought to look like. We think the increased antitrust enforcement work that was done is important, and that competition law is really important to bring to bear to try to level the playing field and create space for new innovation.
And privacy: I think that the FTC and some of the other parts of the Biden administration did a really good job standing up for privacy and bringing serious accountability for privacy [00:39:00] violations. So I think all of those are things that the Biden administration did pretty well.
We absolutely disagree with them about the national security stuff. They were very strongly in favor of continuing mass surveillance under what's called Section 702 of the FISA Amendments Act. And honestly, it was the people on the right side of the aisle who stood with us to try to scale back the mass spying.
We were able to kick the problem down the road for a couple of years, so it's going to come up again next year. But they also expanded who can receive some of these mass surveillance orders. So the Biden administration was not good on mass surveillance. And, you know, we're going to have another fight next year, and this will be a good situation to see where the rubber meets the road, right?
The people on the Republican side who were concerned about mass surveillance when the Democrats were in power: are they just going to flip and be all fine with it when they're in power, or are they going to stand on the principle that we need to reduce it?
Arlo Gilbert: It's funny how [00:40:00] that surveillance state works.
You hate it when you're not in power, but as soon as you're in power, it's not as bad as you thought it was.
Cindy Cohn: Exactly. And so you become convinced of your own righteousness, right? And that's the real problem. So we'll see, right? You asked me what we think about the changes that the Trump administration is bringing, and this is going to be one of those tests.
Are they going to walk their talk about shrinking the surveillance state, or was that just a ploy to, you know, get themselves in, to be the new bosses rather than the old bosses? So we'll see about that. That's an area where I would give the Biden administration an F, a failing grade, because I think it's in the interest of the American people to not let national security be, like, the thing you throw on the table and you win, right?
With rule of law and due process, there shouldn't be, basically, an escape hatch from all of those protections for Americans just because you say the words "national security." And both the Democrats and the Republicans have created this situation, and, frankly, the courts.
If we're going to get to a place where we think about the American people as actually having real rights and being a free people, we can't have an escape hatch in our rights. I mean, this country was founded in a national security fight, right? If national security was a card you could lay down that stopped all fights like that, the king would have done that, and we wouldn't have a country.
It can't be the case that national security is suddenly now a reason that we dispense with all the freedoms the founding fathers worked so hard to put into place. That's crazy.
Arlo Gilbert: There's the [00:42:00] very famous quote, you know, those who would trade privacy for security deserve neither, or something along those lines.
Cindy Cohn: The Benjamin Franklin quote. And it's "those who would trade liberty for security deserve neither," I believe; there's some version of that. And I think, you know, Ben was right, and the founding fathers were right. I'm kind of a constitutional law nerd, a little bit. Thomas Jefferson had a cipher, right?
He used cryptography. He and John Adams: Jefferson is in Paris, right? And Adams at one point was in England, and Madison is in the US, and the rest of the founding fathers are in the US, and they're using cryptography to send messages back and forth about how they're going to overthrow the king, right?
I just think that if we want to not have a world in which we have kings anymore, we need to protect the means of private communication. Having a private conversation in the digital age, it's not just about protecting, you know, your porn habits; fine, do what you want. I think in a free society, people get to make choices that might not be the ones other people would make, and privacy protects that as well.
It protects human dignity, it protects families, but it also protects this kind of foundational freedom. And so for a lot of people working in privacy, you know, sometimes you get caught in the little minutiae of it, or people are like, if you have nothing to hide, you should have nothing to fear.
I think that's really short-sighted. Privacy is one of the ways that we protect our freedoms, just like free speech. Even if you don't want to go out and stand on a street corner and hold a sign, free speech is a really important piece of the plumbing that helps keep our society free. I used to [00:44:00] joke that EFF was the plumbers of freedom.
Arlo Gilbert: So we're talking a little bit about people and their rights. Do you feel like Americans know enough about their rights when it comes to privacy? I mean, you guys are out there fighting these battles in courts and in Congress and everywhere else. Do you sometimes feel like the people you're fighting for don't know and don't appreciate what's happening? Do they understand what's at stake?
Cindy Cohn: Yes and no. I think that if you do surveys, and they do these surveys regularly, people care about their privacy. There are some people who need to know more; I think that's probably right. But there's actually just a nihilism, a sense that there's nothing you can do, that you have to give up.
And again, the companies that want all our data want us to think that, right? They want us to think privacy is dead, and so we just have to get over it, we have to live in this new world [00:45:00] where we have no privacy. But it's in their interest. You gotta ask: who's telling you this? And why are they telling you this?
If they can convince you that your rights don't matter, then their job is done. They don't have to take them away; you take them away yourself, you know? John Perry Barlow, one of EFF's founders, who I miss very much, used to say, nobody gives you your rights. You have to take them.
And, you know, that's what a civil liberties organization is there for, right? If everyone was just going to magically give you all your rights, you wouldn't need them in the Bill of Rights, you wouldn't need them in the Constitution. A benevolent government that respects all your rights doesn't need any of the things that the United States was founded on.
Arlo Gilbert: And I haven't seen a lot of governments that just kind of by default respect all your rights.
Cindy Cohn: Well, a benevolent dictatorship is not what we want anyway. The answer to a bad dictatorship [00:46:00] isn't a benevolent dictatorship; it's no dictators.
So, anyway, I think that far more people have given up and think that there's no way to get to privacy than don't know that it's a possibility. I do think privacy is a basic human right, and it's a basic human right because all people want it. Nobody wants a government agent sitting in your living room listening to you and your friends talk about whether you like the government or not, right?
There's an old movie I love called The Lives of Others, about the Stasi in East Germany; it's basically about the guy who listens in on an activist. Nobody wants to live in that world, right? But I think that we have been sold a bill of goods that the only way to make money on the internet, the only business model that works on the internet, is one [00:47:00] that surveils all of us.
I don't think that's true. Maybe it means that the CEOs don't get their second island, and maybe they only get one island, but there's plenty of money to be made in business models other than the surveillance business model. That's just a crock that they're trying to sell.
And advertising doesn't have to be surveillance advertising, right? I'm not saying that no advertising can happen on the internet. I'm saying the creepy kind of advertising that tracks everything you do, feeds it into an algorithm, and tries to predict what you're going to do next: that extra money that they make, that tiny little margin, isn't worth the trade-off for society. And that's why we think that surveillance advertising can go away and the internet can survive. I think that people need to believe that they can have power, [00:48:00] that they can fight back against this, and that the world will be okay if they do.
And I think all of those things are true. It's just hard, right? We're being drowned out by the people who have all the money and all the power, who are trying to convince everybody that there's nothing they can do.
Arlo Gilbert: All right. So one of the things that we like to ask on the show, in the vein of surveillance capitalism, which is a very scary phrase.
I mean, just the phrase "surveillance capitalism" sounds like it ought to have a dun-dun-dun audio sting right after it. Is there anything that you do that you think of as, like, a guilty pleasure, or something that you do that you wouldn't really recommend other folks do with regards to their data or their use of their data?
Cindy Cohn: Yes, of course. I would say that the thing I feel really strongly about, part of the reason I love this job and the stuff we're doing, is that I like all the toys. I want all the stuff. I really reject the idea that the only way, as I said, to have this kind of stuff is to have surveillance.
I want all the goodies without all the surveillance. And so I feel like it shouldn't be a guilty pleasure to use your maps program, or to use social media even. I mean, I have shifted away from the surveillance business models; I did that long before it was cool. But I don't like it: I miss the things that I miss, the people that I miss who are not there. I don't think the abstinence, be-a-digital-monk model is the right way, and "feel bad if you use technologies" is a bad model. So I just kind of want to push back on your question. I don't actually think treating this as a guilty pleasure is the right model, because that makes people feel like they're the bad ones when they use these technologies, as opposed to the ones being forced into a ridiculous choice that nobody should be forced into, right?
Nobody sells you a car without brakes, right? We don't say, oh, well, here's this car, but then you have to go out and sort through 20 million different options to decide whether you want brakes and windshield wipers. And nobody would feel guilty because they put brakes on their car, right? So I just feel like we deserve all the toys. We deserve all the goodies. We deserve all the innovations, and we deserve not to feel bad because of the business model that they're on. And the way to do that, sadly, is comprehensive privacy protections, right?
That's the thing: just like nobody can sell you a car without brakes, nobody should be able to sell you a social media thing that tracks you and puts you at digital risk, security risk as well as privacy risk.
Arlo Gilbert: I love it. A contrarian answer. [00:51:00]
Cindy Cohn: Well, you know, I'm a fighty lawyer, right? I'm going to fight the hypo. So, gotta love it.
Arlo Gilbert: Well, Cindy, thank you so much for joining us today. It's been a real pleasure having you here, and we really appreciate all the hard work that you and the EFF are doing on behalf of all of the citizens of the United States and the citizens of the world.
So thank you for joining us today.
Cindy Cohn: Thank you so much. I really appreciate you having this conversation and speaking to the community that you speak to. I have a podcast too, called How to Fix the Internet. I really feel like people are circling the drain about how things are, and unless we have a vision for a better world, we won't get there, right?
You can't build a better world unless you can see it. So I appreciate you continuing to lift up, like, how do we get from here to there?
Arlo Gilbert: Thank you for listening to this episode of the Privacy Insider Podcast. You can find a full transcript of this episode and any show notes at osano.com. That's www.osano.com.
[00:52:00] And while you're there, get access to an excerpt of my book, The Privacy Insider: How to Embrace Data Privacy and Join the Next Wave of Trusted Brands, which is now available on Amazon for purchase. Until next month, take care. And remember, data privacy is a fundamental human right, y'all.
Arlo Gilbert is the host of The Privacy Insider Podcast, CEO and cofounder of Osano, and author of The Privacy Insider Book. An Austin, Texas, native, he has been building software companies for more than twenty-five years in categories including telecom, payments, procurement, and compliance.