Privacy is the New Celebrity

Ep 8 - Jillian York on Corporate Censorship, Surveillance Capitalism, and the Problem with Content Moderation

October 07, 2021

In episode 8, MobileCoin's head of business development Brady Forrest interviews Jillian York.  Jillian is the Director for International Freedom of Expression at the Electronic Frontier Foundation, and she just wrote a new book called Silicon Values: The Future of Free Speech Under Surveillance Capitalism.  Brady and Jillian discuss why companies like Facebook and Twitter have had such major problems with content moderation, and debate the boundaries between necessary de-platforming and unintended consequences on free speech. Jillian explains the hypocrisy behind companies over-focusing on content like nudity and women's bodies while frequently ignoring hate speech and incitement of violence. Brady and Jillian talk about the differences between content moderation in Germany and the US, and whether it's possible for the internet to become both freer and safer at the same time.

[00:08] - Speaker 1
Welcome back to episode eight of Privacy is the New Celebrity, a podcast about privacy and technology. My name is Brady Forrest, head of business development at MobileCoin, and I'm super excited to be sitting in the host chair for the very first time. And for my debut interview, I'm honored to welcome Jillian York onto the show. Jillian is the Director for International Freedom of Expression at the Electronic Frontier Foundation. Her work focuses on state and corporate censorship and its impact on human rights, with an emphasis on marginalized communities.

[00:46] - Speaker 1
Jillian is also the author of the new book, Silicon Values: The Future of Free Speech Under Surveillance Capitalism, which takes a hard look at how corporations have diminished our freedom of expression through censorship. Jillian, thank you so much for taking the time to join us on Privacy is the New Celebrity.

[01:06] - Speaker 2
Thank you for having me.

[01:10] - Speaker 1
So one of the reasons I wanted to talk to you is because you just wrote this book, Silicon Values, all about surveillance capitalism, and that's one of the things we work on. Can you tell us what it's about and why you decided to write the book?

[01:24] - Speaker 2
Sure. So about a decade ago, I started really wondering why it was that corporations had so much power over our speech in the United States and really all over the world. I don't come from a legal background, and I'd been living in Morocco teaching English for a few years. And really, that's where I kind of encountered Internet censorship for the first time. What I learned was that the government there was censoring the Internet by blocking things at the root. But there was a situation that arose that really made me kind of question what role Facebook played in all of this.

[01:58] - Speaker 2
And that was that a young person that I knew had created a page on Facebook that called for the separation of education and religion, which is kind of crossing one of the red lines in that country. And that page was taken down from Facebook not once, but repeatedly. Every time he would put it back up, it would be removed. And so my question was, who's doing this? Is it the government? Is it Facebook? I never actually got the answer to that specific question, but it basically threw me down a whole rabbit hole of questioning what role corporations had in all of this: in governing our speech, in deciding what we can say, how we can express ourselves, and where the law fits in.

[02:41] - Speaker 2
And so it's been a journey. And the book really covers that journey, as well as some of the more recent questions that have arisen with respect to content moderation and surveillance capitalism.

[02:52] - Speaker 1
So where do you see this going for different corporations? Who do you think can stop this kind of deplatforming?

[03:02] - Speaker 2
Yeah. I think we're at a precipice right now.

[03:05] - Speaker 1
Really?

[03:06] - Speaker 2
I think that Facebook itself, Facebook being kind of the behemoth, the giant company out of all of them, and also the one that receives the most criticism, and probably rightly so. Facebook is not going to solve these problems. And I think first we have to recognize that. So let's get that out of the way. They're not getting this right. They've tried a bunch of different things. They've put an Oversight Board in place, external to the company, to try to solve these problems for them. And even that, while they're doing a good job, is not going to find a solution.

[03:40] - Speaker 2
Then you've got companies like Google, which is also quite big and really is kind of taking a step back and not really engaging on these big societal questions. And then you've got Twitter, which I think is doing its best, really trying to find a balance between allowing free expression (they always called themselves the free speech wing of the free speech party) and, of course, tackling some of the things that have come up on that platform, like recruitment to terrorist organizations and harassment of women and other individuals.

[04:12] - Speaker 2
And so those are the three big ones that everyone talks about. But I don't think that any one of them is getting everything right. And I think that the solutions are not going to come from them, but possibly from the next generation and from the individuals, the experts, the academics, civil society activists, et cetera, who have been studying and working on this issue for so many years.

[04:33] - Speaker 1
Who are some of the people you look up to in this regard? Who are some of the folks you regard as mentors?

[04:40] - Speaker 2
That is such a good question. A big one for me is Rebecca MacKinnon. She's someone who was looking at this question before I was. She has a background in China and journalism and had been looking at the ways in which China was censoring the Internet, and specifically using companies within its territory to do so. That was really early days, and I think she's a real pioneer in this space. Her book, Consent of the Networked, which came out in, I think, 2012, is foundational reading on this. But there's a lot of other people that I look up to in this space.

[05:16] - Speaker 2
Peers like Dia Kayyali and Kate Klonick; folks like Shoshana Zuboff, who, of course, wrote The Age of Surveillance Capitalism, which I drew on for the title of my book; and many of the people all over the world that I've partnered with over the years. Too many to name. Many of them are in the acknowledgments of my book. But there's just such an incredible array of people who are tackling these problems.

[05:41] - Speaker 1
Can you tell me how you define surveillance capitalism and how it might differ from Shoshana's definition?

[05:47] - Speaker 2
I'm not sure that my definition really does differ from hers. I mean, the way that I see surveillance capitalism, and I'm obviously paraphrasing from her here, she wrote a very long book about this, but I see surveillance capitalism as a system in which companies are sucking up all of our data, taking our most personal thoughts, our most personal ideas and questions and images, et cetera, to sell things back to us. As they say, you are the product, not the customer.

[06:17] - Speaker 1
When you write, you often refer to corporate censorship. Can you give a few more examples?

[06:24] - Speaker 2
Sure. So when I talk about corporate censorship, and I think that there's a lot of confusion, especially in the US, around the word censorship, I think that people tend to conflate it with the First Amendment. But censorship is not inherently, by definition, an act that has to be carried out by a state. So historically, we've seen a variety of different actors take on the mantle of censorship, from the church or the mosque or whichever religious body you prefer, to corporations, over the years. And corporations, the company town historically in the United States, for example, have played a role in limiting people's ability to, for example, spread the word about protests. And malls.

[07:07] - Speaker 2
The shopping mall. There's court cases in California, actually, around whether or not you have the right to wear a certain type of clothing or protest something inside of a shopping mall. And so for me, corporate censorship today takes place largely online, although, of course, it still exists offline as well. It's the idea that these corporations, which are not elected, are not democratic, and yet they do put themselves forth as places for dialogue, public dialogue, free expression. They have so much control over what we can say and what we can do that I do feel that it's right to call it censorship.

[07:44] - Speaker 2
To give a couple of examples. One example is when a platform like Twitter utilizes AI to identify any instance of profanity. So this is something that happens quite often on Twitter. If I reply to someone and I use the F word, for example, it will catch it in their filter, and it will ask me, do you really want to say this? Now, am I being censored?

[08:07] - Speaker 2
No. I'm being prompted to check whether I really want to use that particular terminology. But I've noticed that that list is actually quite blunt. It's a very blunt object. And we've seen this throughout history, the history of the Internet specifically: when words that can have multiple meanings, or a word that is contained within another word, get caught up in a filter like that, it can have a chilling effect on speech. So that's one really good example. Another one, of course, would be when a company like Twitter chooses to take down the tweets of Donald Trump or, to give what I think is maybe even a better example.

[08:47] - Speaker 2
The President of Nigeria, who in turn turned around and blocked Twitter.com from his entire country.

[08:53] - Speaker 1
Wow. Talk about an ego. So you mentioned you lived in Morocco?

[08:59] - Speaker 2
Yes.

[09:00] - Speaker 1
Where censorship is much more of a problem than in the US. What was that like? And what sort of people were being censored?

[09:06] - Speaker 2
Yeah. So Morocco has a really interesting history around this, from before I lived there. So I moved to Morocco in about 2005, and the King who is currently in power took the throne in, I think, 1999, if I recall correctly. His father, Hassan II, was actually very strict in terms of what could or could not be said, and there was a lot of press censorship in his day around a wide variety of topics. Now, when the new King, Mohammed VI, came into power, he brought forth all of these different reforms to society.

[09:39] - Speaker 2
And one of the things that he said was that the press was going to be a lot more open. And yet Morocco still has three really bright lines of things that you can't talk about. You can't criticize Islam, you really shouldn't talk much about the Royal family, and you shouldn't talk about the occupation of the Western Sahara that's been ongoing since 1975. Now, if you're talking about these things on the street with your friends, you're probably fine. This is not a really hardcore dictatorship where you've got spies on every corner.

[10:10] - Speaker 2
But there is a significant amount of press censorship, and that goes for offline and online media. During the time that I lived there, and things have only gotten worse since, but back then, there were newspapers seized for reporting that the approval rating of some politician was only, like, 97% instead of 100%. There were things seized for being too lascivious, being too lewd, being too critical of the government. It's mostly journalists who are affected, but it's also activists, rappers, writers, singers, other artists. The list goes on.

[10:50] - Speaker 1
Wow. How do you feel about artists in the US and the EU? Do you think they are also at risk of this type of censorship?

[10:58] - Speaker 2
Yes, I do think that artists are at risk, and I think it depends on the topic that their art is about. So obviously, in the US and the EU, there is quite a bit of artistic freedom. There's a wide range of art, and even when it comes to things like sexual content and nudity in art, I think that when you go to galleries or museums in the US or Europe, you will see a wide range of things. But those same rules don't apply on the Internet. What we've seen specifically around nude expression and sexual expression is an increased crackdown over the past few years.

[11:35] - Speaker 2
It's hard to pinpoint where exactly it started, but I would say that the root cause of it does come from a particular sense of American morality. In my book, I call this the War on Sex, and I kind of draw it back to the early days of Internet porn. But it's really hard to say, and I think that when it comes down to it, the US has never quite been comfortable with these ideas, although they are allowed to exist in the elite art spaces, in the academy, et cetera. Online is for everyone.

[12:04] - Speaker 2
And so the rules that companies put forth, and at times bills that have been put forth, like SESTA/FOSTA a couple of years ago, the bill that was passed to protect people against sex trafficking but ended up actually leading to the censorship of a great deal of sexual content. Things like that have really resulted in a chilling effect for sexual and nude expression.

[12:31] - Speaker 1
Yes, and we're seeing that right now with OnlyFans, which is really a morality play put on by the credit card companies. What is your take on that story right now? I know you've written about it in the past.

[12:45] - Speaker 2
Yeah, I think there's a couple of things going on. OnlyFans itself, or the people who lead that company, do seem to want to provide a space for people to engage in sexual expression. They did roll back, after protests, the rule that they had put forth saying that they were going to cut off sexual expression from their platform. And so it does seem like the people behind it do care about their users. And yet at the same time, they're clearly under a great deal of pressure from Mastercard, and perhaps from other invisible forces, interest groups and whatnot, to restrict that content.

[13:19] - Speaker 2
And so that's really what it comes down to: when things can't be done within the framework of the law in the US, there are lobby groups that will ensure that they get done some other way. And these days it's very easy to go after a tech company and lobby them to do what you want them to do. Sometimes that's to the benefit of the public, when it comes to things like, say, moderating some of the worst of extremist speech. But oftentimes it's really to the detriment of a really large range of society.

[13:53] - Speaker 1
OnlyFans was written about quite a bit. But is there any issue or topic within this realm that you think more people need to know about, or that's not getting enough press?

[14:05] - Speaker 2
I think that it's not so much a topic but an angle of the story. And that is that while all of these fights are going on, a lot of them are tackling sexual content on mainstream platforms. So there's fights to get all sexual content and nudity off of Facebook, off of OnlyFans, off of Instagram, et cetera. And yet when we do that, what happens is that a young person goes online and they seek out sexual information or nude imagery or whatever it is, to see that they're normal or what have you, and what they encounter is what's left: hardcore pornography.

[14:45] - Speaker 2
And that, to me, is really concerning. Now, I personally don't have anything against porn, to each their own. But I think that the best case scenario for a young person going online is that they have access to a wide range of information, and that includes informative medical information about sexual health. That includes humanistic portrayals of the human body, and not airbrushed, casting-couch-style porn. I think it's really important. And so I really think that we have to think harder about the ways in which we seek to control this kind of content, how we want kids to access the Internet, what we want them to be able to see, and what the long-term impact is of pushing all of this stuff off of the mainstream platforms.

[15:34] - Speaker 1
Yeah. When I'm at a spa in Germany, families are walking around naked, and they're all different body types. It's not nearly as sexualized as it is in the US, where everyone has to cover themselves up. It's such a different approach to the human body, and one that would be great to see more of online.

[15:58] - Speaker 2
Yeah, exactly. And I think that that's the thing, too: all of these platforms, or all the major platforms, anyway, are based in the US, headquartered there, with American CEOs for the most part, and often from very particular backgrounds. And they're not thinking about the fact that other cultures may not see sex and nudity in the same way. So right now, there's this one really virulent strain of transphobia that I'm seeing, around women rightfully being concerned about men being in their spaces. And yet they're targeting trans women over this.

[16:34] - Speaker 1
Right.

[16:35] - Speaker 2
And on the one hand, I see where their concerns are coming from. On the other hand, I don't share them. And further, I would say that it's really interesting coming from the German perspective, where it's not a big deal for a child to be in the same space as the genitals of an opposite-gender person. And so I do think that there is a lot more kind of cultural conversation that needs to be had about this, and a lot more recognition that the values being put forth online, in the spaces that we all occupy, are a very specific set of, if I may, Silicon values.

[17:15] - Speaker 1
Nice plug. Thank you. So sexual content is just one complex issue in censorship. But you also write about censorship and politics, such as Facebook and Twitter deplatforming Donald Trump, for instance. What's your take on these decisions? Should we be glad that tech companies have stepped in to stop hate speech, or should we be concerned about the precedent that sets?

[17:41] - Speaker 2
I think that we need to be concerned about the precedent that it sets. I mean, look, I'm no fan of Donald Trump, and I think that some of the things that he was putting out there on these platforms, especially during the Capitol riots, were dangerous. He was essentially goading his followers to attack the Capitol building, and that's obviously not okay. But is Mark Zuckerberg the person that we want to make those decisions? What do we do in the event that we have a leader, an elected leader no less, who goes off the rails like that, for lack of a better term? These kinds of questions are really big ones.

[18:17] - Speaker 2
And I worry that we put way too much stock in tech companies to make these decisions for us.

[18:23] - Speaker 1
But doesn't a company like Twitter have the ability to say, we do not want calls for violence, we do not want harassment of our users, and the ability to remove those users, especially when those are laid out as very clear rules and guidelines?

[18:39] - Speaker 2
Yeah. So first off, that is what the law says, but with great power comes great responsibility. Now, first of all, you say very clear rules and guidelines. I would say that most of these platforms, historically and presently, do not have very clear rules and guidelines, and that's the first problem. So let's say that they do. Let's say tomorrow they all come together and put forth very clear rules that everyone can understand, that are translated into every possible language being used on these platforms. That would be an excellent first step, because right now, when I read the Facebook rules, I leave dizzy and confused.

[19:17] - Speaker 2
Now, I think the second thing, though, is that they need to acknowledge that they make mistakes. And I think these companies are starting to do that. But they still don't have particularly robust appeal systems, nor do they have particularly robust transparency around the kinds of content decisions they make and the mistakes that they make. And that includes publishing the numbers, as they do in their transparency reports, but it also includes being clear with the user directly about how that person messed up and what they can do to rectify the situation.

[19:49] - Speaker 2
That's the kind of thing that I would like to see happen.

[19:51] - Speaker 1
And we're really not there yet. Another example of a possibly bad precedent has been in the news lately: there's been a lot about the chaotic removal of US troops from Afghanistan, and this is something we touched on in our last episode as well. You've written about Facebook's choice to ban the Taliban from the social media platform and how it's actually had some unintended consequences. Can you tell us more about that?

[20:19] - Speaker 2
Yeah. So when it comes to groups the United States deems terrorist organizations, and the Taliban is one of them, there's a lot of complexity here. And I should say I'm not a lawyer, so I'm not going to try to get into the nitty-gritty details of that. But the short version of it is that the State Department and the Treasury Department both have their own lists that have some legal requirements around them. The Taliban qualifies under some of that. So Facebook made the decision to ban the Taliban entirely from their platform, as they have in the past with groups like Hamas and Hezbollah and Al Qaeda and ISIS, not to group them all as one.

[20:57] - Speaker 2
But all of those groups exist on these US lists. And so for me, there are a couple of big questions raised by this action. The first is, as a global society, why do we think that the US should have the final say in these matters? Now, of course, it's a legal issue. These companies do have to comply with the law. But I think that it is a bigger question than that. The US has not always been a great arbiter of who is or is not a terrorist.

[21:27] - Speaker 2
For one, they're not great at identifying and calling out domestic terrorists when they see them, particularly if those people are white. But second, the US has also historically politicized its terrorism watch lists. There's a famous example from around 2012 of the MEK, an Iranian group that was delisted after a concerted campaign to get it off the State Department list. So bringing it all back to the present day, when it comes to the Taliban, I think this raises some new questions that we don't have clear answers for in our existing frameworks.

[22:03] - Speaker 2
The biggest one for me is: what do you do when a group like that takes power over an entire nation, but at the same time, that group has no respect for women's rights and has a different version of human rights than much of the rest of the world? And then at the same time, this governing body also is using, or may need to use, the Internet and these platforms to reach its citizens. And so what is the right answer here?

[22:35] - Speaker 2
I'm not sure that any of us really know, and that's some of the stuff that I'm trying to tackle at the moment, in my mind.

[22:43] - Speaker 1
As someone who works at a privacy company, I think a lot about how technology has been both good and bad for privacy. What do you see as the greatest risks to privacy that technology has created?

[22:56] - Speaker 2
I think that the greatest risk to privacy created by technology is probably the ubiquity of cameras.

[23:04] - Speaker 2
Honestly, there's a lot of threats out there, but every time I go outside of Germany, I'm really struck by just how cameras are truly everywhere. Not just the cameras that we carry around in our pockets, of course, but the ones in every train station, on the outside of every shop. Occasionally I've even seen them inside of a bathroom. I think that it's the ubiquity of cameras that really strikes me as the greatest threat to privacy.

[23:30] - Speaker 1
Yeah. I've been reading all these articles about the club scene opening up in Berlin. And I know that when you walk into one of those spaces, they immediately give you stickers to cover all the cameras. So three: two on the back, another one or two on the front. Like they know how many stickers they need for each model of phone.

[23:54] - Speaker 2
Yeah. Germany is a really interesting case, right? And I think a lot of it comes down to the history here. This is a country that had, or half of the country anyway, or a portion of it under the GDR had, really ubiquitous surveillance. It had neighbors spying on neighbors, with the spy cams of the time. If you ever have a chance to check out the Stasi Museum, it's really worth looking at some of the tools that they used to spy on and then manipulate their citizens. And so I think that Germans are much warier than many other people in the world of the ubiquitous surveillance that we're seeing crop up everywhere else.

[24:32] - Speaker 2
And yet it is still happening here. Not in the clubs, fortunately. It's absolutely true, they cover the cameras, and I love that. It makes me feel safe. It makes me feel able to express myself. I feel safer without those cameras there. But here we still have things like a trial that's been going on at Südkreuz, one of the S-Bahn stations, where they're piloting some facial recognition technology. So I do worry that Germany may still go in that direction eventually if we don't fight to stop it.

[25:02] - Speaker 1
Staying on Germany: in 2017, they passed what's probably one of the world's toughest laws regulating hate speech online. It requires companies like Facebook and Twitter to delete illegal content within 24 hours of being notified, or risk fines of up to €50 million. For our American listeners, can you tell us first how Germany has decided to determine which online content is illegal and what gets taken down?

[25:30] - Speaker 2
Yes. So I can't get into the deep contours of this law personally. But what I can say is that Germany, because of the other piece of its history, the Holocaust and the Nazi era, has very, very strict laws around hate speech, as well as harassment and some other things. When it comes to hate speech, Germany has always taken it seriously. And before this law, you could, in fact, be fined for publicly, let's say, graffitiing a swastika, or taking up a speaker's corner and saying things that violated the laws against hate speech here.

[26:03] - Speaker 2
The way that this law has been applied online is, as you said, by creating a law that, instead of holding the individual accountable (although they still could be, potentially), holds the platforms accountable. I'm not a big fan of this law, the NetzDG, as they call it, for a couple of reasons. First of all, yes, I understand the anger at the platforms that exists from many governments, from many people, but trying to fine companies for something that is actually the responsibility of individuals seems really problematic to me.

[26:40] - Speaker 2
On the other hand, there's also the fact that I don't feel that Germany really thought long-term about what their creation of a law like this would mean for the rest of the world. And what we've seen since that law came about is more than a dozen other countries, including a variety of very authoritarian countries, coming up with their own iterations of this law that are even worse than the original. And so Germany has really created a monster with this one. And I don't think that this is the right way forward in terms of the goal that I assume we all share, which is to stop such awful extremist speech.

[27:22] - Speaker 1
What do you think would be a better path?

[27:25] - Speaker 2
Well, for me, I'm not ever really convinced that censorship is the answer. Now, I'm not saying that platforms shouldn't or can't restrict content. I accept that as reasonable. I accept it as an inevitability. I accept it even from a business perspective. But I don't feel that it's the solution. And I think all too often we treat censorship as a solution to the problem, rather than as the Band-Aid that it really is. Now, if we want to solve problems like hate, we have to tackle the root causes of hate.

[27:57] - Speaker 2
If we want to solve disinformation, we have to tackle the root causes of disinformation. And so we have to look back to the very beginnings, right? I'm going to shift and talk about disinformation for just a second. When I think about disinformation, I think about the stuff that I was taught in school. I was taught that Columbus discovered America. I was taught that the US was the victor of World War II, all of these things that are really not just subjective but, in some ways, just not true.

[28:26] - Speaker 2
And so the same goes for hate speech. Hate speech is a terrible thing. I don't support it. I'm not one of those free speech believers who will fight to the death for your right to say hateful things. Rather, I question which authority we trust to make these decisions. Even some of the European states that ban Holocaust denial don't ban denial of the Armenian genocide or other genocides. And so are we picking and choosing? Is that the way that we want our laws to be? Or do we want to focus on ensuring that these things never happen again?

[29:01] - Speaker 1
This argument makes sense, but what about content that promotes racism or misogyny? Do you really think that should be allowed to live online, where millions can access it?

[29:11] - Speaker 2
I'm not saying that I think it should be allowed. I mean, again, I do think that in an ideal world that stuff would be taken down. The problem is we don't live in an ideal world, and these platforms have shown time and time again that they're not interested in investing the resources that it takes to get that kind of content moderation right. And so, yes, take down the worst of the worst, take down the incitement. I have no problem with that at all. Neither does the law.

[29:36] - Speaker 2
Even in the United States, where free speech reigns supreme. The problem that I have is that the platforms over-focus on content that is not harmful, such as women's nipples, and under-focus on content that is truly harmful, such as incitement and harassment of marginalized and vulnerable communities. I'm unconvinced at this point, after studying this for more than a decade, that these platforms are capable of doing this right. And increasingly, as they turn to automation, to machine learning algorithms to do the job for us instead of putting humans at the table, I think it's going to go really wrong.

[30:15] - Speaker 2
And so I think we have to be prepared for that risk, and I think that we have to talk about new ways to tackle these issues. Again, I think it's okay if they take down the worst of the worst stuff, but I think that we have to rethink the system of content moderation from the ground up.

[30:28] - Speaker 1
So is there anyone that you think is doing content moderation well?

[30:33] - Speaker 2
Well, I don't think that there's any one, I have to say, to be really honest. I think that Twitter does a better job than many of its competitors. But Twitter also has a simpler job than many of its competitors, because the vast majority of the platform is text, and text is in some ways easier to deal with. Tweets are short. They're brief, you can't say a whole lot in them, and the way that they've created threads makes it very easy for them to take down an entire thread at a time.

[31:05] - Speaker 2
So they're doing a better job than some. I think that, on balance, their rules make a lot more sense than other platforms', but of course, they're not perfect. I think that platforms like Reddit are doing a really interesting job as well. I mean, they do have some centralized content moderation, but they turn a lot of the power over to their users and their subreddit moderators, who are volunteers. And so I think that that's a really interesting model. I mean, I think that this kind of experimentation is important, and I think that a system that requires having these low-wage workers working for third-party firms, the way that Facebook does it, is definitely not the answer.

[31:44] - Speaker 2
So I know what's not the answer. And I know some of the things that are part of the answer, but I'm not sure that any of us yet have come up with the answer.

[31:52] - Speaker 1
So whose responsibility is it?

[31:55] - Speaker 2
I think it's all of ours, really. I would like to see the most democratic process possible.

[32:04] - Speaker 2
And I'm not in the business of creating systems of governance. I've certainly contributed to a lot of these conversations, and will continue to do so. But I don't think that we should really trust anyone who says that they have all the answers. I think that we have to bring a wide range, a diverse range of people to the table, and I mean diversity in every aspect of the word, to figure out a way forward. And that's really what my book is about. I'll leave it on that note.

[32:33] - Speaker 2
If I were saying that I had all the answers, I would hope that people would be skeptical of me. And not to be too trite here, but I think it really does take a village, or even an entire globe, to tackle some of humanity's biggest problems. And when we look back at history and how we've dealt with this before on a global level, I think the Universal Declaration of Human Rights and the various charters that came after it are excellent examples of what it means when you have a diverse group of people come together to try to decide on something that people can really agree on.

[33:04] - Speaker 2
And so let's get creative. Let's think outside of the existing frameworks that we have and come up with something.

[33:11] - Speaker 1
I want to switch gears for a second and ask you a couple of questions that we like to ask all of our guests. And the first one is: what is a privacy technology that does not exist right now but should exist?

[33:25] - Speaker 2
Oh, that's a great question. I don't know, because I'm guessing almost anything I name already exists. I don't focus on privacy enough, so I worry that if I say something definitively here, somebody's going to be like, oh, that does exist, though. So let me just put out there one of the thoughts that I've had, and I'm sure that this exists in some way or another, but I would love to be able to block my face from being recognized by facial recognition technology. And I have heard of little things here and there, makeup and other experiments that have occurred, but I know that it's getting increasingly difficult to bypass that stuff.

[33:58] - Speaker 2
And so that's the kind of thing that I would really like to see more innovation around.

[34:02] - Speaker 1
Oh, it is increasingly difficult. If you're going to do that, then to avoid gait detection, you have to put a pebble in your shoe to change how you walk.

[34:11] - Speaker 2
Yes, I broke my toe earlier this year and had an Aircast, and I'm pretty sure I would have evaded it for a brief couple of months there.

[34:18] - Speaker 1
So you started off with part of your origin story, but I bet your desire for privacy went back much further. When did you first realize that privacy was important to you?

[34:31] - Speaker 2
That's a really good question, too. I'm not really sure, but I would guess it was probably sometime around my teenage years. I think as adolescents, there are a lot of things that we want to get into, whether it's reading books or watching films that our parents don't necessarily want us to watch, or maybe even more risque things than that. And I think that that was probably around the time where I realized how important privacy was and just what a privilege it is to have it.

[34:59] - Speaker 1
One of the questions we like to ask each of our guests is whether they agree with the notion that privacy is the new celebrity. What do you think?

[35:09] - Speaker 2
Yeah. When you first said the title, I was like, I'm not sure what that really means. But yeah, I think it's a provocative statement, right? The idea that privacy has become that important, that it's something that we're all focused on, that we're all following, that we're all thinking about. So yeah, that's what it means to me.

[35:28] - Speaker 1
Yeah. I think it's becoming more and more important every day. And as much value as I get out of platforms like Facebook, I'm constantly concerned about the amount of data that they're hoovering up.

[35:43] - Speaker 2
Yes.

[35:44] - Speaker 1
Do you see a link between privacy and the work you do regarding censorship?

[35:49] - Speaker 2
I do. I think that privacy and free expression are inextricably linked in a number of different ways. So I'll just focus on one for a moment. One of them is, unfortunately, the societal concept of shame. When we're trying to keep certain things private, sometimes that comes from other places, but sometimes it does come from shame or a feeling similar to it. And if we didn't have that shame, we might be freer to express ourselves. To me, that's one of the biggest links between privacy and free expression: what would it look like in a society that didn't have the concept of shame?

[36:26] - Speaker 2
If we weren't ashamed of our bodies, ashamed of our activities, ashamed of our predilections.

[36:32] - Speaker 1
We do a lot of work with artists, and we've had some artists on the show, and there's a lot of discussion about how privacy is part of the act of creation. You want to be able to go in a room and brainstorm and experiment and not have that exposed to the world. Taking a broad view, do you think technology has been a good thing for freedom of expression, or does it cause more harm than good?

[37:02] - Speaker 2
I think it's hard to say. On the one hand, technology has definitely been good for freedom of expression and access to information, which is another thing that is obviously inextricably linked with the concept. There's so many people around the world living in societies where the state or other actors are behaving as the censor, and so the Internet has opened up people's worlds. Even though I grew up in the US, where freedom of expression does by and large exist pretty widely, my world was still pretty small growing up. And so the Internet opened up a portal for me to places that I'd never even heard of, to ideas that had never crossed my mind, to all kinds of things, and to access different kinds of people.

[37:47] - Speaker 2
And so I do think that technology and the Internet in particular has been a really strong thing for freedom of expression. I don't want to say a net positive, because obviously, I do think that it has also brought about new ways of control and manipulation, and that worries me as well.

[38:05] - Speaker 1
Yeah, definitely. Control and manipulation of the populace is definitely something to be concerned about, and it's something that is enabled by technology. But technology also enables new forms of creativity and tools, and it connects people together. On the whole, are you optimistic that we're moving in the direction of more freedom of expression while also figuring out how to moderate, or do you feel a bit of despair about the direction we're moving?

[38:35] - Speaker 2
I would say that I've been feeling a bit of despair throughout the pandemic. In particular, I think that some of the things that occurred at the very beginning of the pandemic, when content moderation workers were sent home, were really alarming to me, because it did take away a lot of the resources of content moderation, and it made it much more difficult to get these things right. Now, over the past year and a half, obviously, a lot has happened, and a lot of information has come out that has advanced the dialogue around these issues.

[39:05] - Speaker 2
And so right now, I would say that I am optimistic. I mean, I'm optimistic for a number of reasons, not just because I think that the conversation and the reporting on these issues has really progressed, but also because of the large number of people and especially young people and especially people from the global south who are now focused on these topics and are trying to tackle them with fresh eyes. And so I'm optimistic that we'll be able to come to some kind of solution because of the fact that there are so many new minds working on this.

[39:40] - Speaker 1
All right, let's leave it there. Our guest has been Jillian York, activist, author, and the Director for International Freedom of Expression at the Electronic Frontier Foundation. Jillian, thank you so much for taking the time to join us on Privacy is the New Celebrity.

[39:57] - Speaker 2
Thank you for having me.

[40:01] - Speaker 1 
This has been Privacy is the New Celebrity. We'll be back again in a few weeks. For now, check out the complete archive of our episodes on mobilecoinradio.com. That's also where you can find our radio show every Wednesday. I'm Brady Forrest. Our producer is Sam Anderson, and our theme music was composed by David Westpal. Have a good week.