Privacy is the New Celebrity

Frances Haugen, Facebook's Most Outspoken Whistleblower, on How to Hold Big Tech Accountable - Ep 22

April 06, 2022

In this episode, Lucy Kind interviews Frances Haugen, a data scientist and privacy advocate who in 2021 famously disclosed internal Facebook documents exposing the company's prioritization of profit over the well-being of its users. Frances' disclosures to the SEC and Congress led to multiple investigative reports in major newspapers as well as congressional hearings aimed at holding the tech giant accountable for elevating divisive content, spreading disinformation, and causing other forms of harm. Lucy asks Frances how she made the difficult choice to go public as a whistleblower, and how that choice transformed her career. Frances gives an insider perspective on how Facebook ignored its own internal findings that it was causing harm, choosing instead to manipulate public opinion and sidestep accountability. Lucy asks Frances to share her perspective on privacy, and Frances tells us about exciting new projects she's working on, including grassroots organizing at colleges and universities and creating more resources for whistleblowers.

Speaker 2 (00:09)
Hello and welcome back to Privacy is the New Celebrity. I'm Lucy Kind, and I'll be your host today. Today on the show, we are so thrilled to welcome the one and only Frances Haugen. Frances is the whistleblower behind the 2021 leak of Facebook documents that resulted in a nine-part investigative series in The Wall Street Journal and coverage in several other major publications. The leak led to numerous congressional hearings and legal actions against Facebook, now Meta, forcing the company to confront its historic pattern of prioritizing profit over the well-being of its users. Now, half a year later, Frances continues to advocate for transparency and accountability at Facebook and social media at large. Frances, thanks for joining us on Privacy is the New Celebrity. So 2021 was a huge year for you. After leaking the Facebook documents, you testified before Congress and gave talks around the world. So I want to start by asking, how did you decide to become a whistleblower?

Speaker 1 (01:08)
Back in the fall, well, really the winter, it was December 2020, right after the election, Facebook made an announcement inside the company that they were dissolving Civic Integrity. Civic Integrity was the part of the company that I worked in. It was the part of the company charged with making sure that Facebook was a positive force in societies around the world. And you clearly got the sense that Facebook's perception was, we didn't have blood on the streets of the United States, therefore our mission is accomplished. It was a real shock to everyone who worked in that space. At that time I worked on counter-espionage, and before that I worked on civic misinformation. And it was at that point that I realized that Facebook didn't really have the will to look itself in the mirror in a hard and honest way and make the changes that it and the world needed. If we wanted to make Facebook a product that was safe enough to be used by billions of people around the world, we were going to need to get the public involved. And so I knew I needed to be careful.

Speaker 1 (02:23)
I'm a data scientist. I was working on catching people using our own products for espionage. And so it took me a long time to figure out exactly what I would do. But that was the moment where I had the realization that I was going to have to do something.

Speaker 2 (02:40)
Yes, well, we're so glad that you did, and that you started this whole process of looking into what's going on. Originally, you leaked these documents anonymously, but then you decided to disclose your identity on national television. That seems like a really difficult choice to make. What influenced your decision, and why did you ultimately decide to make your name known?

Speaker 1 (02:59)
I'm a strong advocate for privacy, because most whistleblowers deeply need privacy. I came forward because my lawyers were just really honest with me. They said, the scope of the things that you are trying to get action on is quite broad. If you continue to do briefings with the government and explain what is happening, every single time you have one of those briefings, the circle of trust expands. And the reality is that security through obscurity is only so safe. What they said to me was, you really need to make a choice: do you want to be completely private, or do you want to be completely public? Because otherwise you're going to get surprised one day, because the longer you are this shadowy whistleblower, the more people want to figure out your identity. And so it's safer just to come out and be open. I never planned on coming out. I intentionally worked quite closely with the journalist because I wanted to make sure someone could tell the story in a robust way. But it's been quite a whirlwind since then. Wow.

Speaker 2 (04:14)
Yeah. Was it scary to do that?

Speaker 1 (04:18)
Yeah. It's one of those things where people talk about the idea of whistleblowing being a plan B. Like, whistleblowing is never anyone's plan B. It's like plan E or F or H or I, right? Because it's one of those things where I was always very happy to be a behind-the-scenes person. I'm an algorithmic product manager. You don't work for years on systems that no one even acknowledges exist if you're not okay being behind the scenes. Having to deal with the idea of being out in the open was a real struggle. And I feel incredibly fortunate that I got to live with my mother, who is an Episcopal priest, during Covid. I moved there in March of 2020, and it gave me an opportunity to work through all those issues while I was still in the phase where I was terrified by what I was facing. So, yeah, it's the thing that has actually changed my life less than I expected it would, which has been a blessing. But before it happened, when I was still living in that period of uncertainty, it was quite stressful.

Speaker 2 (05:29)
Well, kudos to you for going from private tech worker to full transparency and internationally known whistleblower. You said that you started working at Facebook originally to create a, quote unquote, better, less toxic Facebook. But ultimately you did become a whistleblower, which implies that at some point you decided that improving Facebook from within was just not possible. Can you tell us how you came to that decision?

Speaker 1 (05:56)
I'm a strong proponent of the idea that we have to always look at the systems that cause problems, because if we treat just the symptoms as the problem, we may end up repeating situations that are bad for our society and for our world over and over again, because we're not really going after the right thing. During my time at Facebook, I came to the realization that Facebook was facing incentives externally and incentives internally, some of them from its own unintentional design, that made it functionally impossible, in my mind, that they were going to be able to fix the problems they're facing on their own. Facebook has always fetishized this idea that if you just pick the right metrics, you can let people run wild and free. You don't need to have a human in the loop on decisions. And unfortunately, the metrics themselves became a problem. That's what happened when Facebook switched over to prioritizing the ability of content to elicit a reaction from you. When that happened, it ended up creating problems that a system that took humans out of the loop wasn't well poised to fix. And I learned this really wonderful saying from my last manager at Facebook, which was, we don't solve problems alone.

Speaker 1 (07:20)
We solve problems together. And I think with a lot of the things that Facebook's been facing, they've been trying to solve problems alone when they need to be solving them in a larger sphere. They need help. We need to solve these problems together.

Speaker 2 (07:33)
Yeah, I like that. So tell us about some of the biggest problems that emerge from the Facebook documents.

Speaker 1 (07:39)
So last night, I gave testimony to the Australian Parliament, and one of the questions that got raised was about whether or not Facebook fact-checking certain stories was a threat to our democracy. My ears always perk up when we get to talk about what the problems with Facebook really are. I've said over and over again, for at this point like five months, the problems with Facebook are not about bad people or bad ideas. They are about systems of amplification that give the most reach to the most extreme ideas. Facebook knows, and has known since at least 2018, that changes it made to its algorithms to maximize profit have a side effect of giving the most reach to the most extreme ideas. This plays out in places like political ads. An ad that elicits more reactions is judged to be higher quality by Facebook, and thus it's cheaper to run. And one of the consequences of that system is that hateful, angry, divisive, extreme ads are five to ten times cheaper to run than empathetic, compassionate, common-ground-building ads, because those ads don't really elicit a reaction. The fastest path to a click is anger.

Speaker 1 (09:10)
When I look at the fundamental issue here, yes, that is a scary issue, and there are other scary issues. Things like cartels and terrorists are using Facebook to expand and run their operations, or Facebook's AI systems catch single-digit percentages of the content they claim they're protecting us from. But the real problem, over and over again, is this question about trade-offs, right? When Facebook faces these problems, like its system maximizing hateful ads, or how much it should spend or invest in taking down human trafficking networks, the only metrics it reports publicly are profit and loss numbers. So when Facebook faces those trade-offs, it doesn't come down hard enough on the side of the public good. And as a result, their systems are running wild. And so the thing I've been advocating for is that we need radically more transparency. If Facebook had to report more numbers than just what its expenses were, how much profit it made, and how many users were on the system, they would be able to optimize for long-term success and not just short-term success.

Speaker 2 (10:24)
Wow. Yeah. Crazy that when you're trying to maximize profit, that's what happens. Going back to some of the violations that you exposed in the Facebook documents, there are a couple that really caught my eye. There's a story about mental health and body image harm to teens on Facebook and Instagram. Can you talk a little bit about that, and how we can ensure that those in a position of social power are behaving morally?

Speaker 1 (10:54)
Totally. So one of the things that happens with that engagement-based ranking is that you can start with a moderate interest. In the Facebook documents, they give the example that you can take a brand new Instagram account, with no friends and no interests, and do some searches for healthy eating, healthy recipes. Because, let's be honest, we could all eat healthier than we do today. And just by clicking on the content that Facebook provides you, within a couple of weeks you begin getting shown content related to eating disorders. Facebook likes to say it takes two to tango. Literally, their head of public policy or comms wrote an article last March saying, don't blame us for these algorithms. It takes two to tango: you chose your friends, you chose your interests. But the reality is that you can take a blank account that has no friends and no interests, just click on the content that Facebook gives you, and get led from center-left ideas to far left, or center right to far right, from healthy eating to eating disorders. If the algorithms are working that way, that's not really two people tangoing.

Speaker 1 (12:10)
Right. That's the algorithm giving the most reach to, and pushing people towards, the most extreme content.

Speaker 2 (12:18)
Yeah, that's wild. And what's even more wild is, given all this evidence being publicized, Facebook still brought back the number of likes being shown on each Instagram post. What do you think is the rationale for that?

Speaker 1 (12:33)
So that's actually a really interesting topic for me. One of the things that has been brought forward as a solution for things like teen bullying and mental health issues is taking likes off of Instagram. One of the most important documents that came out of my disclosures showed that Facebook has been running experiments around taking likes off of Instagram for years, I think since 2018 as well. And it's interesting: taking likes off of Instagram doesn't have an impact on social comparison. As long as you leave comments on, teenagers can still see that Sally is prettier than them and gets way more comments. Teenagers aren't dumb. But the part that I find really important is that even though Facebook had this data, their internal recommendation was, we should still launch it. We should push this out there because the media, government, academics, they all love this idea, so we should do it. And it illustrates how, when they hold all the cards, when they have the information and we can't confirm what they say, they're willing to lie. And even last summer, Facebook rolled out the option so you could choose, do I want to turn likes off?

Speaker 1 (13:52)
Even though they had this data, when the United Kingdom recently passed a law giving children more rights online, Facebook rolled out a package of changes and said, look, we take children seriously here, and this was one of them. I just feel like it's a lack of good faith, right? There are solutions we could be pursuing, and Facebook is unwilling to adopt any of those solutions if they cost them even slivers of profit.

Speaker 2 (14:23)
Well, thank you for sharing that with us all and bringing it into the public sphere. One of the Facebook violations that wasn't picked up as much by the media here in the US is on the topic of human trafficking.

Speaker 1 (14:34)
Oh, yeah.

Speaker 2 (14:34)
Can you tell our audience about what you found?

Speaker 1 (14:37)
So most people are unaware of this, but in, I believe, 2019, the BBC did a thorough investigation around human trafficking on the Facebook platform and found that there were many examples of women being trafficked into the Middle East. These are people coming from Bangladesh as maids, or from Nigeria as house cleaners. And these were for outfits where it was known that they were taking people's passports and basically holding them hostage, or withholding their pay and not giving it to them. And there had been reports to Facebook from people who had been trafficked by these pages, saying, these people are traffickers, and nothing had been done. And after the BBC reported on Facebook Meta's behavior, Apple stepped in and very quietly said, hey, Facebook, turns out human trafficking is a violation of the terms of our App Store. You need to get your house in order or we have to take you out of the App Store. And Facebook did an internal fire drill and took down a certain number of these pages, a couple of hundred. But they never disclosed to the public or their investors that this had happened, which is against the law.

Speaker 1 (15:56)
Like, you can't withhold information that relevant from investors, which is part of why we filed with the SEC.

Speaker 2 (16:03)
Oh, my gosh, it's so crazy when you find out about all the things that are in the shadows. So it's been about six months since those leaks first dropped. Aside from their name, do you think anything has changed at the company?

Speaker 1 (16:17)
It's interesting. We do know some things have changed, and I don't know if we can say for sure that they've changed for the better. Facebook has made public commitments to invest in 10,000 new software engineers to build video games, i.e., the Metaverse. And they've tried to shift the dialogue. They don't want to have to do the cleanup work on the mess they've made. They've made billions and billions, tens of billions of dollars, off of building this global platform that puts people's lives at risk. And instead of staying and having a conversation about what their role is in inflaming communal violence and genocide in some of the most fragile places in the world, Facebook wants to go build video games. And I'm super scared about this, because the New York Times did an article about how, inside of Facebook, engineers are sitting down for career planning conversations with their managers, and their managers are suggesting they consider taking jobs in the virtual reality parts of the company, the augmented reality, virtual reality parts. And the reason why this is so scary to me is that anyone who's ever worked in software knows engineering managers are rewarded for the impact that their teams drive.

Speaker 1 (17:40)
If your manager is actively saying, please leave my team, they must be getting huge bonuses for every one of their engineers that transfers over, right? Facebook is actively hollowing out the parts of their company that focus on social media to try to shift gears to the Metaverse. The problem is that the harms that were outlined in my disclosures have not stopped, right? Teenagers are still hurting themselves. You're still seeing problems with human trafficking, with cartels harassing the villages near them, recruiting members, intimidating people not to cross them, with terrorist recruiting happening. These problems are still going on. And Facebook just wants to change the conversation. They think they can change their name to Meta and leave it all behind. And we can't let them do that. We deserve better. People feel frustrated right now. People feel really angry. They feel really frustrated at Facebook. Why do people feel so frustrated? Why do they feel angry? The reality is Facebook has been lying to us. Facebook has been lying to us for years. One of the things that has been the most heartening for me is I keep being reached out to by activists who say, thank you.

Speaker 1 (18:55)
We've been raising these issues for five years, for ten years. And Facebook told us to our faces they weren't real. They said, this human trafficking thing, it's anecdotal. Yeah, you brought us a few examples, but it's not widespread. But internally they had documentation that the problem was huge. I think it's really exciting to think about the idea of what if we had a different relationship with Facebook? What if we came in and unpacked how we would get there? If we were going to have a different relationship, what would it look like? And I think the core thing at the bottom of all the problems and the disclosures is the same problem over and over again, which is that Facebook knows that they can say whatever they want and no one's going to catch them. They can make any claim, and because no one else can view any data about how their systems work, they can get away with it. And if you go and look through their transparency center, you really see this. Last year, they put out this thing called the most viewed content report. When I read this for the first time, I was like, oh, my God, I almost want to make a YouTube video that treats this as a work of art and not as a statistical thing, because we could discuss, what are you trying to say with this versus what are you actually saying with this?

Speaker 1 (20:25)
Because, for example, their most viewed content report showed 20 pieces of content, all of which had been viewed by, I don't know, 40 million plus people, and they only showed it for the United States. There are a number of problems with this. One, Facebook is overwhelmingly investing its misinformation budget in the United States. So if they had done the exact same report for almost any other country in the world, it would have been substantially more shocking. Every piece of content in the United States that gets viewed 10 million times has had a fact checker's eyes over it. But in most places in the world, or even worse, in the most fragile places in the world, the content that gets the most distribution usually has graphic violence in it. It has a much, much higher rate of objectionable content. And Facebook had a pie chart at the top of this report with a tiny skinny slice that was 1% and a big giant slice that was 99%. And it said, look, these 20 posts, they're only 1% of all the views in the United States. And the part that I found so misleading about this is that if they had shown us the top 100 pieces of content, it would have been two and a half or 3% of all the views in the United States.

Speaker 1 (21:42)
If they showed us the top 1,000 pieces of content, it'd probably be, I don't know, maybe 4% or 5% of the views in the United States. And very rapidly, we would have been shocked by how much of our feeds is dominated by such a small number of pieces of content. And so it's one of those things where, imagine a world where instead of getting to consume whatever little scraps of information Facebook lets us see, we actually got to see enough that we could have opinions about how Facebook operated. Facebook doesn't want that. They don't want to have to involve us in the process. And so the thing I'm excited about for 2022 is I want to do a lot of organizing. I want to help organize young adults to resist Facebook, because the things that we need are things that I think everyone could get on board with. Transparency is really important. I think part of why Apple doesn't have whistleblowers the way Facebook does is because Apple knows that it can be held accountable for almost every step in its production process. You can take an Apple phone apart and say, oh, the chip they said is in there is actually in there.

Speaker 1 (22:54)
Or you can run a speed test on it and do independent computations and verify that the chip performs the way they say it does. But in Facebook's case, they've had too much temptation. They've been able to grade their own homework for so long that they rely on it now. And unfortunately, that means their underinvestment in really basic safety systems has played out in ways that harm us all.

Speaker 2 (23:19)
Yeah, totally. I mean, I love the direction that all this is going in. Is there any organizing activity happening right now that you've highlighted or are involved with? Like, who's doing that work?

Speaker 1 (23:31)
So we're starting a college tour over the next couple of months. We're going to start meeting with college students and doing some need-finding exercises around what resources they would need to begin doing more on-campus organizing. We are working with NGOs that are excited about the accountability space, and working with people who work with whistleblowers. It's going to be an exciting year.

Speaker 2 (23:59)
Amazing. Yeah. It's definitely one of those things where when we educate the next generation, that's how we get the ripple effects of change in culture. So really exciting to hear about all that work you all are doing. So today Facebook is called Meta, and they're shifting focus to virtual reality, as you mentioned. What do you make of this shift? And do you have any concerns about what this new development means for privacy?

Speaker 1 (24:22)
So Facebook is making this big shift towards video games. Mark Zuckerberg is currently envisioning this future where we will spend all day with a device strapped to our faces, where instead of hanging out in person with people or seeing people's faces, we'll interact with avatars of people in these virtual conference rooms. And part of what makes me so unnerved about this vision is that it's not just a recreational vision. It's not just about people playing Fruit Ninja in their evenings. He wants employers to shift their work lives onto the Metaverse platform. And the reason I think it's so problematic is, imagine your employer decides they want to become a Metaverse company. It's not just that now you have to sit and be a cartoon character every day. You have to put Facebook's sensors into your home. You can choose as an individual today not to use Facebook or Instagram, not to install it on your phone. You can choose not to let Facebook have that access to your personal information. But as soon as employers start saying, oh, we're a Metaverse company, you end up in a situation where you have to choose between your livelihood and having Facebook's microphones, their bioinformatic sensors, things that do movement tracking, in your home.

Speaker 1 (25:53)
I think it's one of those things where we really need to think long and hard about, given Facebook's track record of misrepresenting what they do, of not involving the public in basic safety trade-offs, do we really want to let them hold people economically hostage to accept having these platforms in their homes? I just think it's a discussion we should have before we barrel all in on it.

Speaker 2 (26:20)
So as you say, with Meta, Facebook is building its own hardware. How do you think that will be an opportunity to expand their data collection?

Speaker 1 (26:31)
One of the things that's different between using a virtual reality system and, say, just interacting with Slack or email is the amount of information that you pass over the Internet. When you use email, it's substantially less than the information you potentially pass over when you have microphones in your home. And you're going to have to have Facebook's microphones. Right now, you basically say, do I trust Apple to have a microphone on my laptop? Do I trust Dell? Do I trust Compaq? But those companies don't have long track records of choosing the wrong thing, over and over again, when they encounter a conflict between the public good and privacy, or the public good and safety. Facebook has a record of choosing the wrong thing over and over again. And we need to think about whether we want to be vulnerable to a corporation that has demonstrated it doesn't prioritize our well-being to the same extent. Yeah.

Speaker 2 (27:34)
So speaking of this track record, we all make decisions, and we all place trust in people based on history. How do you suggest we build trust with Facebook going forward?

Speaker 1 (27:46)
That's a great question. I think one of the key things we need to do in order to rebuild trust with Facebook is to have ways for us, the public, to surface questions or concerns, and for Facebook to be required to give us information that demonstrates its progress on those issues. Right now, at any given point in time, there are large numbers of academics trying to get sometimes even very basic information out of Facebook. And Facebook right now doesn't grant those requests, or, like last year, they will grant a request and then give intentionally misleading information. The way we will get to a place where we believe Facebook is when Facebook is transparent enough. And there are ways of being transparent that are privacy-sensitive. So imagine that instead of getting raw data, and I know there are lots of researchers out there that really want raw data, you get aggregate data. It's very, very difficult to fake data if you have enough different kinds of aggregate data, because if you try to lie, some other part of your data set will likely catch you in that lie. Facebook needs to give up some of that control over saying what we are and aren't allowed to look at, because only by doing that will we be able to begin to believe that we understand what we're dealing with.

Speaker 1 (29:10)
And while we don't feel confident that we know what we're dealing with, we will continue running after ghosts. It makes sense that there are all these conspiracy theories about how Facebook operates, because over and over again, people find out that the suspicions they had ended up being true. And unfortunately, when you're in a loop like that, you begin to have wilder and wilder thoughts, because there's no correction loop. So if we can begin to have a process where there's a systematic way, every six months or every year, to raise concerns with Facebook about potential harms, and guess what, there are going to be lots of new potential harms coming out of the Metaverse, and then be able to say, hey, we need at least this much data to know whether or not you're actually making progress towards that goal. Right? We deserve to have that level of transparency.

Speaker 2 (30:00)
Yeah, totally. And speaking of having that data, I know you've stressed the need for legislators to act on ensuring social justice and avoiding harm with big tech, such as urging senators to overhaul Section 230. How might that change this behavior?

Speaker 1 (30:16)
So right now, Facebook has been using Section 230 in ways that I think no one ever intended it to be used. There have been conversations around whether or not intentional product choices from Facebook should be covered by it. For example, back in 2016, Facebook expanded from using just a like button, where you could thumbs-up something or not, to using a full palette of reactions. When Facebook made that change, they did it because they felt that people weren't reacting often enough to angry or sad topics. By making that change, they implicitly said, we want to show more angry and sad topics. So when you see things like an ever-growing number of angry stories, it's not just a question of problems with bias in the AI; there are also intentional product choices Facebook has made that have led us in this direction. Facebook has been brought to court over choices like this. It's not about individual pieces of content, it's about choices. And Facebook has literally said to judges with a straight face, hey, we don't need to give you data about any of these things, because Section 230 gives us immunity on everything.

Speaker 1 (31:39)
I just don't think when we originally wrote Section 230 that was our intention. Our intention was to make it possible for businesses to host content from people. But I don't believe that Facebook should have blanket immunity when they are making choices about their algorithms, about the design of their products in general, things like mega-sized groups that don't even have self-moderation. Should Facebook have full immunity for those choices, for things they have 100% control over? I don't think we can continue to operate in an environment where they can do whatever they want and then just wave Section 230 around to protect themselves.

Speaker 2 (32:16)
So I want to switch gears for a second and ask you a couple of questions that we like to ask all of our guests. The first one is, what is a privacy technology that doesn't exist right now but should exist?

Speaker 1 (32:28)
I think most people don't understand the size of their information footprints. There should be much more accessible visualizations or other ways for people to understand what information they actually are leaking. Right now, Apple does this a little bit: they'll give you warnings about what information a single app discloses, and I think that's a good step in that direction. But if most people were aware of just the sheer amount of information they are often sharing with people they probably wouldn't consciously share it with, I think they would make very different choices.

Speaker 2 (33:06)
Are there any developments in terms of privacy-enhancing technology or regulations that you're excited about right now?

Speaker 1 (33:13)
I think the fact that we're having conversations about what information can be used in targeted ads is really constructive. There have been piecemeal approaches to this, where people will come in and pass a law saying, oh, you can't use my political affinity, or you can't use certain characteristics about me, to target ads at me. But the reality is this is a really complicated topic. I don't know if I come down all the way on the side of saying you shouldn't have targeted ads. I think it's a very complicated question. But I do think it's great that we're having conversations, especially public conversations, about what the role of using our data is, like, how much do people actually consent to the use of that data? And so I think that is a great leap forward that we're making right now, even though I don't know where I come down in the end.

Speaker 2 (34:02)
Yeah, I think that word you mentioned, consent, is key, at least for me. Do you know how your data is being used, like you mentioned before? Do you know how big your digital footprint is, or who is seeing it right now as you're stepping through the Internet? So I'm super happy to be having these conversations as well. And from my point of view, agency and consent are at the heart of it all.

Speaker 1 (34:25)
Oh, totally, 100%. Yeah. I think there's a real need to do more privacy education earlier. I have a friend who has a nonprofit that goes and educates people on the importance of owning your data. People are routinely shocked by how their data is used and how much of it exists. And I totally agree with you, Lucy, that people have a right to make decisions about that. And I don't think we've invested enough in helping people understand the consequences, or even be aware that they might be in harm's way.

Speaker 2 (35:00)
Well, I know last year you invested your time and energy in finishing your first tour of Europe. How did that go? And how do you think the privacy climate there compares to that of the US?

Speaker 1 (35:11)
Oh, it's so interesting. I learned a lot last year about different philosophies on writing laws. The United States is much more oriented towards rules-based laws, and Europe is much more oriented towards principles-based laws, based on the idea that companies are very good at precisely dancing around things. If you say thou shalt not do X, they'll figure out X-prime, which is very close to X but is not technically X. Europe has been willing to take a more definite stand with things like GDPR, but they've also encountered challenges. They've struggled a lot with actually enforcing GDPR and being able to respond to violations that have arisen under it.

Speaker 2 (36:01)
Yeah, I love it. Avoiding the whole land-of-loopholes situation and really getting down to brass tacks with principles. So overall, are you optimistic about our future with privacy? I know there are a lot of changes happening with Meta and with digital privacy in general. What are your feelings about what's to come?

Speaker 1 (36:22)
I've said many times before, I don't think Facebook will change until its incentives change. And so we need to think about how we want to change our relationship with Facebook. If we care about our privacy and our ability to make choices about consenting to use Facebook's products, we need to really step up and say, hey, it's not okay to mandate that employees have to use privacy-violating technologies in their homes, because the reality is many of us now work from our homes, and it's going to be that way for a long time. If we don't have those kinds of employment protections, people will be forced to get on Facebook's Metaverse bandwagon whether they want to or not. Apple has made some amazing steps forward. They've shown that you can have a business that is successful with privacy as a feature. I think there is a growing awareness in younger generations that this is a desirable thing. And I think when we look at this on a long enough time arc, I'm very optimistic about it. It will be interesting to see how things evolve over the next few years.

Speaker 2 (37:25)
Yeah. Looking forward to seeing how it goes, and really thankful to have people like you and other activists pioneering the way. We've been speaking with Frances Haugen, data scientist, Facebook whistleblower, and privacy advocate. Frances, I'm so glad you took the time to join us on Privacy is the New Celebrity.

Speaker 1 (37:51)
Thank you for having me.

Speaker 2 (37:54)
That's it for now. Don't forget to subscribe to Privacy is the New Celebrity wherever you listen to podcasts, and check out mobilecoinradio.com to listen to the full archive of podcast episodes and tune into our radio show every Wednesday at 6:00 p.m. Pacific time. I'm Lucy Kind. Our producer is Sam Anderson, and our theme music was composed by David Westbaum. And as we like to say at MobileCoin, privacy is a choice we deserve. Bye.