In episode 16, Lucy Kind interviews Eva Galperin, the director of cybersecurity at the Electronic Frontier Foundation, an international non-profit digital rights group that promotes online civil liberties. Eva leads EFF's Threat Lab, where one of her focus areas is eradicating "stalkerware" - spyware used for domestic abuse. Eva explains how private companies profit by selling malware that allows abusers to track everything a victim does on their device. She breaks down how these virtual risks lead to real-world harms, and why it's so important to design technology for the most marginalized members of society.
Speaker 2 (00:01)
Hello, and welcome back to Privacy is the New Celebrity. I'm Lucy Kind, and I'll be your host today. As for today's guest, I'm excited to welcome Eva Galperin. Eva is the director of cybersecurity at the Electronic Frontier Foundation, an international nonprofit digital rights group that promotes online civil liberties. At EFF, she leads the Threat Lab, where one of her focus areas is eradicating stalkerware, which is spyware used for domestic abuse. Eva, thanks so much for joining us on Privacy is the New Celebrity.
Speaker 1 (00:34)
Thanks for having me.
Speaker 2 (00:36)
So tell us about the Electronic Frontier Foundation. What sort of work does your organization do?
Speaker 1 (00:43)
Well, the Electronic Frontier Foundation is a digital civil liberties organization. So our mission is to make sure that when you go online, your rights come with you. We've been around since 1990, and we do sort of a combination of three things. We do impact litigation. So we have a bunch of lawyers on staff, and we do the sort of work that most people associate with the ACLU. So we file lawsuits that we think are going to result in better rules for the Internet. We also do activism. So get out on the streets, go call your congressman, go send a letter to the editor when something especially stupid happens on the Internet, or when governments or law enforcement are abusing their powers, or when somebody proposes a rule that we think is really going to do great harm to civil liberties online. And then, last of all, we have a sort of department of engineers, and we work on engineering projects. For example, you may be familiar with Let's Encrypt, which gives everybody the ability to create SSL certificates with just a couple of clicks, for free, thus helping to encrypt the Web. We also work on a whole bunch of other projects.
Speaker 1 (02:11)
And my particular team has done some tracking of APTs, or advanced persistent threats. And we're also involved in a really interesting lawsuit that we have just filed on behalf of a Saudi activist whose phone was being spied on by a company called DarkMatter, located in the UAE.
Speaker 2 (02:30)
Wow. Yeah. A wide scope, and very impactful work all around. On that last point, you've been known for fighting against stalkerware. Can you explain what stalkerware is and why it's so dangerous, so our listeners can understand?
Speaker 1 (02:44)
Sure. So stalkerware is an entire class of commercial software, available to everyone, which is designed to be covertly installed on somebody else's device in order to spy on what they're doing on that device. So spy on their phone calls, their keystrokes, their passwords, their photos, their end-to-end encrypted messages, their banking. All of that gets exfiltrated out to a third party, to a server which is usually run by the company that sells the stalkerware. And this is tremendously invasive. And it is really part of a whole constellation of behaviors that make up tech-enabled abuse.
Speaker 2 (03:40)
Oh, yeah. There's a lot at risk there for victims of this type of malware. Can you tell us about the work you've done directly with the victims of this technology?
Speaker 1 (03:50)
Sure. So since about 2018, I've been working directly with survivors of tech-enabled abuse. They come to me with their devices and with their stories. And sometimes, when I am able to, I help them to sort of lock things down or to find out where their information is leaking from. I did so much of this work that I decided that simply taking these problems on one at a time was not enough. And so I helped to found an organization called the Coalition Against Stalkerware, and we work to sort of punch above our weight. We work to increase understanding of stalkerware and detection of stalkerware by AV products. And we also work with Google and Apple and other tech companies to detect this kind of stalkerware when it comes up in their stores, since it's already in violation of their terms of service, so that they can take it down.
Speaker 2 (04:57)
Well, yeah. That's so great to have scalable solutions for these sorts of problems. Can you give us an example of the sort of real world harm that this can cause?
Speaker 1 (05:06)
Well, I am very regularly approached by people who either suspect that they have stalkerware installed on their phone or on a device which is in their home. That is a thing which happens a lot. Most recently, I spoke to a survivor of tech-enabled abuse who is in the middle of a divorce and therefore sharing custody of her two children with her abuser. The abuser put stalkerware on an iPad belonging to one of the kids and used it to track her location and to show up in places where she was going to be that she had not told him about.
Speaker 2 (05:58)
Oh, wow. Just like loss of agency and loss of privacy all around. Crazy.
Speaker 1 (06:02)
Oh, she was terrified.
Speaker 2 (06:03)
Oh, man. Yeah. Can you give us a sense of the scale of this problem? Like, how common is it to be infected with stalkerware?
Speaker 1 (06:11)
Well, we don't know. What I can tell you is that when we have found examples of stalkerware apps in stores, they get downloaded hundreds of thousands of times in a very short period of time. Not every single one of these downloads translates to an active infection or active stalking, but certainly I think we can safely assume that some of them do. Furthermore, when companies have carried out research into people's attitudes around spying on their partner's phone, we've discovered that about 10% of the people who were being surveyed will just freely admit to using stalkerware in order to spy on their partner's phone if they feel sufficiently justified in doing so.
Speaker 2 (07:08)
That's a shockingly high number.
Speaker 1 (07:10)
Well, the best parallel is sexual assault, in that there are actually many studies which show that rapists will essentially admit to raping people left and right if you simply ask them, have you ever had sex with a partner without their consent, or while they were asleep, or while they were inebriated? And as long as you don't use the word rape, they'll be like, oh yeah, I totally did that. And that is something that we see in surveys asking people about stalkerware as well. If you don't call it stalkerware and you just ask, have you installed something on or covertly monitored your partner's phone, often they will just be like, well, yeah, of course. This is totally normal and justified.
Speaker 2 (08:02)
Wow. You just change the language, and all of a sudden people's true colors come out. And these apps that enable potential stalking, are they legal?
Speaker 1 (08:13)
It depends. The apps themselves are not legal if it is very clear that they are being sold primarily for the purpose of breaking the law. And that's one of the reasons why a lot of these companies have changed the way in which they advertise their services. They used to basically say, catch your cheating spouse, and now they say things like, keep an eye on your kids or your employees. In that way, they skirt the law, while also making it very clear that their product is capable of, and is designed to, be covertly installed on someone else's device and to tell you what is happening on that device. Furthermore, if you are a person who has bought the stalkerware and you install it on somebody else's device and you use it to listen to their phone calls or log their keystrokes, and if you use that knowledge to log into one of their accounts, you are definitely violating the law. If you are listening in on the phone calls, you are violating wiretapping statutes. If you are using those passwords in order to log into their accounts, you're violating the CFAA.
Speaker 1 (09:36)
If you are tracking their location and then showing up where they are, you are violating stalking statutes wherever you are. So not only is the use of these things illegal, and in fact even installing them, I think, is a violation of the CFAA, but much of the time, the people who are using these tools simply don't face any consequences. And I'm trying to change that.
Speaker 2 (10:07)
So what are Google and Apple and the other tech companies doing about this?
Speaker 1 (10:12)
Well, both Google and Apple have banned stalkerware from their stores. Apps which meet the definition that I have just given you are against the terms of service in their stores and should be taken down, or should not be able to get in at all. Having said that, neither Google nor Apple does a particularly great job of policing their stores, so you can find them there. And even in the cases where you cannot find them there, frequently what these apps will do is simply use SEO, so that if somebody searches for how do I spy on my partner's iPhone, they will be taken to a website, and then you can download the APK directly off of the website onto the device. And so they don't even need the stores in order to make this happen.
Speaker 2 (11:06)
Well, I'm glad that you are raising awareness about this, and you first started raising awareness about stalkerware a few years back. Has there been any progress toward eliminating this technology since then?
Speaker 1 (11:17)
Yes and no. It's actually a little bit hard to tell, because dual-use apps often get lumped in with the strictly defined stalkerware apps. And so it's kind of hard to track exactly how many infections we're looking at, because everybody sort of counts them differently. And that's one of the reasons why I've been working with the AV companies. They have done a much better job of detection since 2018. We are absolutely certain that detection has improved, but what we are not certain of is the exact numbers of stalkerware and dual-use apps that we're seeing out there. And part of that is just that there is not a consistent use of the various definitions when they're tracking these apps. So the different AV companies and different security companies will report that they have found this much stalkerware, or there have been this many detections. But because they all use slightly different definitions for what they're looking for and what they're detecting, it's like comparing apples to oranges. On the surface, it looks like a lot is getting done, but if you look a little bit deeper, it's that sort of usual problem that you have with academia and research, which is that people are simply not using the same definitions.
Speaker 1 (12:40)
And when they don't use the same definitions, then we can't draw any broader conclusions about what's happening out there in the world. What we do know is that starting with the first COVID lockdowns in 2020, we saw an increase in calls to domestic abuse hotlines and generally an increase in domestic abuse. And that includes tech enabled abuse. Nearly every person who comes to a shelter or who reaches out to a domestic abuse hotline reports that they have also been a victim of tech enabled abuse. And stalkerware is sort of a small subset of that. So we know that this is a problem, and we know that it's a problem that's gotten worse since everybody has been sort of locked indoors together for two years. And survivors of abuse are frequently either locked in with their abusers or locked away from their abusers. And so their abusers are particularly interested in seeing what is on their devices so that they can maintain control.
Speaker 2 (13:50)
Wow. I'm so glad that we're having this open conversation about tech-enabled abuse. I just feel like, since this whole technology expansion has happened so quickly, it kind of slips under the radar. We don't really think of it as a class of abuse. So it's so great to be able to just, like, surface this as a topic and as an issue. What about protections? What can individual people do to reduce the risk of falling victim to this sort of malware?
Speaker 1 (14:17)
Well, I think that putting the onus on victims and survivors to protect themselves is a bit unfair. Frequently, part of the way that abuse works is that the abuser forces their victim to give up custody of their phone while they install something on it or to simply hand over their password. The danger that the victim is in at the time when they're simply trying to escape the physical presence of their abuser is tremendous. And so, again, leaving this up to the victims is misunderstanding the dynamics of abuse. The good news is that there are more and more tools that survivors can use in order to lock down their devices and to lock down their accounts once they have gotten away. And if they feel that it's safe to do so. Sometimes if you are being surveilled by an abuser, cutting off their access may lead the abuser to escalate their abuse, including escalation to violence. And that's one of the reasons why we don't automatically do this for the survivors. It's really up to the survivors to sort of think about what their own appetite for risk is, because they are the person who knows their abuser.
Speaker 1 (15:42)
And I would not presume to know better than they do. So I'm very much about putting the power in the hands of survivors. There are a couple of things that they can do: they can lock down their accounts, they can change their passwords, use a password manager, make sure every account has a unique password and has the highest level of two-factor authentication that they're comfortable using turned on. And they can install antivirus software on their computers, and, if they're using Android phones, on their Androids, and they can run a scan. I've worked enough with the AV companies that the AV products have gotten much better at detecting stalkerware when they find it. If your abuser has had physical access to your phone, one of the other things I would check is whether the detection for certain programs has been turned off in your AV product, because that's one of the things that the stalkerware vendors advise abusers to do now, as sort of part of the process of spying on people.
Speaker 2 (16:56)
Yes. I've read that you've been part of the push to put more of the onus on the security companies to help detect this sort of stuff. Do you think big tech should be responsible for solving this problem? Maybe it's a yes-and. How do you think we can, as a society, go about reducing the risk of malware and stalkerware?
Speaker 1 (17:17)
Well, I don't think it's just one person's job or even just one industry's job. I think this is one of those things where we all have to work together. But there's definitely a lot more that the information security community could be doing in an even broader sense, beyond information security and detection of malware. We really need to change the way that we talk about tech enabled abuse and really name it as abuse. One of the big problems that I have seen in information security is that people who work in information security are not infrequently the abusers that the people who come to me are running away from. And part of that comes from this tremendous sense of entitlement that we have in information security that I am the person who has root. I am the person who is entitled to see everything and to control everything. And when I do it, it's just fine. And that's simply not the case. And I think it's really time for people to call each other out when they see this sort of behavior happening.
Speaker 2 (18:25)
And at the risk of saying something sort of edgy, how do you feel when the person or entity that's doing that is, say, a society or a government? Not a big fan of government spying?
Speaker 1 (18:40)
Not really into it. I spent a bunch of time tracking APTs, specifically governments who were using software to spy on dissidents and activists in order to carry out human rights violations. And again, this is exactly the same dynamic that you see in an abusive relationship. These are the people in power abusing their power to use technology to covertly spy on the people who are trying to hold them accountable.
Speaker 2 (19:12)
Yeah. And sometimes they're putting on the cape of the white knight, right? They say, we're protecting you, or something to justify it. But yeah, I see a lot of parallels, and it sounds like so do you. So you've also developed tools for helping vulnerable populations, like the Digital First Aid Kit. Can you tell us about that?
Speaker 1 (19:32)
Sure. So among other things, I did spend several years traveling all over the world, back when that was possible, working with people in vulnerable populations, including activists and journalists. And I used that knowledge to help build a number of security guides. That includes Surveillance Self-Defense at ssd.eff.org, and our guide for people who are interested in teaching security to vulnerable populations, which is called the Security Education Companion; you can get to it at sec.eff.org. So that's sort of the distillation of the knowledge that I picked up over the years, trying to understand how best to teach privacy and security concepts to people who are in danger and who have better things to do than to become information security professionals on the fly.
Speaker 2 (20:35)
Nice. Are there any other scalable solutions that you think should be implemented, either ones that exist or maybe ones that you wish existed?
Speaker 1 (20:44)
Well, there are a couple of things that we've been doing. We've spent the last five years pushing for ubiquitous encryption of the Web. It was really common for governments and hackers to man-in-the-middle connections in order to spy on people's web traffic until, like, four, five, six years ago. And we solved that problem. I think that's pretty cool. Additionally, the ubiquity of two-factor authentication. I mean, we still have a very long way to go, because even though most of the major platforms now provide some sort of two-factor authentication that you can use to lock down your account, a lot of them still rely on text messaging, which is not particularly secure. I mean, I'll take it over nothing, but in general, I would really prefer that we used, like, an app-based or kind of a dongle-based system. But these things have become much more common and used by ordinary people in a way that they simply weren't four or five years ago. So I think that we have seen some improvements. And the way forward for those improvements is to improve security for everybody, because if you improve security for everybody, you will also improve security for vulnerable populations.
Speaker 2 (22:11)
Yeah, definitely. Our motto at MobileCoin is that privacy is a human right, so I super agree that it should be for everybody. But to change things up a bit, how do you like to spend your time when you're not fighting for our individual right to privacy? I read in one of your bios that you're a silk aerialist.
Speaker 1 (22:31)
Yeah. Right now I actually just stopped going back to my circus gym because of the new spike in COVID, and I miss it very much. But basically my hobby is aerial circus arts, because when you are 30 feet up in the air and upside down and spinning, you absolutely cannot think about human rights.
Speaker 2 (22:55)
Well, it shows how much you care that you have to be upside down, spinning in the air, 30 feet up, in order to disconnect from your passion. So clearly you care a lot about privacy, which is a good segue into one of the questions that we ask a lot of our guests. When did privacy first become important to you?
Speaker 1 (23:15)
Well, that's sort of an easy one. I came to the United States from the USSR as a kid, and I grew up on stories of what it was like to live under authoritarianism and what it was like not to have privacy, and to expect that everyone you talk to is going to snitch you out to the government. Like, not even digital spying, just plain you can't trust anybody within earshot. And there's a sort of cultural thing from the Soviet Union, which is that essentially the only safe place to air your views is inside of your apartment at your kitchen table. And that was really drilled into me as a kid: that we escaped from a place where there was no privacy and where you could not trust anyone and you couldn't air your views, and that we had come to a place where we could. And so it's always been really important to me to protect that.
Speaker 2 (24:21)
Nice. Yeah, well, definitely, freedom of speech is something that I think everybody deserves. So glad that you're fighting the good fight. And on that point, you lead the Threat Lab at EFF. Apart from stalkerware, what do you think are some other important threats to privacy that folks should know about?
Speaker 1 (24:40)
Well, I think that the biggest upcoming threat to privacy is one that just comes up over and over again in my work, which is that governments and law enforcement keep insisting that they need to backdoor our end-to-end encrypted communications. For many years, they insisted that they needed to do this in order to fight terrorism. And then, when they sort of failed to make headway with that, they started making the argument that they need to backdoor our end-to-end encrypted communications in order to save children from abuse and crack down on child porn. And in both cases, this is a really insidious argument, because if you say, hey, privacy is important, security is important, anonymous speech is important, the very first reply you will get is, why do you love terrorists so much? Why are you supporting child porn? Why do you love child abuse? Why are you defending these indefensible things? But privacy is for everyone, and security is for everyone, and you cannot create a backdoor that only the good people can walk through, even if you trust the police, even if you trust governments. One of the things that we've really seen over the last several years is that if you have a weakness in your system or a backdoor, it's not just going to be used by the people that it was designed for.
Speaker 1 (26:12)
Other governments will show up, hackers will show up, malicious parties will show up, and they will use it for things that you do not like. So it's very important not to build the door in the first place.
Speaker 2 (26:23)
Yes. And you mentioned that EFF is engaged in filing lawsuits as a form of making this change. Are there any recent actions that you think are relevant to this conversation?
Speaker 1 (26:33)
Well, the one that I'm most excited about is Al-Hathloul v. DarkMatter, which is a case that we filed on behalf of a Saudi activist who was spied on using software from a company called DarkMatter in the UAE. What they did was they employed a zero-click vulnerability to spy on the end-to-end encrypted messages on her phone, and as a result, she was arrested and tortured. So this is one of those cases where there is a very real harm resulting from the actions of a tech company in the UAE that was violating the law in the United States, because they were spying on iMessage. Basically, they violated the CFAA. So this is the argument that we are making. And we've seen this uptick in cyber mercenaries, companies that sell spying technology, especially technology that allows you to spy on mobile phones, because the kind of information that you can get from a mobile phone is incredibly useful both to abusers and to governments, especially authoritarian governments. Given that uptick, it's really important to send a message to those cyber mercenaries that if you do this, you will be sued, you will pay, you are violating people's human rights, you are enabling tremendous harms.
Speaker 1 (28:25)
And so far, it's been really difficult to bring these companies to heel.
Speaker 2 (28:30)
Oh, man. Yeah. Well, I'm glad that the EFF is there to try to help, and hopefully to stop these companies from doing this sort of harm. When it comes to individual privacy, what are some best practices you can share with our listeners?
Speaker 1 (28:49)
Well, again, the most important things that I think people can do are just extremely basic digital hygiene, the hand-washing of the Internet: use a password manager, have unique, strong passwords, use the highest level of two-factor authentication that you're comfortable using on each of your accounts, and take your security updates. This is probably the best advice that I could give, especially given that most of the compromises that I see are not, in fact, covert installations of stalkerware. The most common form of compromise that I see is account compromise. And as long as our accounts are not secure, we are not secure.
Speaker 2 (29:40)
Who do you most look up to in the field of privacy and technology?
Speaker 1 (29:45)
I actually have a lot of heroes in this area. Oh, yeah, I've got a long list.
Speaker 2 (29:52)
Okay, let's hear it.
Speaker 1 (29:54)
But right now, the first person that comes to mind is Sasha Costanza-Chock. She's a PhD, and she did a bunch of work at the Berkman Klein Center. She wrote a book called Design Justice, which I think is extremely important and which does a lot to inform my work at EFF, in which she makes the argument that if we're going to be making products, if we're going to be making services, if we're going to have a tech industry that does stuff, we need to design things with marginalized populations at the center of our design process. And the reason for that is because if you design in order to protect your most marginalized and vulnerable groups, then you will create products that protect everybody. But if you design only for people who are like yourself, then you're going to leave the marginalized at the margins. And one of the reasons why I love technology so much, and have really devoted so much of my life to it, is because I still believe in the power of technology to allow marginalized voices to be heard, to speak truth to power, to fight inequality.
Speaker 1 (31:18)
And instead, what we've really seen over, I would say, the last ten years, with the kind of rise of platforms, is that because our tech platforms did not think about marginalized people, all that they really did was kind of recreate all of the inequalities that exist in the world. Only now they exist online, and in some cases they've even been amplified.
Speaker 2 (31:47)
Yeah. I'm optimistic about this notion of tech as the great equalizer, and it really resonates with the value we were talking about before. Privacy is for everyone.
Speaker 1 (31:59)
Yes. But it's really important not to be sort of wide eyed and utopian about it in the sense that we need to look at the ways in which tech has really failed us in the last decade or so, and it has absolutely failed to deliver on the promise of equality and allowing us to speak truth to power. And unless we can come to terms with the ways in which those failures happened, all we're going to do is recreate them with a whole new wave of products and services.
Speaker 2 (32:33)
Yeah, that's fair. Are there other organizations out there doing similar work to EFF that you look up to?
Speaker 1 (32:39)
Well, there are lots of organizations doing lots of things. I recommend the ACLU if you like yourself some lawsuits. If you enjoy security research, might I recommend Citizen Lab over at the University of Toronto. And if you are interested in protecting survivors of tech-enabled abuse and domestic abuse, I recommend NNEDV, which is the National Network to End Domestic Violence, or Operation Safe Escape.
Speaker 2 (33:17)
Yeah, all good recommendations. Another question we like to ask each of our guests is: what is a privacy technology that does not yet exist but should, for all our tech enthusiasts out there who want to help make that change?
Speaker 1 (33:32)
This is actually kind of a tough one because I think that the biggest problem is not that the technologies don't exist, it's that often they're simply not being used or they exist along with a set of policies that make them largely unusable, which I think is really unfortunate. What I would really like to see in this world is just a wider variety of platforms. I think that the fact that power has really concentrated in the hands of a few large tech companies is very dangerous and concerning. And this is one of the reasons why I keep an eye on the new tech companies so carefully, because I think that they're the future, and a world in which the Internet is just Google and Facebook is really not a future that I'm interested in fighting for.
Speaker 2 (34:36)
Oh, well, when it comes to the future, though, are you optimistic about the direction we're moving in? Do you think privacy has improved for users of technology, or are things getting worse?
Speaker 1 (34:46)
Well, I hedge my bets. I prepare for the worst, and I hope for the best. It's the only way to get up in the morning and keep fighting. So, yeah, in some ways things have definitely gotten worse for people, but I am still optimistic about the future and I think that we're really starting to see a movement towards centering the most marginalized voices and towards centering them both in our design process and in our community and when creating new tools. And I think that's really essential if we want to build an internet that's better than the one we have right now.
Speaker 2 (35:33)
Yeah. Optimism and realism can go hand in hand. Beautiful. That's it for today's episode. Our guest has been Eva Galperin, the director of cybersecurity at the Electronic Frontier Foundation. Eva, thanks so much for coming on the show.
Speaker 1 (35:50)
It's my pleasure.
Speaker 2 (35:53)
That's it for now. Please subscribe to Privacy is the New Celebrity on all podcast platforms, and visit mobilecoinradio.com to listen to the full archive of podcast episodes and tune in to our radio show every Wednesday at 6:00 p.m. Pacific time. I'm Lucy Kind. Our producer is Sam Anderson, and our theme music was composed by David Westbomb. And as we like to say at MobileCoin, privacy is a choice we deserve.