Tech'ed Up

Battling Disinformation • Camille François

November 18, 2021 • bWitched Media

Disinformation expert Camille François sits down in the studio with Niki to explain what a troll farm is, how they operate, and the business of building fake online influencers. Plus, what's the difference between misinformation and disinformation? It's a game of cat and mouse between bad actors and teams trying to make social media safer.

"How the hell are you going to know whether something that's fake has been propagated by someone who's there trying to deceive you, or whether it's your uncle again, who really should get off Facebook?" -Camille François 

Intro:

[music plays]

Niki: I’m Niki Christoff, and welcome to Tech’ed Up. Today, I’m joined in the studio by Camille François, a disinformation researcher. By the end of this episode, you will know what a troll farm is, how they operate, and the difference between a bot and a hyperactive political supporter. We also talk about how people sometimes unintentionally end up in harassment campaigns. It turns out that...unless they’re a bot... trolls are human too. 

[music plays]

Transcript:

Niki: Today in the studio, we have Camille François visiting. She is a professor. She has studied misinformation, disinformation, online harassment, and is joining us in Washington DC. Last week when I texted you and asked you if you could come on, you were in Paris at the presidential palace [Camille: laughs], having a meeting with President Macron, and now you're on Tech’ed Up. Welcome to the pod.

Camille: Thank you for having me.

Niki: Thank you for coming. So, I want to talk to you today about- you've done a lot of work on election misinformation and disinformation, and people have heard a lot about it. But I think people don't understand trolls, troll farms, bots, how these campaigns get started, how people end up working in them. And so that's what I'd like to discuss today, but let's start with- your first work in [pause] harassment. How did you get into this field? 

Camille: Yeah. Um, I think I stumbled upon the topic, if I'm going to be honest. I had always been working on cybersecurity issues and specifically how governments use digital means to censor, to harass, and to silence. Back in the day, I was working at Google, and I was working, um, with activists and journalists, thinking about the ways in which they were currently being silenced and targeted by, specifically, their governments. Right? And so, they were talking about being, y’know, being hacked, and being phished, and very quickly that question of the trolls emerged. 

And they were saying: “We now know how to protect ourselves against phishing, right? We have a better sense of how to do cybersecurity, we feel a little bit better about not being hacked. We have some notion of making sure our privacy gets respected, but we want to talk about the trolls and the, y’know, the fake accounts that are targeting us on the internet. They’re, they're trying to silence us. They are creating harassment campaigns, and we think those accounts are fake. And we think that governments are behind it.”

Niki: So you, you once told me that your work was inspired by- and you just said this, but I'm going to repeat it [both laugh]- it was inspired by activists and people speaking out online who felt like, once they became the target of attention, it felt like- I think you used the words- the internet was crashing down around them. [Camille: yeah] And so your research was to show them: this isn’t, however many thousands of people- these are fake accounts. Like, to help them see that this feels like the internet is crashing down around you, but, actually, it's, it’s coordinated. Am I getting that right?

Camille: Yeah, it's so important. I remember some of the interviews we did with, specifically, a woman journalist in Turkey, and we were sitting together- she was showing me all the fake accounts and all the messages and, frankly, there were sexual threats. Right? And so, she was really thinking: do that many people actually have these feelings of, of, violence towards me? Is this, am I, really facing a threat of millions of people on the internet, like, going after me in this way?  

And being able to see, like, “Hey, actually, those are not only just fake accounts, but they're all coordinated by just a handful of entities” does provide very important context, saying, like, it is not the case that thousands and thousands of people, y’know, like, think of you in this way. Really, what it is, is a coordinated campaign strategically deployed against you to silence you, to threaten you, and to make you live in fear. 

That can help a lot, and then comes the question of, can we attribute it? Right? That's a question that we talk about a lot in cyber- when there's a cyberattack, we don't just say, “Oh, that's a cyberattack.” [chuckles] Right? People want to know. Okay. All right, but who's behind it? [Niki: Right] And building this method to do attribution for these campaigns was also very important, right? Like, when I tell you, “Hey, you're being attacked by a troll farm online”, it's legitimate for you to say, “Well, I would like to know who's behind it now.”  [both laugh]

Niki: Exactly! Who's running the troll farm? [Camille: Exactly] So one thing I don't think people know, and I'm putting myself in this category too [Camille: chuckles]: I get the gist of what a troll farm is. It's internet trolls working together, up to nefarious things, but I don't really understand what a troll farm is and how they operate. Are they humans? Are they bots?

Camille: Yeah, that's a great question. And the answer is not easy, actually. It doesn't really mean, uh, something specific. I think it's a term that we, uh, y’know, started using really after 2016. Um, the original troll farm that people think about is the Internet Research Agency. It's that troll farm based in St. Petersburg that was responsible for a large part of the Russian disinformation campaign that targeted the 2016 election. 

Now, if you look at the IRA, um, we know who they are and what its structure is. We know that now because they've been indicted. So, we have a lot of legal documents and, essentially, they are, sort of, closer to a marketing shop, really, than, than, than to an intelligence agency [Niki: oh, ok]. So if you look at, for instance, the teams they have: they will have a graphic design department, they will have an SEO department, and, of course, they work to achieve a geopolitical ambition. Right? But if you think about- again, if you close your eyes and you try to imagine it, you're closer to the world of, like, a little marketing shop than to the world of James Bond, really. 

Niki: So, that is fascinating! [Camille: laughs] So, SEO, for people- search engine optimization. So, what you're saying is, they have graphic designers and people in-house, marketing the message they want to get out, rising to the top of search results, rising on these social media platforms. 

Camille: That's true. And the other thing that they would do, that's very familiar to us, is they would do A/B testing, right? So, they would sit, and they would say, “Today, we have to get out this very nasty message, uh, on Hillary Clinton. Please use your fake accounts and the fake groups that we have”, and then they iterate. And then at the end of the day, they, they regroup. And they're like, “Oh, well, that really didn't work. That was a very bad joke, and nobody liked it, but whoever did this post, it's actually pretty good. Let's double down in this direction and let's continue engaging our audience.” Right? They're doing audience building. When you have a fake account, when you do, like, a fake profile, the first thing you want is for your fake account to find its audience. 

And so it's almost like a little TV show, right? So you have this fake identity, and you have to think: what are they going to say today to be engaging? One of my favorite trolls from that era is Jenna Abrams. Uh, she was, evidently, like, a fake person, right? [Niki: Ok] And she was written by the Internet Research Agency [Niki: by the Russians] in St. Petersburg. By the Russians. 

Her bio said she was from Main Street USA [Niki: laughs] and she was supposed to be, like, a mid-thirties woman, conservative, but with a really good sense of humor. I think you would have loved her. It's just really funny. [Niki: Live, laugh, love?] [both laugh] Exactly! But actually with a good sense of humor. And she would do a lot of pop culture jokes.

And so, for instance, they would think about, okay, what do we need to reach Americans? And they would come up with a great joke on, like, Kim Kardashian, and then they would say, okay, now we inject the bad political content. And so, it is a content strategy of designing the type of content that's going to reach the audience that you want so that then you can- I don't particularly like this term, but I think it's clear- you can then weaponize it. Right?

Niki: Mm-hmm. So then, a couple things. First of all, I've heard you say before that you don't like using words of war when we're not at war. 

Camille: Yes! I think that's kind of- part of the problem that we're dealing with is, we want to talk about cyberwar all the time.

We have a harder time defining what cyberpeace is, right? Like what, what does it look like to actually live in these conditions of peace with this technology? What are the rules that we want to be respecting? 

And I, I, do push back, uh, in my daily life on people wanting to use war terminology all the time. We do a lot of this in cyber. So when we did the 2020 election, um, everybody had a war room [Niki: Mm-hmm], but with the Graphika team [Niki: laughs], we had the peace room, and it was kinda like, y’know, like a satellite thing, right? Like, “Hey, war room number five, this is peace room number one.” [laughs] [Niki: Yeah] And so, yeah, we were, we were pretty big on that.

Niki: And so the idea that part of- I mean, I think- so, we are guilty. We, the podcast, which is just me in this studio [both laugh], um, guilty of having, the last two episodes are on cyberattacks, cyberwar. And I think that it's helpful to think through the work you're doing, which is saying, okay, Jenna Abrams? [Camille: Mm-hmm], Jenna Abrams is a [pause] fake influencer.  [Camille: Yes!] Created by, essentially, a government-run marketing department [Camille: That's right] that's studying our social media habits, our pop culture interests, and then being funny and engaging and finding friends and then injecting like poisonous-

Camille: It’s exactly that! [Niki: Ok] It's exactly that. And she was a fairly successful influencer. But not all campaigns are successful, and I think that's really important to keep in mind: a lot of these disinformation campaigns end up going nowhere. Nobody follows them. But in that specific case, Jenna was a really successful fake influencer. Um, and she is cited in pop culture articles that still, that still are up today. Right? ‘Cause again, once in a while, she would make a really good joke and people would just want to cite her in, in a piece. 

Niki: It's unbelievable. [Camille: laughs] Well, okay. So, how, how do they, how do, this is one of the things I realized when I was [siren sounds] sorry, this is a siren. [Camille: It’s fine!] Welcome to Washington DC. We'll keep it in!

Camille: I live in New York. So I'm used to that.  Very New York soundscape. [both laugh]

Niki: One of the things that I thought was fascinating after the election is looking at memes [Camille: mm-hmm] and this sort of sophistication, like, almost more sophisticated- well, certainly more than I am. I don't even understand all the memes happening. How do they, how do people working in these troll farms learn what works for us?

Camille: I mean, you just study, right? I, I, I don't mean this in a mean way, but we're not really hard to crack, [both laugh] y’know, if you look at what people click on. Yeah. You clearly, y’know, you quickly get the sense of [singsong voice] oh yeah, they like this, they don't like that. This is the type of joke that worked. This is the type of joke that didn't work. 

Um, what gets often complicated is, y’know, we have this idea that trolls do very sophisticated messages, and the message is very dangerous, and it's also always fake. But most of the time you just have people who want to build an audience. And so, they borrow the popular memes. Sometimes they adapt them a bit, sometimes they're just re-posted. And so, a lot of the content of these troll farms and/or disinformation actors tends to be very benign and very mundane. It's just a meme-of-the-day type thing. 

Niki: So, meme-of-the-day, and then they'll inject something that sort of rends us apart. I don't know if I'm using the word rends right. But, like, rips us apart. [Camille: Exactly] Because it's either race-baiting or it's highly partisan or it's-

Camille: [interrupts] That's right. So that was very 2016. [Niki: Okay] Um, y’know, as I said, prior to 2016, nobody really thought about this in Silicon Valley. This was not very top of mind. And that really changes radically in 2017. In 2017, everybody realizes, like, wow, we had a blind spot on the troll farm thing. We had a blind spot on information operations. And then you see, uh, platforms put together teams to detect these types of operations, come up with definitions, come up with methods. 

And so, 2017 onwards starts what are really, for me, my favorite years [Niki: mm-hmm] in this sort of disinformation cat and mouse, because we get better at catching this type of activity. And then they have to adapt, uh, and then get better at evading the measures we put in place. 

That means that the Russian disinformation campaign targeting the election looks very different in 2016, 2017, 2018- that's the midterms- and, of course, 2020. And so, when I think of the operations that were hardest to detect in 2020, I think, for instance, of those two operations that Russia did: one targeting the left side of the political spectrum in the U.S. and the other one targeting the far-right side of the political spectrum in the U.S., where this time they didn't create a fake influencer.

They created a fake newspaper on both sides. [Niki: Ah] And then they hired real freelancers, real American journalists, who did not know they were working for a fake newspaper run by Russia. And that is much more complicated, of course, both to detect and to expose. And that leads to also having very bizarre conversations with the freelancers, saying, like, “Hey guys. Y’know, we gotta, [chuckle] we gotta tell you something about that article that you wrote for this website.” Right? The website on the left was called Peace Data. The website on the right was called, um, NAEBC. And it's, it's a very different sort of strategy of entrapment. And it's one that this time really targets real people who become unwitting agents of Russian and- 

Niki: [interrupts] Okay! I never heard of this until right this moment. [Camille: laughs] So, this is why we're doing this podcast. So, you're right. I'm very familiar with the 2016 blind spot. Everybody is. I know that you worked at Graphika, which is one of the two, I think, organizations that were able to see the data set of what happened. [Camille: Yeah] These experts and researchers, you being one of them and working across platforms, got together to create sort of an apparatus to deal with that.

And then, what I just heard you say, is the Russians created fake newspapers and hired real freelancers to write misinformation! Can I ask you another clarifying question [Camille: Yes, let’s do it] before we get to that? What is the difference between misinformation and disinformation? Is there a difference?

Camille: So, it's a very theoretical difference, and I will tell it to you, but then we can talk about, like, how much of this matters. [Niki: Ok] Misinformation is when information that is not true goes viral. And so, for instance, if, um, my uncle goes on the internet and says, “I am worried that if I get the vaccine, I'm going to start radiating 5G.” [Niki: That’s my actual uncle] Yeah! I mean, that's also mine, right? [both laugh] That's just, that's the uncle problem. And they actually genuinely believe that. [Niki: Right] Blows my mind, but they do genuinely believe that. And it's, it's evidently fake, but misinformation is that, right? It's the propagation of information that's fake and that's viral. That's a big problem on health. It can also be a real problem on, y’know, elections- election-related misinformation can be really bad if people really think that, um, y’know, I don't know, voting machines are not- or whatnot.

That's different from disinformation because in disinformation, you have viral, fake information, but it's propagated by people who have an intent to deceive, right? So, it's people who know that what they are saying is fake. So, if I tell you, “Hey, um, why don't you go and vote by SMS? This is a great idea. [Niki: Right] And here, it's totally going to work.”  I know that what I'm saying isn't true. This is a campaign to deceive you with a strategic objective and that's disinformation. So, the, the real difference is really the motive of the person who is sharing the information. [Niki: I see] Which is why it can get really theoretical because how the hell are you going to know whether something that's fake has been propagated by someone who's there trying to deceive you, or whether it's your uncle again, who really should get off Facebook?

Niki: Well, or it could be, I mean, maybe I'm getting this wrong, but it seems like you could have a deceptive actor, y’know, putting disinformation into this ecosystem, the bloodstream, of the social media platforms, and then it becomes misinformation when regular people believe it to be true and share it. Is that right? 

Camille: Yeah, exactly. Right. [Niki: or no?] Also, you would be a great troll because this is exactly what the game is about. Right? [Niki: laughs] So, the game is about [Niki: Yes, yes] trying to find online communities who are going to carry the water for you, right? Like, you want to pass the baton to people who are going to go and carry this operation forward.

That's also why there’s this bizarre, awkward relationship between conspiracy communities online and information operations. Right? So, people ask what is the relationship between QAnon and the Russians. The actual answer to that question is there is no relationship, right? [Niki: mm-hmm] Like, QAnon is a problem that we have online. And Russia is a different type of problem that we have online.

That being said, often, disinformation actors will target these communities because if you manage to, sort of, like, plant your seed there, it's going to grow. It's very fertile ground for disinformation. We talk about the Russians a lot, but the Iranians are really good at this too. Right? In their campaign targeting the 2020 election, they actually impersonated the Proud Boys, right? A far-right group in the U.S. And they were like, “Yes, there is fertile terrain there, it's definitely going to freak people out. It's going to work well.” It didn't work that well, but it's really interesting, right?  [Niki: Right] Like, those actors are looking at the ways in which our own political debate is fractured. They're looking at the ways in which we're polarized. They're looking at, um, our own weaknesses and vulnerabilities in these online conversations to better exploit them.

Niki: So, it's like Inception. I just watched the movie Inception two days ago [Camille: laughs], so it's basically planting an idea. And then, if it grows organically among these communities, that's easier than having troll farms where you're building and building fake accounts over time. 

Camille: It's easier, not only because now we're getting better at detecting the fake accounts. Right? So, the reason why the Russian campaign and Jenna- our friend Jenna, right? [Niki: mm-hmm] Fake friend. The reason why they were effective is that the Russians started doing that, focused on the U.S., in St. Petersburg in 2014. And so, by the time we detect them, we're in 2017.

And so, those accounts have been online for three years, making, y’know, bad pop culture jokes. And so, it takes a long time to grow that audience. And if we get much better at detecting those accounts early, they don't really have time to build that audience, which is also why we see actors moving to other strategies, like entrapping journalists and freelancers into working for fake journalists- y’know, fake, fake, uh, newspapers, y’know?

Niki: So, there are two things I want to cover off, uh, before we close out our conversation. And also, I've learned a bunch. [Camille: laughs] This is like the whole reason for having this podcast. I had no idea that there were fake newspapers and, um, that's unbelievable! But two things. One I want to talk about, um, these unwitting actors.

So, you look at troll farms. I don't know how many people walking around have talked to people who, who worked in a troll farm [Camille: chuckles], but you have! And so, I'd love to hear that. And then the last thing: you're a professor, and I want to talk briefly about your class and what you're teaching right now.

Camille: [chuckle] Sounds good. Um, yeah, I've, I've had the opportunity, y’know, to, to talk to a few people on the, on the other side of these disinformation campaigns. We can call them trolls, we can call them disinformation operators, and, um, I've, I've really seen many different stories. I think this is what stays with me the most- the diversity of trajectories of how you end up doing this.

Um, some people are actually pretty proud of their work, right? So, um, I've talked to a guy who used to run a troll farm in Macedonia. [sarcastically] He’s so proud of this. When you meet him, he gives you his business card, and his business card says, “I'm the man who inadvertently got Donald Trump elected.” He says he's very sorry about this. [quickly] He doesn't really look sorry about this. [Niki: laughs] So, that's sort of like one way, right? [Niki: Um, you've met this guy?!] Yes! Again, like, he's very happy to talk about this. He's not very undercover. [Niki: Okay!] Um, but I've also met people with very different trajectories who kind of stumbled upon this. 

Some of the stories that stay with me are notably from India, with people who said, um, “I initially joined a political campaign to do social media. And then my candidate got elected, and suddenly I kind of wake up a few months down the line when I realize, wow, I'm, I'm running a fake account, harassing journalists, uh, on the internet. I am a troll.” Right? [Niki: mm-hmm] So there's also this, this trajectory of sort of inadvertently, y’know, starting with something- maybe it's political marketing, maybe, y’know, maybe it's campaigning- and then down the line you realize that, no, now you're, you're essentially a troll farm. And then there are, uh, y’know, people who actually just got hired as trolls. It paid well, they spoke English well. Um, and, uh, and, uh, eventually down the line, they're like, “Actually, that's not really what I want to do. And I do think it's weird. It's not rewarding.” [Niki: It's not rewarding] And of course, um, it's a bit of a different story, but I think it's one that's really important to tell because we often forget about it.

One of the main reasons why we know about these troll farms is because of really fantastic journalists who often go undercover and expose what's going on there. Right? So, the reason why we know about the Internet Research Agency in St. Petersburg is because of a young woman journalist who went undercover, documented everything, and, at great personal risk, exposed what was going on. 

Again, there are a lot of different things that can happen around, around troll farms, and there are different ways to end up there. 

Niki: And it's a fascinating story. Just the human experience. I mean, I can't imagine being a freelance journalist reporting and then finding out that you're part of an apparatus to sow discord.

Camille: Yeah! Some people have reacted really well to this. And so, y’know, thinking back again to 2020, some of the journalists said, “All right, I totally understand how I got trapped. And this is all my communications with the trolls.” This is, y’know- [Niki: Cooperating, and handing it over] Exactly! 

And sharing with the media, kind of debriefing others. I think that's really important because it helps people not get entrapped. Right? When, when we have journalists that are brave enough to say, like, “Yes! It happened to me, and this is everything, this is everything about how it went down,” so that others could be, y’know, perhaps, y’know, more aware of these types of threats if this ever happens to them, or if, if they ever get targeted.

Because it's targeting. Right? So, like, they actually go and find specific journalists whose articles they think are going to work for, uh, their strategic purposes. And then, you have journalists and freelancers who don't react like this. Right? Like, some of the people we contacted, particularly on the far-right side of the operations, said, “Oh, no, I don't believe in any of this. Also, I don't believe in Russian trolling.” You’re like– 

Niki: All right. You are! You are a Russian troll!  [Camille: laughs] Deny all you–

Camille: Well, I mean, you've been entrapped in an operation that targets us, and yeah, it's, it's complicated.

Niki: It is complicated! [Camille: laughs] So, this is sort of the, I mean, I guess what we're concluding is unless they're a bot, trolls are human too!

Camille: Trolls- absolutely human too! And they're as diverse in their trajectories as we are. Bots are a different story. A bot is a programmed- Uh, it's, y’know, it's a machine, essentially, right? Like, it's a computer program that goes and does social media in an automated way. Sometimes you do see bots in disinformation campaigns, but the complicated, strategic, sophisticated ones don't really use bots anymore. It doesn't mean that there are no bots on the internet, but, yeah. 

Niki: And this maybe leads directly into, so- you're teaching a class [Camille: Yes], you're a professor among other things, and, by the way, thank you for loaning yourself out [Camille: laughs] on behalf of America. [Camille: Awww] Thank you for helping us out with our elections situation. [laughs] [Camille: My pleasure!] But you're teaching a class, and you were mentioning something about AI and maybe the next chapter of this and tracking it.

Camille:  [chuckles] Yeah. I’m, I'm teaching this semester at Columbia University. I have amazing students. It gives me such hope in our ability to tackle these issues. This week, we did adversarial exercises, and this is where the students come up with, um, an adversary target and a campaign.

Like, what would they want to do, for instance, if, y’know, they were, I don't know, uh, Iran, and they wanted to target the student body of Columbia, and this is something they wanted to accomplish. Like, what type of TikTok things they would do. Like, it's, it's fascinating. They, they're, like, so smart and so creative.

Um, some of that is also a bit of an evil genius. So, y’know, we'll see how this ends. [both chuckle] But, y’know, we talk about: what is the future of disinformation? What type of new tools are now available to people who want to do sophisticated disinformation campaigns? One of them, that I particularly like, is what I call read fakes.

So, y’know, deep fakes [Niki: mm-hmm] are when you use AI to do synthetic videos; read fakes are when you use large language models, uh, to generate text that can be disinformation text. [Niki: Ok] And so, when I think back about the troll farms that I've seen and the people I've interviewed, you do pay a lot of money for people to just sit and write, like, whatever, a gardening blog, so that one day you can weaponize it. [both chuckle] [Niki: My gosh!] I know, I know! It's a real story. 

The fake gardening blog that then ends up doing political propaganda, sort of, like, in between an article on tomatoes and something about, like, whatever. [chuckle] Um, and so, if you can use AI to generate, y’know, large batches of believable and convincing text, it's actually very convenient. And those technologies are very accessible. I actually think that they're fun and we shouldn't fear them too much. Right? [Niki: mm-hmm] Like, the other thing that's a trap here is we shouldn't overreact either to the threat of disinformation or to the fact that, like, yes, y’know, those are technologies that we're going to use and, of course, they're going to be used in a bad way. So, how do you not throw the baby out with the bathwater? [Niki: Right] Like, how do you give people literacy and familiarity with these so that they can be more savvy about it and think about potential, um, malicious uses, but also potential detection uses.

Niki: Right! And so, thinking about how do we achieve cyberpeace just by understanding this more? I mean, I don't want to put words in your mouth- [Camille: No, exactly!] Your final message seems optimistic: making sure people are educated, understand the tools, think, look around corners for how this could be used in a bad way. But then, I'm also hearing you say don't panic.

Camille: Don't panic! Yeah. I'm very optimistic. And again, as long as my students don't write their class papers with these [laughs] [Niki: Right] these tools, I do think that, y’know, the next generation is more savvy and also more used to the fact that all of this is manipulated. Right? So, um, I think there are a lot of people who reacted very strongly to the fact that you could use filters to distort an image, but then you talk to kids today and they're like, [sarcastic voice] “Yeah, that’s Snapchat, where were you?” [both laugh]

There's, there's, again, more, more, um, familiarity with the ways we use technology to distort and manipulate- with what's good, what's bad, and what we can do to be more aware, um, and protect ourselves better from being deceived and manipulated. 

Niki: I am just so grateful that you came down to Washington to explain this- I just learned a bunch of new things- and that you're working on these issues.

Camille: Thank you. 

Niki: Thank you for coming on! 

Camille: Thank you so much for having me. This was really fun.

[music plays]

Outro:

Niki: Next week, we are taking a break for Thanksgiving and, after the holiday, we’ll be back in the studio talking to reporter Emily Birnbaum about the Metaverse. Be sure to follow Tech’ed Up wherever you get your podcasts.