
The Lancet Voice
The Lancet Voice is a fortnightly podcast from the Lancet family of journals. Lancet editors and their guests unravel the stories behind the best global health, policy and clinical research of the day―and what it means for people around the world.
Misinformation in a pandemic
How does misinformation start? Why is it spread? What is being done about it? What is “behavioural fatigue”? The Lancet Voice speaks with Sander van der Linden, director of the Cambridge Social Decision-Making Laboratory, to find out more.
Read all of our content at https://www.thelancet.com/?dgcid=buzzsprout_tlv_podcast_generic_lancet
Check out all the podcasts from The Lancet Group:
https://www.thelancet.com/multimedia/podcasts?dgcid=buzzsprout_tlv_podcast_generic_lancet
Continue this conversation on social!
Follow us today at...
https://thelancet.bsky.social/
https://instagram.com/thelancetgroup
https://facebook.com/thelancetmedicaljournal
https://linkedIn.com/company/the-lancet
https://youtube.com/thelancettv
This transcript was automatically generated using speech recognition technology and may differ from the original audio. In citing or otherwise referring to the contents of this podcast, please ensure that you are quoting the recorded audio rather than this transcript.
Gavin: Hello, welcome to a new episode of The Lancet Voice, a special episode. Today we're going to be talking about misinformation during the COVID-19 pandemic. So, I'm Gavin.
Jessamy: And I'm Jessamy.
Gavin: For this special episode, we spoke to Dr. Sander van der Linden, who works in behavioral science. But before we get into that interview, we'll talk briefly about the really interesting topic of misinformation.
Now, I'm sure lots of listeners have encountered a great deal of misinformation over the last few weeks while this pandemic has been going on: in WhatsApp groups, on social media, sometimes even in the news and in relatively unreliable sources. It's a really interesting topic, and the pandemic does seem to have multiplied misinformation, because I think, Jessamy, it's what we're all talking about at the moment.
Jessamy: I think that's right. And it's causing such an enormous strain on our lives, and has changed so many people's working environments, home situations, and ability to earn money. There's a desperate need to try and understand how this could possibly have happened. And the WHO in early February named this an infodemic: this kind of systemic multiplication of misinformation about this enormous global topic.
Gavin: There are so many aspects of this pandemic that scientists with decades of experience don't understand, but for people whose lives have been changed fundamentally, there's this completely understandable, desperate urge to find out why their lives have been changed fundamentally. And there's this urge to get information out there, in the sense that people think they're helping, in a way, with this spread of information.
Jessamy: I think one of the difficult things about the time that we live in is that most of the information, or the misinformation, that gets spread is actually quite high quality. It's normally fairly intuitive; it makes sense on some level. There are very clever pegs to hold onto that make even very well-educated people feel, okay, this could be true; this somehow resembles the reality that I understand. So I don't think that people spread misinformation on purpose. It's just human nature to share the information that you have. And part of the problem of the age that we live in is that misinformation can be done really cleverly, so it's difficult to distinguish.
Gavin: People are actually sharing this information in the hope that they help in some way. And it's done with this almost altruistic sense of: here's information that backs up the world as I see it, as I understand it, and hopefully this will help people understand things in the way that I have now come to understand them. It's completely understandable that people would share information in that manner, but there's so much about this particular pandemic that no one knows, and it's causing an awful lot of back and forth. I'm sure you've seen it on social media: claim and counterclaim. That's why I thought this interview with Sander was a really good opportunity to talk to a behavioral psychologist about it and try to understand why this happens, and what governments, social media platforms, and people are trying to do about it.
Jessamy: That's true. And I think it highlights the importance of places like The Lancet, established medical journals, and established newspapers, which are able to provide some kind of safe environment where you can trust that at least the basic checks have been done to propagate information that has some evidence base.
Gavin: It's obviously super important in these times that proper sources disseminate proper information, and we're seeing more and more pressure on those systems because there's so much demand for information. I'm sure you've seen, on all the news sites you visit, that literally the top ten articles on every single site have to do with COVID-19 at the moment.
Jessamy: That's totally understandable. And as we've said before, this time is inevitably going to be a watershed moment for so many things. This argument about misinformation and fake news has been brewing for the last four or five years, and potentially we might come to some kind of resolution, or a greater understanding of the problem, through this pandemic. That would be one of the good things that might come out of such a terrible, catastrophic time that we're all going through at the moment.
Gavin: Dr. Sander van der Linden, you're the director of the Cambridge Social Decision-Making Lab and the co-convener of the Cambridge Special Interest Group on Misinformation. Thank you so much for joining me today.
Sander: My pleasure.
Gavin: Perhaps we could start at the very beginning. What drives people to create misinformation in the first place?
Sander: I think it's good to distinguish between people who intentionally create misinformation and people who may fall prey to misinformation and then subsequently share it, either intentionally or without their knowledge.
So generally, there are various motives that lead people to share misinformation. A big one is political: people might do it for political reasons, for ideological reasons, sometimes for other reasons relating to their worldview. And sometimes the motives are financial; people are trying to make money off it.
But generally, the section of the population that's actively creating misinformation is fairly small. It's usually a small, active group, but lots of people fall for it and subsequently share it. And that's how it goes viral. And that's where it becomes a bigger problem every time it's shared.
Gavin: So what then are some of the drivers for people spreading this information? Is it a general kind of idea that they're being helpful or something else?
Sander: No, I think generally what happens is that people are exposed to information in a fairly stressful environment. Whenever there's something going on, like COVID-19, people have limited cognitive resources, and we have to judge what's accurate, what's credible, what's valid in a very short amount of time. And we know that when that happens, the brain starts relying on rules of thumb rather than processing information deeply and analytically. Often those rules of thumb will not work out very well in terms of accuracy. So we might share something because we like the person, because we trust the person, because it sounds good, because it resonates with what we already believe.
So imagine that someone shares with you a story about vaccine hesitancy, or about COVID-19 being a liberal hoax, something that feeds into your pre-existing ideas about the world. That might make you more likely to share it. That's driven by what we call motivated cognition: the fact that your perception can be colored by your motivations about what's going on in the world.
But sometimes it can also be because the content itself leads people astray. You think it's a real story, and you just don't know how to judge what's factual and what's not. Creators of misinformation use a lot of techniques, and we study these techniques in our lab: impersonating other people, the use of emotion to persuade, conspiracy theories. Sometimes it's not so simple; it's not just that something is true or false. It uses a particular technique, and people don't recognize the technique, so they share it because they think it's accurate. And it dupes a lot of people.
Gavin: With so much misinformation around right now, does it become more difficult for people to work out what is or isn't true? Is there some kind of overload?
Sander: Absolutely. People say misinformation has always been around, and that's true, but I think there are a few differences now. One is the role of technology: the rate and speed at which misinformation can go viral has accelerated intensely. And the reach: it spreads much further. Research shows that misinformation spreads much deeper and further than real news. And to some extent people are bombarded now with information from all kinds of sources: from official sources like the WHO, the government, and the NHS, and also from social media. I got forwarded a post the other day, a Facebook post, and this is maybe a great example.
I got forwarded a Facebook post from somebody who wanted to know whether it was true. It was a story about the Harvard professors who were arrested on some fraud charges relating to grants. But there was this subtle thing underneath the video linking the affair to COVID-19. It was completely unrelated: the scientists had no connection to COVID-19, but the article insinuated that they were arrested because they helped create COVID-19. Lots of people are duped by these sorts of subtle techniques that inject a real story with some sense of conspiracy, and it takes on a life of its own from the moment it goes online.
I think that's what people are struggling with: not so much the stuff that's obviously false, which people recognize, but fake news that operates in this gray zone, where it uses real events and ties them to something that sounds plausible. It can dupe a lot of people, and it can do damage. Think about health recommendations to drink bleach to cure the coronavirus, or to drink your own urine.
There's all sorts of crazy stuff going round at the moment. And for some people, let's say you're of a particular persuasion when it comes to non-traditional medicine, it can be very harmful to fall prey to these kinds of stories. So, to answer your question, it is much more difficult now than it has ever been for people to know what's true and what's false. And that's why we advocate this idea of pre-bunking rather than debunking.
Gavin: I'm interested in what you said there about the kind of subtle association that happens, in that case of the Harvard scientists, for instance, with COVID-19 just being mentioned underneath. Do you think that, to an extent, the media's drive for clicks online leads to these kinds of minor associations, and then to greater misunderstandings down the line?
Sander: Absolutely. I always want to be careful in implicating the mainstream media, but to some extent the incentives are wrong, and they do contribute to the spread of fake news in some ways. I know there are lots of good journalists out there trying to do the right thing, but clickbait headlines are a major issue. They draw people to stories that people don't often read carefully, particularly on social media, and that's where a lot of people get their news, which we know is not the most reliable source, to put it mildly, for these sorts of affairs. People just see the headline and share the headline, and that's it; they don't read the whole article. And often it's the headline that's the clickbait.
So I absolutely do think that contributes to it. And it's also correlated with a lack of funding for investigative journalism. Even at the BBC, real investigative journalism, training journalists in the scientific method, having scientific journalists: over the years that's just gone down tremendously. In parallel, we see this pressure for these kinds of outlets to find alternative sources of funding online, and it's not necessarily their fault, but it's a shame. So I do think, if we want a healthy media ecosystem, we have to fund mainstream media sources and their ability to do good journalism.
Gavin: You're probably not going to like this very general question, but is there a way, in a time when we have so many different communication platforms and so many diffuse methods of communication, that this can be combated?
Sander: Absolutely, I think fake news can be combated. Our research group is perhaps more on the positive side of things than some other colleagues, who are a bit more skeptical about the extent to which we can solve this problem. We generally take the approach, and the COVID-19 pandemic really makes the case, that prevention is better than cure. In fact, to follow the vaccine metaphor, a lot of research shows that when you preemptively inject people with a small dose of fake news, or of the techniques that are used in online manipulation, people can over time build up a sort of psychological antibodies against being duped by online techniques.
So whether it's impersonating people online, using emotions to persuade, political polarization, or deflection: for any of these techniques, we have interventions out there in the form of a game. It's a fake news game; anyone can play it, it's free, and we've tested it with thousands of people in a variety of contexts. We've worked with the government here in the UK and with social media companies on implementing this. And we call it pre-bunking instead of debunking because there's so much value in trying to stay ahead of the curve and protecting people from getting the informational virus, so to speak, before it settles.
Because if there's one thing we know from cognitive psychology, it's that once people are exposed to a falsehood, it's very difficult to correct. It lodges in your memory, it makes connections to other events, and it becomes very difficult to dislodge. Even when you fact-check, even when you correct, people continue to rely on parts of the false information, just because it takes root so strongly. That's why there's a lot of value in trying to prevent it from happening in the first place, and why we follow this psychological vaccine metaphor: you can preemptively expose people to these very weak, controlled doses of fake news and of the general strategies that are being used to deceive people.
We've shown, in our lab and also in the real world with our interventions, that people can become at least partially immune to these kinds of techniques. So we're hopeful that adds to the conversation and to the toolbox. But I do think we need a multi-layered defense system. I like to think of it as: let's inoculate, let's pre-bunk first, if we can, to try to protect everyone. If that doesn't work, we can do fact-checking and real-time rebuttals. And if that doesn't work, we can always try to debunk. We need various stages to protect people from being, quote unquote, infected by misinformation. Because it does spread much like a virus, and it does harm to society in similar ways that a virus does. In our lab, we treat it very much like the spread of a virus, actually.
Gavin: And I suppose this inoculation idea becomes more useful given the move from more public forums, like Facebook or Twitter, to communications on WhatsApp, for example, which are closed forums.
Sander: Yeah, absolutely. In fact, WhatsApp is one of the social media companies we work with, because they have end-to-end encryption and it's very difficult for them to intervene. So one of the things they were looking for is advice from academics and scientists on what they can do. We've created a special version of our game, Bad News, for WhatsApp. That version will be called Join This Group, and it's really about being in a WhatsApp group: what it's like to spread misinformation in WhatsApp groups, how people are duped by that, and going through the various stages, learning to see how the thing is made, essentially.
What we try to do is take the problem, deconstruct the virus for people, and let people create their own antidote in an active, experiential setting with an interactive game. That's what it's based on, and that's very important for WhatsApp. We know that in India people have died in mob lynchings sparked by false information and rumors spreading, and it's similar with COVID-19: there's a lot of harmful misinformation on the WhatsApp platform. And they can't just intervene in terms of content moderation, so they're looking for all kinds of solutions.
Gavin: Turning to official communications: is there any particular evidence that says the government should try to communicate in a way that cushions the blow, or should they communicate in firmer, more authoritative terms that leave less space for interpretation?
Sander: Yeah, that's a good question. In another line of work, with the Winton Centre for Risk and Evidence Communication here at Cambridge, we look at how evidence is best communicated. We've been following COVID-19 pretty closely, and this is my personal opinion, but I think it's very important, for one, for the government to test and empirically evaluate the public health messages they put out before they actually do so, because messages can often have unintended effects. People talk about the hand-washing message downplaying the seriousness of it, about physical distancing being discussed too late, and about whether "social distancing" is even the right term. People don't like it, and it's not really about social distancing as much as it is about physical distancing.
So there are a lot of questions about how to accurately communicate the risks, including for different age groups. In a crisis situation you do the best that you can, but it's helpful to think in these terms. Again, it's about preemptively thinking ahead about what might come and what we might need to do in stages, and preparing the public for that over time, rather than bombarding them with conflicting information, changing strategies, and sending out conflicting cues.
I think that's pretty much the number one thing not to do. So there's a balance between being authoritative and serious and communicating the risks, including the uncertainty about what we don't know. We just released a paper in PNAS today about the importance of communicating uncertainty about facts. What is the number of confirmed cases of the coronavirus? What's the uncertainty around that number? The media often report single estimates, but there's always uncertainty around estimates, and people should be aware of it so that they can make an informed judgment.
And we show that this doesn't actually undermine people's trust. There's this fear that being too honest undermines people's trust in the message and in the source, which we found isn't necessarily true. But people have turned to other sources, like the WHO, and I think it's concerning when people start following the advice not of their government but of other bodies. So I think it is very important for the government to get its communication act in order and to have reliable, science-based messages that include uncertainty and also speak with a certain amount of authority, based on expertise. Even though they've continued to say that their strategies are science-based, I think people have come to realize that not all of what they were doing was necessarily science-based.
Gavin: So, finally then, following on from that: in your opinion, was there much evidence for the "behavioral fatigue" first cited by the government as their reason not to implement measures too early?
Sander: Yeah, this is a great example. Once again, I think it's very important for people to trust a government in a general sense, but also to have a healthy amount of skepticism when it comes to implementing drastic public health measures. And I think this was one of those cases where something was decided when, to me, the evidence certainly wasn't clear for this idea of behavioral fatigue. Actually, it's not called behavioral fatigue; it's called media fatigue, or self-isolation fatigue. It comes from some studies showing that people can get bored, or anxious and depressed, about being isolated for too long a period, in other types of contexts. Now, if you do systematic reviews, and if you're a scientist, it's perhaps easy to see that a lot of these studies were low quality, with mixed evidence and no randomized controlled trials. So we shouldn't really be drawing any conclusions from that literature. I would say the idea may be real, but we just don't have enough evidence, which is different from saying that it's a bad strategy. I think the right thing to do is to say there is actually not sufficient evidence to base any public health measures on this concept.
We should think about timing; timing is very important. We want to make sure people comply over time and that we're not putting too much pressure on the population. That's all valid. But I was very critical, and not just me: I signed a letter, along with 600 other behavioral scientists in the UK, who all agreed that there is no scientific consensus on the extent to which behavioral fatigue is going to be a problem. In fact, we can turn to behavioral science for how to keep people motivated to comply with public health measures, to stay at home, and to distance themselves from other people for as long as possible, because there are a lot of interesting insights we can use to keep everyone cooperating, coordinating, and entertained. I think a much more useful use of the government's time would have been to implement early, and then look at what the evidence base says about how to get people to comply over time, rather than saying: here's this concept we found, the evidence for which is emerging and not really clear, and now we're going to base our whole strategy, in part at least, on this idea, potentially risking the lives of hundreds of thousands of people.
I'm not an epidemiologist, and lots of people have spoken about this herd immunity and cocooning strategy, which was highly contested and not in line with other countries or the World Health Organization. But speaking just to the behavioral concept, I think it was not sufficiently developed, and it shouldn't have formed the impetus of a public health strategy.
Gavin: I think we could all use some of that motivation that you mentioned right now. Dr. Sander van der Linden, thank you so much for talking with me today.
Thanks so much for listening to this special episode of The Lancet Voice. We'll be back very soon with another special episode about old age in the time of COVID-19: caring for older patients, and the particular topics of interest surrounding old age, as this pandemic disproportionately affects older people. So we hope to see you again for that one. You can subscribe to us on the podcast platform of your choosing, and you can email us at podcast@lancet.com with any feedback. Thanks so much for listening, and we'll see you again next time.