Smooth Brain Society
In an attempt to change the way information is presented, we’ll be speaking to researchers, experts, and all round wrinkly brained individuals, making them simplify what they have to say and in turn, hopefully, improving our understanding of a broad range of topics rooted in psychology. Join us as we try to develop ourselves, one brain fold at a time.
Instagram: @thesmoothbrainsociety
TikTok: @thesmoothbrainsociety
Youtube: @thesmoothbrainsociety
Facebook: @thesmoothbrainsociety
Threads: @thesmoothbrainsociety
X/twitter: @smoothbrainsoc
https://linktr.ee/thesmoothbrainsociety
#82. How to Build a Conspiracy Theory - Dr. John Kerr
Dr. John Kerr, Senior Research Fellow in the Department of Public Health at the University of Otago and Science Lead for the Public Health Communication Centre, discusses the psychological roots of belief and disbelief in scientific consensus. He shares his journey from neuroscience to science communication, exploring the complexities of conspiracy theories, misinformation, and the impact of AI on public perception. The discussion emphasizes the importance of understanding the psychological traits that influence belief systems and the need for intellectual humility in navigating these issues.
https://www.otago.ac.nz/wellington/departments/publichealth/staff/john-kerr
Support us and reach out!
https://smoothbrainsociety.com
https://www.patreon.com/SmoothBrainSociety
Instagram: @thesmoothbrainsociety
TikTok: @thesmoothbrainsociety
Twitter/X: @SmoothBrainSoc
Facebook: @thesmoothbrainsociety
Merch and all other links: Linktree
email: thesmoothbrainsociety@gmail.com
Hello, hello, hello. And today on the Smooth Brain Society, we're joined by John Kerr, a public health researcher at the University of Otago, whose work explores one deceptively simple question: why do some people accept scientific consensus and others really, really don't? John's research looks at the psychological roots of belief and disbelief, from political identity and attitudes to conspiracy thinking, misinformation, and how these ideas spread, particularly in Australia and New Zealand. He's studied everything from public reactions to GMOs and pest control, to why certain conspiracies take hold, and even how many people are just taking the piss in surveys. Today, we'll talk about how John ended up researching belief, conspiracies and misinformation; what actually makes something a conspiracy; why misinformation is so hard to define, let alone fix; and what all of this tells us about science, politics, and human psychology. John, welcome to the show.

Hello.

Thanks for coming on. We'll start where we start with everybody. As Beth mentioned in the introduction, we want to know about your journey. How does somebody come about researching these kinds of things, like conspiracy theories and belief systems?

I guess to go back to the start, when I first went to university, I was very interested in the brain. And there is a book that I read that led to that. I've watched some of your earlier guests and they have the exact same story, so you can probably guess what the book I read was: The Man Who Mistook His Wife for a Hat by Oliver Sacks. As a teenager it really gripped me. There were things I never knew were possible about human perception and experience, and it really drew me to trying to understand the brain.

I've got it right here as well.

There you go. Now, I should say there was another book.
So just to break away from tradition, there was also The Man Who Tasted Shapes by Richard Cytowic. So there was another neuroscience book, a popular science book, that I read. Just to differentiate me from every other person interested in psychology and neurology, there was more than one book.

So that led to me doing neuroscience at the University of Otago here in New Zealand. It was unusual because it was an undergraduate program, but it was a really great experience of so many different facets of science. I enjoyed it a lot and went on to do an honours project which involved cutting up rat brains. I spent eight hours a day at a microtome, which is kind of like working inside a fridge that slices rat brains into tiny thin slices, for hours on end. And I thought, actually, science is quite boring sometimes. I don't know if I want to be a scientist.

After doing that undergraduate program, I went away to find myself, went traveling, and ended up in London working for an academic publisher who really needed staff and took me on as a 20-year-old with just a bachelor's, an honours degree, to work on the editorial team. That was just a job I got while I was on my, what we call overseas experience, OE, in New Zealand. While I was there, I got involved with people working in the press team and was actually more excited about talking about science than actually doing science. That led me into a career in science communication. So I did a master's degree, came all the way back to my hometown of Dunedin, went back to the University of Otago. Not that I wanted to, but it was the only place that had a Master's of Science Communication in New Zealand. I spent two years there and, again, I really enjoyed the work I did around talking about science, sharing the amazing discoveries and things like that. I was a very passionate science communicator and really lucky
to get one of the very few science communication jobs in New Zealand after that. I worked at an organization called the Science Media Centre, which helps to promote scientific research to journalists and the media, and worked there for five years in a professional science communication role. I really enjoyed working there, but I wanted something different, and that's about when I started moving towards psychology. Working in this science media role had exposed me to a lot of the controversy around science. There were lots of climate change denial stories in the news at that time, stuff around vaccines, genetic modification, which was a really controversial issue here in New Zealand. I was really fascinated by where the division we saw over some of these issues came from, and ended up enrolling in a PhD at Victoria University of Wellington, in the capital of New Zealand, to study the psychology of what we would term science rejection, though there are a bunch of other names for it. And that's when my academic career started. So I looked at New Zealanders' beliefs around a lot of these scientific issues and some of the psychological underpinnings of that.

Then I finished my PhD and was in the situation where you need a job. So you're looking for jobs, and my partner is British, so I was looking potentially at jobs in the UK. I was very lucky that this job landed, well, I saw an advertisement for a job at the University of Cambridge where they wanted a psychology postdoc, but someone with experience in communications and media, which was this weird mix of career experience that I had. It was like the job ad was written for me. I was very lucky and got the job, moved to England, and worked at a place called the Winton Centre for Risk and Evidence Communication, from March 2020 is when I started.
So it was an interesting place to work in the absolute beginnings of a pandemic, where a lot of my job was understanding why people perceive things to be risky or not and how we talk about risk in public discourse. It was very unlucky that we ended up moving to England right at the start of the pandemic, but from a career perspective it was a really interesting time to be doing research on that, as you can imagine. I worked there for about two and a half years, and when that project wrapped up, there was an advertisement for a job back here in New Zealand, looking for someone who had a scientific research background looking at public health issues like vaccination, which I'd been studying, and who also had experience in communications and media. So again, one of these weird roles that just happened to come up at the right time. I moved back to New Zealand, and my current job is working as the science lead for the Public Health Communication Centre, which is a small unit tasked with promoting public health research and evidence. At the same time, I'm a senior research fellow in the Department of Public Health, and I carry on my research looking at people's attitudes and beliefs, particularly around public health issues now, but I'm still interested in all sorts of unusual belief systems or ideas that people have. Is that possibly too long or too short? I'm not sure.

No, that was perfect. That's so interesting that you went the science communication route and then went back into academia. It's sometimes almost the opposite of what people do. And then you've brought it all together.

Yeah, there are sort of two breeds when you think of science communication. There are people who go into science communication as a profession: I'm interested in science, I'm somebody who talks about science, but I'm not a scientist.
Then you get scientists who become interested in communication and become publicly visible scientists who go on the media and things. I'm unusual in that I've done both, and my research focuses on what I do practically, and practically I focus on communicating research. The short version I give is: I research communication and communicate research, both in equal measure at the moment.

Nice. So you've gone back into academia. I'll start with that, maybe a little bit more, because it's almost about public perception. What's the simplest way you explain your research to someone outside of academia? Because it seems you've got a couple of different parts.

Yeah, a great way into it is asking people what they think about vaccination or other public health measures. Usually, if they don't disagree themselves, they know someone who disagrees, or they've got a story about a relative who's a real ardent anti-vaxxer or who didn't want to wear masks during COVID, things like that. And I think that's a really interesting way into it, to say, well, why did you have that difference of opinion? People often think the other person just got the wrong bit of information or was duped in some way. But if you zoom out to a whole population level, you can see really clear patterns of who is more or less likely to engage in certain behaviors relating to scientific information, like, during the pandemic, wearing a mask or getting vaccinated, or to hold beliefs that don't match up with the scientific consensus. Certain psychological traits are related to those beliefs, and also some demographic ones and things related to people's political views. So that's the broad picture of what I'm looking at.
But as you mentioned, my interest goes beyond public health, to things like GMOs, 5G cell towers, and a lot of that leads into the really fringe beliefs which we would class as conspiracy theories. And I work with a really excellent collaboration of Australian and New Zealand scientists who are interested in conspiracy theories as well.

So you work with a lot of conspiracy theorists then.

Yeah, it's a hard title to try and pin down, because they're not theorizing conspiracies themselves, but theorizing about conspiracy theories.

Okay, so you're not actually going along with the people to Area 51, trying to find the aliens.

I'd love to do more direct research with groups of people who are advocates of conspiracy theories. Unfortunately, I don't have the time or the resources to carry out that sort of field research. But I'm always interested in people who do actually go into those communities and do qualitative research or report back. There's a real rich vein of information and news stories about people who went down the rabbit hole and people who came back out, and what led to that sort of road-to-Damascus moment. So I'm always fascinated by those stories.

What's your favorite, quote unquote, conspiracy theory that way?

I mean, I go with the classics, you know, the moon landing was faked. That one has persisted for over half a century. People have very strong views about it; they spend a lot of time thinking about it. So I like that one just for its longevity. But there's also a couple I find interesting. One is that LED light bulbs were deliberately developed to dim the minds of the population, so a sort of mind control measure. That wasn't actually a real conspiracy theory; it was one that was used by researchers.
And I've used it in some of my research as a sort of test of whether people will believe anything if you say it in a survey. But now I think there actually is a movement of people who think there are negative health effects of LEDs. I wonder if that's an example of a conspiracy theory that was created in the lab, so to speak, then escaped out into the real world, and now people actually believe it. So I always find that an interesting one. And there are a couple more, around Red Bull I think, that were designed by researchers as a sort of test conspiracy theory and have gained some actual traction in the population.

I guess the next obvious jumping-off point I have is: what makes something a conspiracy theory? In the sense that people could argue, "I just haven't proved it yet." The Tuskegee trials are one good example of it turning out to be true. More recently, you had all these conspiracies around a global elite ring of sex traffickers, and then you have Jeffrey Epstein. So is it a conspiracy theory, or is it just something that isn't proven true yet?

Yeah, that's a really challenging one. Both of the examples you give come up in the literature, where people talk about a degree of humility that researchers have to have, about the fact that they don't have a monopoly on the truth, right? I take one step back and ask, what is a conspiracy? There's a very clear definition that most people would use: a conspiracy is a group, so it's more than one person, doing something in secret, and there has to be a degree of malfeasance, some nefarious aim. So if you're organizing a surprise birthday party, it's not a conspiracy, unless you're planning to do something bad to the person, in which case it is. So there are these three factors: it's going to involve more than one person, it needs to be secret, and it's something bad.
So people talk about things like the Loch Ness Monster as a conspiracy theory, and it's like, well, that's an unproven claim, but it's not a conspiracy in the same way that, say, the moon landing being faked is, where you've got some people doing stuff in secret and trying to dupe the public. So that's a conspiracy. And you're right, there are lots of conspiracies that are true, you know, Watergate and everything like that. But where we start talking about conspiracy theories is where there just isn't enough evidence to warrant holding a strong belief that it's true. And that gets really murky, right? A really good example of this for me was early on in the pandemic, when researchers were testing different conspiracy theories around COVID-19, like COVID-19 vaccines containing microchips to track people and stuff like that. And one of them was that COVID-19 was a virus that escaped from a lab in Wuhan, because that was seen as not a very likely origin at the time. And then over time, you know, government reports speculated on that and said, well, we can't prove it; there isn't any conclusive evidence to say it definitely came from a wet market and pangolins or bats or something. With that uncertainty there, conspiracy theory researchers sort of had to dial back and go, wait, so it isn't totally unreasonable to believe that claim based on what we currently know. But there are other things which are just so implausible as to be impossible, you know, like flat earth theories, that the government is covering up the real nature of the structure of the planet or something, where there's just so much evidence that it becomes pretty easy to say, well, that is not a supported claim, or, a kind of jargon term I like, it's epistemically unwarranted: there's not enough actual evidence that people should believe in this.
You know, even things like aliens, there's a degree of plausibility there, where you say, well, we don't know what's out there in the universe, and if there are aliens coming and doing things, the government could be covering it up. It's incredibly unlikely. So you're now operating in areas of not true or false, but just very, very implausible conspiracy theories. And one of the things about a lot of conspiracy theory claims is that they would probably involve a large number of people, and for that many people to keep a secret is quite difficult. We know that from previous conspiracies that did come to light: only a handful of people were involved, and eventually someone let something slip or there was a whistleblower. So something like faking the moon landing would be very difficult to keep under wraps. You see a lot of stuff around 9/11, and again, some of the claims put forward by conspiracy theorists require hundreds of people to be involved in these cover-ups, and no one says anything or leaves any trace of hard evidence that would convince the average person. So yeah, I think there's a degree of humility that researchers need to have around what they define as conspiracy theories. But there's also this point we reach where we say, well, some of these beliefs are unwarranted; there's not really enough evidence to back them up based on scientific research. And some people don't put a lot of stock in scientific research, which is maybe part of one of the drivers of this. But you've got to draw a line somewhere. There is a line of thinking in more philosophical circles that some researchers in conspiracy theory research take a, you know, "I know it when I see it" approach; they use that as a sort of negative way of describing how we define them. It's just like, well, this is one and this isn't. So I would say it gets very murky with some; with others, it's pretty clear that they're not. But there are...
I mean, we can talk about it in a bit, but there are patterns you see around conspiracy theories that suggest it's a pattern of beliefs that usually goes beyond just one claim.

So I guess what we're moving on to is: how much of science rejection is about knowledge, and how much is about identity or values? And this might be linked to the attitude roots model, did I read somewhere?

Yeah, yeah. So I looked at multiple issues, particularly in my PhD research, which was a few years ago. I looked at a number of issues like climate change denial and anti-vaccine attitudes, but also a few other ones like water fluoridation, which some people feel very strongly about; there are varying degrees of anti-fluoridationist attitudes. And it depends on the topic, that's the first thing. If you look at something like belief in the theory of evolution, some people would say, I don't believe that; I reject all of the scientific evidence that supports it; I think that's not how humans came to be. And that is, unsurprisingly, linked with people's religious views. Not all religious people reject evolution, but you're more likely to if you're a religious person. So that's an example of where someone's deeper values and ideological beliefs have a really strong impact on their beliefs about a scientific claim. But there are others where perhaps a lack of knowledge or scientific awareness comes into play more. In my research, genetic modification was one where we saw a link between people's scientific literacy, so how well they understood some broad scientific concepts, and their views about genetic modification. So it depends on the particular issue you're looking at. A very consistent pattern across everything I've ever looked at is that people who report being less trusting of scientists in general are more likely to disagree with scientific claims.
So that is a very clear and unsurprising pattern really, but we can document it across multiple issues. Whereas just about every other issue has a unique fingerprint of underlying, what we would assume to be, drivers. So things like conspiracy thinking: people who believe in a lot of various conspiracy theories are also likely to disagree with some claims; vaccines is one where a bit of conspiracy thinking comes in. But there's also stuff like free market beliefs and political views. If you are really into free market economics, then you're probably less likely to agree that climate change is caused by humans. And that seems really weird if you just say it like that. On the one hand, you've got the movement of carbon dioxide molecules in the atmosphere, and on the other, people's views about the free hand of the market and economics. And you think, well, what could connect those two? Then you start seeing that actually a lot of what drives that is how science is impacting people's lives and society and how the world works. With climate change, we see a lot of regulation coming in, saying we need to tax carbon or we need to reduce our emissions in some way, so we're going to bring in these economic measures to try and get people to produce less carbon dioxide and greenhouse gases. And that's where it suddenly starts butting up against people's views around how society should be structured and how the economy should work. So that's something where it's perhaps more ideological, but also about values about the way the world should work, and economic beliefs, which some people feel very strongly about. So the answer is: it depends, and it depends on the issue. And all of these things, there's a relationship there, but they're not deterministic. It's not set in stone. Like I said, not every religious person is going to reject the theory of evolution.
And there are plenty of conspiracy theorists who would get vaccinated, even though they believe some other outlandish claims. So we're always talking in degrees; it's only when you zoom out and look at that population level that you can see those patterns, really.

Okay, so you said there's political, there's religious, economic as well, I guess. Somebody's not going to be very environmentally conscious if they have to stop, you know, if they're a big oil producer, or coal, or AI now, I guess, as well.

Yeah, that's a whole other kettle of fish, isn't it? But to go back to the attitude roots model that you mentioned, that was developed by Matthew Hornsey, an Australian researcher. It's really useful; he describes it as a trans-theoretical model, which just describes that you have a surface attitude, like climate change denial or anti-vaccine views, particularly in relation to scientific issues, and there are these underlying social and psychological drivers, or roots. So things that you mentioned: politics and ideology, trust in different groups, but also vested interests. Like, if I worked for a coal mining company, my job sort of depends on carbon-producing industries, so I've got a vested interest there, and I'll think about climate change in ways that probably make me feel a bit better about the work I'm doing and sort of justify it. There's a process for that called motivated reasoning, where we try to draw a conclusion that we like rather than one that's accurate. So that vested interests element comes into it a bit. And then there are some other smaller things, like phobias. So, you know, there's this...
I'm not sure to what extent this is a big part of anti-vaccination views, but people who are scared of needles might be less likely to get vaccinated, and maybe justify that by looking at arguments about why they don't need to get vaccinated: maybe vaccines are dangerous, or maybe the disease isn't something to worry about, so I don't need to be vaccinated against it. They might be drawn to those views because they're scared of needles. I think that's one of the smaller roots, but it's an interesting one, to move away from just talking about politics and ideological views.

Yeah. Are there other common psychological traits, or just general traits or conditions, that predict who's more likely to believe conspiracies?

Yes. There's been lots of research in the last ten years or so, a sort of explosion of people looking at this. There are things like collective narcissism, where you view your group as better than everyone else's, and then you start finding reasons why other groups might be doing better, like they're cheating or doing something secret. But there's a really nice framework in the social psychological literature of some key drivers of why people would believe in conspiracy theories, and there are three. The first is epistemological: it's about understanding, about trying to tell a story about the world. Conspiracy theories are really good at providing simple explanations for really complicated and big things that happen. You know, someone flies a plane into a tower in New York, or there's a sudden pandemic that causes everyone to stay at home. That's a huge change in people's lives. It's scary. And one of the ways people try to make sense of all of that is to ask, well, how did this happen? How could this happen?
And a conspiracy theory provides a very nice, simple explanation that some people will find reassuring, or that helps them to understand the world. So that's the first one. The second one is existential: it helps them deal with the threats they face in the world. Again, having a conspiracy theory out there might make people feel like they understand things and therefore feel safer, because they know what's going on. And the last one is social, or relational. Conspiracy theories tend to draw a community around them. So if you believe in a flat earth, you might hang out with other people and go to conferences and make friends, and suddenly that belief is tied in to a lot of other things in your life, right? If you were to stop believing that the earth was flat, suddenly your friends wouldn't be friends with you anymore. You might not be able to hang out with the people you wanted to; you might not get the support you were getting. So that can also be something that helps to sustain those beliefs. And indeed, if someone's lonely or doesn't feel like they have a community, and they start interacting online or in person with someone or a group, then suddenly that's fulfilling a social need for them. So that's a nice framework of three social and cognitive drivers of why people might believe in conspiracy theories; I quite like that one. And there's good evidence to show that if you look at those underlying things, there's usually an association there with believing in conspiracy theories.

It sounds almost like how cults come about, you know: people want their own narrative to explain things, other people who feel the same way, all coming together.
Yeah, and I imagine, I'm not an expert on cults by any measure, but I know that a number of cults invoke conspiracy theories as an explanation for why other people don't believe them, that there's someone out there trying to suppress the knowledge they hold. So there's definitely overlap there. And often they make claims that are not supported by wider evidence as well.

I mean, the last word you said, evidence, helps us segue nicely into the next thing I wanted to ask, which is a bit more about the misinformation aspect. Conspiracies, I guess, are formed based on a certain amount of information given to people, which, for want of a better word, is probably incorrect. But then, in terms of COVID vaccines, there was a lot of new research, new news being published every day, and I guess information changing each day could have also played a role in why people developed certain beliefs. So I was going to ask, basically: what do you consider misinformation? What kind of work do you do around it? Because I'm thinking this is sort of the root for a conspiracy theory to form.

Yeah. So, to start off with, I consider myself someone who studies misinformation in various ways. At the same time, and this is my hot take, I think misinformation is a bad term. It's become an unwieldy word to describe a whole bunch of different things happening in society. It can be a way of accusing people of lying or undermining someone else's argument, or it can mean a whole range of different things depending on the context you're talking about. Even academic researchers don't really agree on good, solid definitions for the term. So I think as we move forwards, there's going to be more nuance in the way we describe different kinds of information that could be incorrect or misleading. I think there'll be a better taxonomy, a better system of words to describe this.
But for now, people talk about misinformation and they can mean a lot of different things. Generally, it's information that's wrong. But when you look at misinformation research, a lot of it isn't necessarily wrong; it's misleading. A great way to set that up: there was a headline in a Chicago newspaper which said a doctor died two weeks after getting the COVID vaccine and the CDC is investigating why. That was the headline, in 2021, and in the quarter it was published, it was the most viewed news article on Facebook. It was the most shared. And it was absolutely true. As a headline, it was absolutely true. But it sort of implied that the vaccine had killed this doctor, and as it transpired, the CDC investigated and didn't think the vaccine was involved. A lot of people labeled it as misinformation because it was misleading people, but it was also technically true. So you get this complicated dance of language. And even, like we were talking about with conspiracy theories, what is true or not true becomes a very difficult question when you start getting down to things which are uncertain, or based on scientific research, which can't necessarily say conclusively that something is true. But with misinformation, generally you can start thinking about, well, people see stuff online, they hear things from other people, which might lead them to hold beliefs that go against the best available evidence. And that's the kind of ground truth we end up with, just saying the best available evidence, because things could change, more information could come to light, and, you know, suddenly it could turn out that it was true, but at the time we didn't really have enough information to say so. And misinformation, I'm talking in this big, broad umbrella term now, can be problematic, right?
So people can get information and do things that are harmful to themselves and others. So um a good example: during the pandemic, lots of people were taking all sorts of weird remedies or preventative medicines for COVID, right? So like herbal remedies, we saw a lot around ivermectin and things like that. um People drinking various sorts of cocktails of substances. And there's a really nice article that catalogued all the incidents around the world of people doing this and getting sick, poisoning themselves accidentally, because they thought they were taking a cure for COVID. They didn't do that spontaneously. They read it somewhere or they got a piece of information, and we'd see that as misinformation. And that's a really good concrete example of how bad information, incorrect information, can lead to harm. So yeah, that kind of sets a good argument for why we should think about what misinformation is and where it comes from, how it spreads and how to address it. And then you see things like anti-vaccine misinformation, which probably has some impact on people's views about vaccines. I'm always a little bit careful not to say that it's the root cause of all vaccine rejection or vaccine hesitancy, because there are lots of other factors that come into play. But certainly reading stuff, scary stuff about vaccines, isn't going to help. And for some people, it does push them further away from deciding to get vaccinated. We saw a lot of this particularly during the COVID-19 pandemic, but we also see it now with routine vaccinations for children. um So yeah, I guess that's my sort of overview of what I think about misinformation as a term and the difficulties around it. I think we should do something about it. I'd like to get better words that more narrowly define what it is we're talking about. A group here in New Zealand use the term harmful and inaccurate information, which is sort of saying it's inaccurate.
So it's not necessarily saying false, but that there's this potential for misleading conclusions to be drawn and that there's harm. Because there's loads of stuff out there that people believe that doesn't really impact their day-to-day lives or anyone else's. You know, like a classic science myth, right? We only use 10% of our brain. You could class that as misinformation, but we're not going to be going out there trying to correct that view with everyone. I mean, it's nice to be able to get people not to think that, but it's not an urgent issue that we need to deal with. Whereas working in public health, often there are things where actually, in the big picture, people's lives and wellbeing are at risk. And so I like to think that that's more where things focus. So talking about harmful information is also a good way of distinguishing that. And then there's this whole thing around intent, right? So we hear people talking about disinformation as misinformation that's produced deliberately to try and fool people. Often it's talked about in a geopolitical context, so there's a war going on and one side might be trying to spread disinformation to the other groups to try and create societal conflicts. That's a real problem. I think it's useful to think about it that way, but also it's very hard to prove intent sometimes, because we don't know if someone genuinely believes something or is just trying to stir up trouble or disrupt things by spreading disinformation. So there are these challenging kinds of factors around how we talk about it, which I just struggle with every day. You know, I wake up and I think, how am I going to define this today?
But I think as time goes on, we'll move towards a better way of talking about it, because it's becoming a very overused word. Like, lots of politicians use it in different contexts, and it's sort of got a bit of mission creep around what it actually means, or, if people are trying to address it, what they're trying to do to address it. So... Yeah. Fake news falls into that as well a little bit. I mean, fake news is exactly the precursor to this. It was actually a term that was being used by academics and then got co-opted by President Trump, and suddenly it became a totally different kind of word in terms of what everyday people were using it for. So it suddenly became just an easy way to say, oh, that's not true, fake news. Yeah, so maybe it's going to be always a game of cat and mouse where we're just trying to come up with more and more academic-y ways of saying things to try and differentiate it from the way that it's used in just general speech with everyday people. You need to make sure the words have multiple syllables so they're harder to pronounce. Fake news is very easy to co-opt. It needs to be like 12-syllable-long words. Yeah, that's why I like epistemically unwarranted, because it's just so saturated in academic language. I mean, a really nice way of thinking about it, that Kate Starbird, a researcher in the US, I think she's in the US, the way she talks about it is rumoring, which connects it, you know, takes it out of this whole social media environment and actually connects it to a long history throughout human society of people sharing information which could be false, but it's sort of useful and it gets spread around. So I quite like that long view of what it is we're talking about.
Do you think that, because I know what you're saying about kind of the news, and also politically this happens a lot as well, a great example with, you know, the doctor who had the vaccine and died two weeks later, that it's not wrong, but when you're putting those two next to each other, of course people are going to assume that they're connected. So do you think the government needs to kind of get more involved in this kind of, different types of misinformation, intent, not intent? Technically not misinformation, but you're misleading somebody into kind of viewing it that way. Do you think the news and the government need to take more accountability, or how can we stop it, you reckon? Yeah, that's really tricky, because as soon as you move into a regulatory space, that definition becomes so critical, right? It's like, lawyers love legal language because it creates precise rules. And it's very hard to come up with something that wouldn't be potentially abused by, say, an authoritarian sort of leader or government if they came into power. And, you know, even with vaccines, like there was this risk around social media where people were expressing concerns. And I think, you know, it's completely valid to have concerns about a newly developed vaccine, and people should be able to say, I'm not sure about this, or, you know, I heard something that, you know, maybe some people are getting sick with it. If you start saying, you're not allowed to say that, we're going to censor you, we're going to do something to prevent you from talking that way, then it's a really slippery slope. And so I think it's a very challenging one. And I don't think there's really a place for government necessarily to start making rules about what you can and can't say. I don't think censorship is the answer. I do think there's a couple of things. One is sort of helping people to become better consumers of information, to think more critically. And that is not easy, not easy at all.
So I put that out there, and it's like, yeah, great, do that, that's the solution. However, it's a very difficult thing to do. uh One opportunity is when people are at school, young people; you can give them some sort of media literacy or digital literacy sort of um instruction, helping them to think critically about the information they encounter and looking at other potential sources to try and triangulate whether something's a valid or credible claim. uh And so that's one thing. And the other one is sort of putting a little bit more pressure on social media companies, not to censor people, but to have some checks and balances in how information is promoted and spread. So the thing with a lot of conspiracy stuff is that it's interesting and it gets attention, and those algorithms then push it up, right? Because it's controversial and it's exciting. And that is the danger there, not that people are necessarily talking about it, but that these algorithms are then amplifying it so much because it's controversial and it's interesting. And then that risks spreading it much more quickly and more widely. So people talk about introducing friction into sharing systems. So just things like getting people to click a button saying, are you sure about this? before they share something can help, or adjusting those algorithms so that claims that suddenly seem like they're getting a lot of attention are sort of um checked, you know, who's endorsing it and is there a particular sort of... a lack of evidence behind it. And some of that comes down to fact checking, where actually maybe a person is involved in moderating that to some extent. Again, not telling someone, you can't say that, but simply just putting the brakes on how it is spread through a social media network. So those are two options there for dealing with it. But I definitely would be very opposed to any sort of rules about what people can or can't say.
We saw some of this during the pandemic; different countries uh were putting pressure on both individuals and social media platforms to suppress anti-vaccine um sort of information and views. And, you know, there was some fact checking going on and things got removed occasionally, but uh there was also the risk to people just having normal conversations about vaccines. So the AstraZeneca vaccine, um there was a known side effect, a very, very rare side effect. But people had a right to know, right? And it would have had an impact probably, to some degree, on their intentions to get a vaccine when they heard about it. But that doesn't mean that you should hide it and suppress it. I mean, then you really are in a conspiracy theory, right? Like that's, uh you're hiding some harm from the people. And it's very rare, and some people will misinterpret or overestimate the risk because they're not great with numbers. But I think we still need to be transparent about it. That was the most important thing, to not try and cover that information up. Even if there's a bit of misinformation. But yeah, I realized as I said that, it made me sound like, do we need to put... it's meant to be a bit of a devil's advocate almost, where it's like, should we stop free speech? Sorry. No, without thinking about it too hard, a regulatory option makes perfect sense. You'd say, we should just prevent people from saying stuff that can lead to public health harms. You know, and some of it is, now I'm doing podcast mode, some of it's just people making money out of saying controversial stuff online, right? Like, and that becomes a business model for them: I'm an anti-vaxxer, lots of people listen to me, so I'm always talking about this.
And so they've got a vested interest in spreading it, but it's still very hard to crack down on them and say, well, you're just not allowed to do that. Yeah, how does the old saying go, no attention is bad attention, or something like that? No publicity is bad publicity, that was kind of there but not quite. The other one is controversy sells, isn't it? Yeah. And that's another big thing about the social media companies, and potentially another regulatory option is to say, well, people can say what they want on these platforms, but if they're monetizing it and if they're getting, you know, through your platform, you know, YouTube pays you for the number of views you're getting, right? Or through advertising. And so you're monetizing people saying controversial things and that becomes their profession, like, I say crazy stuff online and because people buy into it, I'm getting money. And so they've got a motivation to keep doing that. uh So, you know, the platforms have to take some responsibility in creating that motivation and enabling people to keep talking about stuff, because it's become their livelihood. See, that's our issue, we're not controversial enough. Otherwise we'd have way more viewers. Do you want to sell your soul? Do you want to sell your soul? Let's see how desperate I get in this economy. We're good for now. But as someone who studies these kinds of beliefs and the communication around them, I do think if I ever wanted to switch teams, I'd make a very good conspiracy theorist, because, you know, you start thinking about what really drives people, you know, um how to get the most attention. And so, you know, I think, well, if I end up getting made redundant or something, maybe there's a career as a professional conspiracy theorist in my future. Yeah. Yeah. I'm learning here. Well, speaking of, then, what kind of people would you target?
Like, so far you've said... give us a guide on boosting our conspiracy cults, I see you. I think it's a great way of looking at the idea of it, right? It's like, actually, if you were going to start a conspiracy theory, where would you start? And I'd go, well, I'd pick a marginalized group, people who feel like they've been left out of society, one that's got a clearly defined group. So it could be an ethnic group, it could be something a bit more around their religious beliefs or where they see themselves geographically in society or anything like that. But one that is a very clear group, and then I'd find an out-group that they don't like, and I'd come up with reasons why they are the source of the troubles for this particular group. um And that would be a starting point, looking for marginalized communities, people who don't trust institutions, um which generally goes hand in hand with being at the lower levels of the socioeconomic ladder. So yeah, that would be my first starting point. And definitely anyone who says that I'm wrong... Sorry. I was gonna say, I feel like you're gonna have to come up with a book on how to build a conspiracy theory. Yeah, I think it's a really great way of thinking about the issues. Yeah, but I'd also make sure that anyone who says I'm wrong, you know, they're part of the conspiracy. So it's like a self-sealing belief system. Anytime there's any evidence that would suggest I'm wrong, that's part of it. It's all a strategy from the bad guys to try and keep us down. Yeah. Also, I need your money to help me fight them. That's really important. Yeah. Yeah. Well, I look forward to your TED talk on how to build a conspiracy theory in the future. Or the live action movie or something. The documentary about it later. Yeah, and interestingly, there is a good example of that with the Birds Aren't Real guys, right? So they, as a satire, as parody, created a conspiracy theory.
It was just so farcical um that, you know... I don't know how many people actually believe it, but it was a really nice case study of sort of building a narrative from the ground up around a conspiracy theory. Well, building a narrative from the ground up, you mentioned a lot of examples which researchers used, which turned into actual sort of conspiracy theories, right? Or conspiracies, not theories, but like, yeah. What was the one you had given at the start of the episode as one of the examples, which was put in a psychological survey? LED light bulbs being deliberately designed or used by governments to sort of dim the minds of the population. There's a nice sort of lighting pun in there as well, dimming the minds through light bulbs. Does where the misinformation or conspiracy originated from matter? Is that a factor in any way or no, not really? I mean, I guess you can think about it in different levels. Where the information comes from to an individual, like who's saying it, uh that does matter. So people you trust, you're more likely to believe, and people you don't trust, you're less likely to believe. So if you are less trusting of scientists, then that's going to come into play when scientists are telling you, you know, that conspiracy theory is false. uh But then also, I guess maybe another way that you could answer that question is sort of who is doing the conspiracy. uh And that's a big one. Often the government, right? And that ties into people's lack of trust in institutions. They may feel hard done by by society and they see the government as sort of doing that. And, you know, governments have done some shady stuff in the past. So it's not totally unreasonable to be suspicious of them. I think it's probably good to be a bit skeptical about everything the government says. But then, you know, eventually that might reach a point where you're attributing things to the government which are quite outlandish, right, or very difficult to actually cover up.
But there are other things. I mean, shadowy new world orders is a good one, a good source of, like, the bad guys. And that's very easy because they're hidden and you can't, you know, if you can't see them, that's all part of it, right? Like the secret and hidden-away group that are controlling everything. And again, there's an element of truth to that. There are world bankers and financial systems which do have huge power over the world. They just operate in much more boring terms, around currency conversions and things like that, world bankers, rather than, you know, setting up elaborate conspiracies to influence people through microchips and vaccines or something like that. But there you go, there is a sort of kernel of truth to some of this, you know. um I think about a really classic one. We had the Christchurch earthquakes in um New Zealand in 2010, 2011, uh really big earthquakes that damaged a major city in New Zealand. And one of the conspiracy theories around that was that they were caused by the US military using this uh atmospheric research program called HAARP. And just about anything bad, any natural disaster in the world, someone attributes it to the HAARP project, which is like the High Altitude Auroral Research Program, something like that. And so there's this kind of idea of like, well, you know, it's crazy that people could just create earthquakes. But then you look at something like fracking, and fracking is known to trigger small-scale earthquakes. So actually, there are situations where people have created earthquakes deliberately, or, you know, not necessarily deliberately, but through human action. And so suddenly you're like, okay, well, you know, there is an element of truth that people can do that. It's very unlikely that it happened in this situation for a whole bunch of reasons. But there is an element there where you have to say, well, okay, technically people could um trigger an earthquake through mechanical or technical means.
um I was involved in a really good study that looked at earthquake misinformation where this whole issue came up. We were asking experts about these, what we thought were misinformed claims. And they always kept coming back with, well, technically you might find actually that humans can cause earthquakes under certain circumstances. So there was all this caveat-adding that they did, which really made me reflect on, you know, there is an element of truth to some of these claims, depending on how you interpret them. I mean, another similar example would be, I don't know if there are any conspiracies around it, but cloud seeding. So particularly in the Middle East, they do a lot of cloud seeding, because it's a desert area, to get some water. And I think, was it last year, it actually caused floods in a few major cities around the UAE? Not major floods, I think, but it comes to the point that, if done correctly or in certain areas, you could, in theory, generate floods or storms if you wanted to. Yeah, yeah. Sounds like it'd be very difficult to do reliably, but yeah, that's a possibility. And yeah, people get worried about contrails, right? That's another conspiracy theory. The trails left by planes are actually a release of chemicals, and you say, well, no, that's just physics and water vapor from planes. But there are situations where planes have dropped chemicals on people, Agent Orange and things like that in Vietnam. So again, there is a possibility that that happened. um And so you can't say, well, that's just never, ever going to happen. But the context in which you're talking about it is very unlikely. That's where you end up with your mindfulness of a plane. Yeah. It's almost like a little bit of truth, isn't it, that's feeding it. um Even things like the microchips and vaccines, right? First of all, Bill Gates pushing vaccines. What does Bill Gates do? He makes computers. He does nothing, he has nothing to do with vaccines.
Why is this computer guy suddenly involved in vaccines, right? If you're thinking about it uncritically. And then there were lots of instances of people talking about implants and having things in you. And so suddenly you're like, well, I don't trust Bill Gates. He's a computer guy. He's suddenly pushing vaccines all over the world. And you start drawing the red wool between different post-it notes and suddenly it makes sense. And so there's always this underlying kind of basis of logic or reasoning that helps hold it all together. Yeah, no worries. I guess it's six degrees of separation from everyone in the world, apparently. So you can make anything work if you want it to. m Yeah, I think the only thing I've had with the microchips one is, you already have your phones, you're already being tracked. I don't see the additional need at this point. Yeah, they've already done it, and we pay handsomely for the privilege of being tracked as well. So no need to implant people. Do you think that distrust in science is increasing or just becoming more visible? Great question, a tough one. ah Depending on who you ask, really, right? So I think one thing, during the pandemic, generally, we saw trust increase because people were scared. We talk a lot about misinformation and anti-vaccine attitudes, but the vast majority of people are trusting of scientists. And in a situation where something scary is happening and we're relying on scientists to tell us what's going on, I think, you know, it was unsurprising that in the early stages of the pandemic there was an increase in trust. In New Zealand in particular, where that data has been sort of collected over time in longitudinal studies, I can say that sort of came back down to baseline around 2023, 2024, and it hasn't really shifted. So generally, in the context that I have the best sort of insight into, which is locally here in New Zealand, it hasn't really increased or decreased.
In the US, there are sort of claims that anti-science attitudes, where they've been tracking it much longer, have generally declined over the last sort of 50, 60 years. I'm not sure to what degree that is. And the really hard thing about that is the way people think about what science is, when you ask those kinds of questions, could have easily changed over a couple of generations. So the way that science was presented to people back in the 1950s was, like, nuclear age science, better living through chemistry. You know, science was much more about just the benefits of science. And now, you know, research is very different. There are a lot of people talking about the risks and harms of new technologies. And so when we ask people what they think about science, they might be thinking of different stuff. So over that long term, it's quite hard to actually definitively say. People worry about it a lot more now though, right? We definitely see people worrying about the post-truth era and a loss of trust in the experts. There's a famous quote from Michael Gove which was sort of a watershed moment. So Michael Gove, the UK politician, saying, you know, the people have had enough of experts, and that has become like a very defining quote around sort of anti-science concern, where they say that it's a reflection of the sort of post-truth zeitgeist where people don't buy into science anymore. If there has been a decline, it's been small. Generally, most people around the world, and there was a big multinational study that looked at science attitudes around the globe, most people are trusting of scientists. Some countries, perhaps a bit less so. But, you know, on the whole, we do see science as one of the best ways of explaining our world. When people who are experts on something talk about it, we're probably more likely to trust them than politicians or people on social media. So...
I think, yeah, I wouldn't want to overstate any decline of trust in science, if there is one, but it's possible that it has changed over time. And the hangover of the pandemic may have some influence on that, where uh that trust went up and then people became more frustrated with things like lockdowns and sort of saw science as playing a role in having an impact on people's lives in a really tangible way. And that might have pushed some people to think, well, uh I'm less keen on science now because I had to stay at home or I had to get vaccinated. In some instances, here in New Zealand, there were vaccine mandates, so certain people in certain jobs, if they didn't get vaccinated against COVID, were not able to work. That's a really big way that science has impacted your life, right? And so that can influence some people's views as well. I guess, I mean, just for the New Zealand example, I could counter it. I don't know which other countries had mandates for certain specific roles, but the argument there would be that's not the science, that's politicians making the decision to mandate certain things based on the evidence given, being like, this is how helpful it is. The politicians are the ones making the decisions. But if you look at the discourse here in New Zealand, and particularly the UK, there's a lot of putting all the blame on science, right? So they say, we're following the science. They're not saying, we've weighed up all the socioeconomic costs of this. They say, we're following the science, we need a lockdown. And so that does place a lot of the blame, if you're thinking in those terms, on scientists, in terms of what people are hearing publicly. Saying that the scientists are doing this, and the scientists are overreacting, or they've got their own interests that they're trying to pursue. So yeah, it's a tricky one, and I think politicians definitely get the blame as well. So we saw some backlash against politicians around the world for various lockdown measures, but also scientists.
You mentioned the famous quote, of course, we've had enough of experts. But have you, this is off topic kinda, have you ever seen It's Always Sunny in Philadelphia? Or have you heard of the show? Yeah, not every single episode, but I'm a fan. But yeah, there's this one episode where they're debating trust in science, and it's like, science is a liar... sometimes. And the examples he gives are like, what was the one... Galileo, people thought, here's the smartest person in the world. He incorrectly thought that comets were optical illusions. And he said the moon could not cause tides, making him an idiot. And then he goes down and he's like, Isaac Newton, despite his intelligence, believed that he could turn lead into gold or something and died drinking mercury. So he's an idiot. That's the kind of thing it just reminded me of. Yeah, I mean, like, I do think, you know, when I was talking about going into science communication, I was coming out of my undergrad and I was really fascinated by it. I was really like a science fanboy, you know, just reading New Scientist and thinking science was like awesome. And I still think it's great. But as an institution, as a group, like, scientists are people, and I think I've got a much more humble view of scientific research and what it can and can't do, having read some of the more critical research around how science operates in society and the way that science influences politics and things like that. I've got a much more balanced view. So I recognize the limitations of science and scientists. And definitely, the argument of, you should believe someone because they're a scientist, you know, the appeal to authority, is um you know, pretty dangerous, because there are lots of scientists with very, you know, out-there, incorrect and potentially harmful ideas.
You know, we saw lots of professors, PhD-qualified researchers, during the pandemic saying things that weren't supported by the evidence and sort of, you know, claiming that they were the Galileo of their time, that they were, you know, right about everything and they'd been suppressed. um But it's tough for people who are not, you know, well-versed in scientific information and understanding data and evidence, because what do you rely on as a guide for who to trust? And often expertise is held up, saying, this person has studied this, they've gone to Harvard and Oxford, and they've done all these things, and they've got all the credentials, so we should listen to them. But, you know, that's, again, an appeal to authority, really, at that point, and actually people can't judge the evidence for themselves because they don't necessarily have the skills. So it's a really tough one, because it's easy to say, well, you shouldn't just believe what people say because they've got a bunch of letters after their name. But at the same time, we shouldn't expect everyone to be a scientist. So yeah, there are a lot of different challenges that come up with that. And that's what the whole field of science communication tries to address to some degree, but it's challenging. So what do you think is the question about belief or misinformation we should be asking next? What do you think is the next big question you want to answer? I mean, that's an easy one now. If you'd asked me two years ago, I would have had to say, I'm not sure. But now it's easy: AI. Like, it's just changing everything. And the potential for AI to lead people to incorrect conclusions is huge. I mean, a little bit because it hallucinates and it makes mistakes sometimes. But, I mean, I'm pretty impressed with the reliability of the information it provides. um I ask it about my own research. No one knows that better than me, and it seems to have a good handle on it.
So I'm like, okay. But there is the potential for it to be abused. So you can get AI to convince people to believe in conspiracy theories. um And there's a very good study that just came out recently which did exactly that, um where you can use those large language models to do all sorts of things at scale. So you used to have a bunch of people in a Russian troll farm who were trying to disrupt American society by uh spreading disinformation about certain groups or trying to create protests or something like that. Now you don't need a bunch of people at keyboards. You can have AI agents doing that and targeting it to each individual person. You can really, uh based on a few simple, easy measures that you can get from social media and metadata and things like that, knowing someone's age, a few things about their interests, you can really target them with the exact kind of information that's most likely to change their mind about something. And that's terrifying, right? Like, it used to be people would put ads or something on Facebook and hope that, you know, they hit the right groups, but it was sort of hit and miss. But with AI, every single message can be tailored to an individual person, to get them to vote for a particular candidate or to believe in a certain conspiracy theory. And of course, not everyone's going to fall for it, but it's going to be way more effective than a one-size-fits-all approach. And I find that a bit scary. But it is a double-edged sword. So the group that put out that research showing you can use AI to convince people of conspiracy theories, previously they'd done a study showing you can use AI to disabuse people of conspiracy beliefs, like to convince them out of it. So there is a benefit there, that it could be a very effective fact checker for people as well. It's just a challenge of who has access to that tool and how they're wielding it. But it's changing so fast as well.
Changing, like, you know... I think about the information that people get and their beliefs about scientific issues; that's my interest. But it's changing everything, from how people work onwards. Suddenly, you know, employment will probably become a major issue of the next decade, and that's going to disrupt everything in crazy ways as well. So yeah, easy answer. What's the next big thing? It's going to be how AI influences people's beliefs about science.

So your £2 billion study is investigating AI.

Yeah, I mean, I guess I just want to know the future, right? What's going to happen? That's the scariest part about it.

Yeah, as you said, everything's happening so quickly, changing constantly. And it's not just, as you were saying, who's wielding the AI tool, but also who can change the code, how easy that is, and what information a particular AI model is running on, which makes a big difference. I vaguely remember having this conversation at a science communication conference, where there was a panel debating sort of the limits of AI, and one panellist was like, the sky's the limit, and the other one was like, no, we really need regulations on what data we feed into it. I don't know if you have any opinions on where you stand on regulating how AI models are built and things like that.

Yeah, I'm no expert on the inner workings of AI models. I don't know if anyone is; it's a bit of a black box. But I would say that it's gone way too fast without proper guardrails, without people making sure that it's safe. Even if we assume that everyone has the best of intentions and there are no bad actors out there, there are still all of these negative consequences that can occur, and people haven't adequately tried to block those. And now the genie's out of the bottle: there are a number of models out there, and you can't suppress the information on how to build one.
The risk that people will use them for nefarious purposes is pretty high, right? Once you have access to something like that and you want to change people's views, or you want to use it to disrupt an economy, the opportunity is there. So yeah, it's pretty scary. But to answer your question, I think there has to be some degree of regulation and holding these companies back. I'm not saying just get rid of AI, but forcing them to demonstrate safety and security better. That is not my area of expertise, though; I'd say that's my uninformed view as a member of the public more than as a researcher.

So we'll move on to a bit of a more niche topic, shall we say. We have to talk about the genetically modified warrior raccoons. What was that about, and what did it reveal?

Yeah, so earlier I talked about people believing in conspiracies like the LED conspiracy, which was made up for the purpose of a study. And it sort of gets to the heart of why people say they believe in conspiracy theories. Maybe they don't really hold that belief, but they just endorse it in a survey. Maybe they think, that seems likely, you know, the government does shady stuff, so I'm going to agree with that claim. But there's also an element of people just trolling that you kind of have to recognize is happening. A lot of these surveys happen online with anonymous participants recruited through a platform. So there is an element of trying to keep people sincere in these surveys, but there's always a risk that people just give silly answers because they think it's funny. As a way of trying to measure some of that, a researcher had previously made the point that there's always a degree of what people say in surveys that you probably shouldn't believe. And he took the example of the number of people who say they believe that the world is secretly run by lizard men, which is a classic conspiracy.
Around about 4% of people in surveys would say, yeah, I think that's true. But if you scale that up to the entire US population, that's hundreds of thousands of people. And so, you know, that probably doesn't reflect reality, and some of those people aren't being truthful. So a researcher in Australia, Rob Ross, did a study that I was involved in a replication and extension of, led by Matt Williams, a researcher at Massey University in New Zealand, so shout out to him. Basically, in the survey they asked questions about conspiracy theories, but they included one conspiracy theory that was so outlandish it would be very unlikely that anyone actually believed it. I've got it written down because it's quite long. So this is the claim: the Canadian Armed Forces have been secretly developing an army of genetically engineered, super-intelligent, giant raccoons to invade nearby countries. And if someone said they believe that, you just have to think, I don't know if they really do. It's so outlandish, so farcical, that they're probably just having a laugh, right? So they used this question in the survey as a way of capturing the element of trolling, of people being insincere. And some people, maybe five, six, seven percent depending on the sample, do say they believe it. You could go, look, there are hundreds of thousands of people, in Australia in this case, who believe in this crazy conspiracy. So one point is, don't extrapolate from this kind of research to a population level, because it's probably not accurate. But also, if you account for that, if you take those people out of the survey and say, look, I think these are insincere responders, what happens to the relationships that you're trying to study in actual research about conspiracy theories? There are a lot of what we think of as common patterns.
So one is that people who believe in one conspiracy theory are typically more likely to believe in others. If you take the insincere responders out of your sample, do you still see that pattern? Or is it actually all just down to people taking the piss and clicking yes to everything? The fortunate thing for people like me who study this is that, no, you still see some of those classic patterns you'd expect; they're just a little bit diminished. So it means we don't have to throw out all of the conspiracy theory research because there are some trolls in our surveys. But it does mean we have to be careful about making assumptions about the absolute level of belief in these things, because we know that a small number of people in these surveys are actually just taking the piss. So we have to be mindful of that and account for it when we talk about the magnitude or prevalence of some of these beliefs. Fortunately, it doesn't impact some of the more theoretical questions about why people endorse conspiracy theories or not. Those patterns still hold even when you account for the insincere giant-raccoon believers who are in there. Another way of doing this is asking people at the end of the survey, did you answer all of the questions sincerely? And some people, again around that five, six, seven percent mark, will say, no, actually I lied on some. So they're honest, yet insincere, responders as well. The research that I was involved in looked at both of those groups, together and separately, and at what happens when you take them out: do you get different patterns? And fortunately not. The patterns are still there, but the absolute number of people who believe in a conspiracy does go down a bit. Yeah, so that's a funny kind of conspiracy theory that also has an important purpose in trying to ground-truth what our surveys are telling us.
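The logic of that robustness check can be sketched in a few lines of code. This is a toy simulation with entirely invented data: the item names (`item_a`, `item_b`, `bogus_item`), the 7% troll rate, and all the numbers are made up for illustration and do not come from the actual studies.

```python
import random

random.seed(42)

# Toy simulation of the robustness check described above. Invented data
# only: "bogus_item" stands in for the absurd raccoon question.
def simulate_respondent():
    if random.random() < 0.07:  # assume ~7% of respondents are trolling
        # Trolls click "yes to everything": one high answer copied
        # across all items, including the deliberately absurd one.
        v = random.randint(3, 5)
        return {"item_a": v, "item_b": v, "bogus_item": v}
    # Sincere respondents share a latent tendency, so the two real
    # conspiracy items correlate, but nobody endorses the bogus item.
    tendency = random.gauss(2.5, 1.0)

    def answer():  # noisy 1-5 Likert response around the latent tendency
        return min(5, max(1, round(tendency + random.gauss(0, 0.7))))

    return {"item_a": answer(), "item_b": answer(), "bogus_item": 1}

def correlation(rows):
    """Pearson correlation between the two real conspiracy items."""
    xs = [r["item_a"] for r in rows]
    ys = [r["item_b"] for r in rows]
    n = len(rows)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

sample = [simulate_respondent() for _ in range(1000)]

# Flag anyone who at least somewhat endorses the bogus item as insincere
# and drop them, then ask: does the classic pattern survive?
sincere = [r for r in sample if r["bogus_item"] < 3]

r_all = correlation(sample)
r_sincere = correlation(sincere)
print(f"correlation, full sample:     {r_all:.2f}")
print(f"correlation, trolls removed:  {r_sincere:.2f}")
```

In this setup, trolls who answer consistently high tend to inflate the raw correlation, so filtering them out diminishes it a little without making it disappear, mirroring the pattern John describes in the real research.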
You say it now, but then when Canada goes to war and raccoons show up... I mean, there was all of this discussion about Canada getting invaded. And then I heard there was a story about raccoons crossing borders or something like that because of climate change. And I was like, well, this is getting pretty close. Now someone's just got to genetically modify them to be super intelligent, and then suddenly it's not a conspiracy theory anymore. But just out of curiosity, how many people fill in these surveys? How many people are we talking about here, answering surveys with questions like this?

For a population-level survey like the ones we're doing there, it's around about a thousand people, and that gives you what we call statistical power to look at small relationships between variables. So you might measure someone's political ideology and how much they endorse conspiracy theories, and if you want to see whether there's a correlation there, you really need a big number of people to detect it if it's a small correlation. So about a thousand is sort of typical in the field now. A few years ago that would have been crazy, when you were, like, mailing out surveys to people. But now there's an industry, driven more by market research and that sort of area, that recruits people to participate in surveys. So there's actually a good mechanism for researchers to get everyday people to fill in surveys. There is always a question of how representative these groups are. In the work I do, we make sure that the demographic spread of people reflects New Zealand or Australia or the country where you're doing the research. So you try and make it as representative as you can, but it's difficult to make it a perfect microcosm snapshot of the country. You try and at least balance age and gender, and maybe the region people are from, and income if you can do that.
And then you can also use weights after you've collected people: you statistically adjust how much big or small groups count, to try and make the sample more representative of the population. But there's always a question there, and to come back to what I talked about before, a big challenge now is AI respondents. These panels recruit participants who get a small amount of money for doing these surveys. And if someone can go, well, I'm just going to get an AI bot to fill it in for me and collect the money, suddenly you've got loads of responses to your survey which are just AI agents clicking buttons. So that's a big risk for that whole area of online survey research: how do we make sure that people aren't using AI bots to fill in our surveys? There's a sort of cat-and-mouse game of different measures and mechanisms being put in place in surveys to try and trap bots, which AI agents can now get around. You know, they can look at a swirly picture of letters and figure out what the letters are. So yeah, it's an arms race now to try and keep humans in survey research and bots out.

That's not the first time I've heard that. I've heard quite a few people who do surveys saying this has become a real problem.

It's frustrating.

And so what you're telling me is about 50 to 70 people said that they believed in genetically modified warrior raccoons?

Yeah. I don't have the exact numbers there, but it was around the five percent mark, I think.

That is, I would say, quite a high level of trolling.

If you pause for a moment, I can actually give you the exact number, because I think I had it open last night on my computer. So in terms of the magnitude of this: in a survey of about 800 people, 7% or so endorsed the raccoon theory. They said that they at least somewhat believed that the Canadian Armed Forces were creating these genetically modified raccoons.
And then about 8% of people said that they were being insincere. If you combine those two groups, with a bit of overlap, it was about 13% of people who were in some way insincere, or who we believe were insincere, in this survey, which is a big chunk: about 100 people in the survey either said that they weren't totally honest or endorsed something that suggested they weren't being totally honest. So that does represent a challenge for survey research; you have to think about those people when they're filling in the questions you're asking them.

Hmm. This is very good information, because I'm building a survey at the moment, so it's good to keep in mind.

Yeah, and that's alongside attention checks, where you remove people who are answering questions incorrectly in ways that suggest they weren't paying attention. So you have little trick questions where you say, for this question, please select strongly disagree. And if they don't click strongly disagree, you can assume that they're either not really paying attention or not really engaged in the survey, and it's a way of removing some of those participants. But even after you remove those people, you still have this insincere group left, which is a challenge. So you can try and include questions which allow you to control for that when analyzing your results.

Very interesting. Yeah, I think in the interest of time, Beth, do you have any last questions to ask?

No, that was perfect. That was all good.

Brilliant. I guess one last thing from me then: if you had any bit of advice to give to our listeners, or to us, before you leave, what would it be?

I think, yeah, always engage in a bit of intellectual humility. That would be my all-purpose piece of advice. I always try and check my own biases and think, oh, this makes perfect sense to me, but is that because I'm me and I'm missing something?
So it's always good to question your assumptions and not assume that you know everything. That's my approach to thinking about research, and everything else in life as well.

Awesome. No, brilliant. Well, thank you so much, John, for coming on. I know you've had an early morning as well, all the way from New Zealand. But thank you so much. Thank you, everybody, for listening. And I guess until next time, take care.