Think It Through: the Clearer Thinking Podcast
Episode 43: Agnotology--the Study of Ignorance and Manufactured Doubt
In this episode, April explains that ignorance and stupidity are not the same thing. But when doubt is deliberately created as a strategic ploy by powerful entities, the ignorance that results can be not only stupid, but dangerous.
Episode 43 Show Notes
Definition of "epistemology": https://www.britannica.com/topic/epistemology
Good explanation of agnotology by Dr Mark Crislip: https://sciencebasedmedicine.org/agnotology-the-study-of-ignorance/
Dr Robert Proctor on Alie Ward: https://www.alieward.com/ologies/agnotology
The post-truth world: https://theconversation.com/scientists-have-a-word-for-studying-the-post-truth-world-agnotology-71542
Medium article: https://medium.com/@porlando_84392/agnotology-the-study-of-ignorance-351ba8ae432d
Find out about, and buy, Merchants of Doubt here: https://www.merchantsofdoubt.org/
Great article on "The Science of Spin": https://pmc.ncbi.nlm.nih.gov/articles/PMC7996119/
Matthew Facciani's website: https://www.matthewfacciani.com/
Agnotology explains why social media platforms don’t investigate the negative impacts of their messaging: https://www.cigionline.org/articles/social-media-platforms-and-upside-ignorance/
Changes to social media sites involving previously banned content: https://www.socialmediatoday.com/news/everything-to-know-about-meta-political-content-update/737123/
Yes, scientists do agree on climate change: https://science.nasa.gov/climate-change/faq/do-scientists-agree-on-climate-change/
Jake Scott's Senate testimony: https://www.hsgac.senate.gov/wp-content/uploads/Scott-Testimony.pdf
Here are a few good sources I didn't have time to discuss:
How scientists deal with what they don’t know https://onlinelibrary.wiley.com/doi/full/10.1002/evan.21303?msockid=1e22d990d6c764bc2523cc64d77d65db
How agnotology might work to fight disinformation https://www.wired.com/story/agnotology-misinformation-opinion/
Agnotology and political rhetoric https://www.aaronhuertas.com/invisible-messaging-the-power-of-agnotology-in-political-rhetoric/
Podcast Episode Outline
Episode Title: Agnotology: The Study of Ignorance and Manufactured Doubt
Opening
Hi there, and welcome to episode 43 of Think It Through. Let’s start with a question--what does the word “ignorance” mean to you? If someone is ignorant, does that mean they’re foolish, ridiculous, or just plain stupid? While we may sometimes use the word ignorant to mean those things, the dictionary definition of the word is “the condition of being uneducated, unaware, or uninformed.” According to Vocabulary.com, “Ignorance is not a synonym for stupidity, since its meaning is closer to ‘being uninformed’ than ‘being unintelligent.’ Ignorance implies the need to be educated on a particular subject.” So it’s more of a lack of awareness about something, and has nothing to do with someone’s IQ. Since I’m going to be using the word quite a bit in this episode, I just wanted you to understand its true meaning, so you realize I’m not calling anyone dumb. We are all ignorant about a great many things, but that doesn’t make us stupid.
- Now, here’s another question, and this one takes the word in a slightly different direction, which is critical to our discussion today: “What if ignorance isn’t just the absence of knowledge, but is also something that’s actively created? What if some of the things we ‘don’t know’ have been carefully designed in a way that keeps us not knowing?”
- That sounds a little—conspiratorial, and in some sense that’s exactly what it is. And that leads me to today’s topic—a word that you probably haven’t heard before—“agnotology.” We’ll find out what agnotology is, what researchers have found out while doing research in this area of study, and how we can use that knowledge to become more aware and less ignorant about things that really matter. Okay, let’s get started.
(Music)
So, what is this Agnotology thing anyway? Well, before I tell you, let’s talk first about its opposite—epistemology. According to Britannica.com, this term comes from the Greek word for knowledge, and it is the branch of philosophy that studies knowledge: how we know what we know. This field of study has been around for a couple of millennia—in fact, it’s one of the four main branches of philosophy. From Aristotle and Plato through the Enlightenment and into today’s contemporary philosophers, all the great thinkers have contributed to this field. Understanding the nature, origins, and limits of our knowledge is certainly a critical part of understanding ourselves.
So epistemology is the study of knowledge; agnotology, by contrast, is the study of ignorance. The root of the word is the Greek agnosis, which means “not knowing” or “unknown.” Unlike epistemology, which has been around for centuries, the word agnotology has only been around since 1992, when Dr. Robert Proctor, a history of science professor at Stanford University, was looking for a term to describe this new field of study; he was interested in what we don’t know, and even more importantly, why we don’t know it.
Now, there are some obvious reasons why we might not know something-- we may simply have never heard of whatever it is, maybe don’t even know it exists, or have never been in a situation where we are presented with information about it. Dr Mark Crislip, in his blog on this topic on the Science-based Medicine website, says this is what Proctor refers to as “ignorance as a native state,” and it basically means accidental ignorance due to lack of resources or education on a topic. Then there’s another kind of ignorance that Proctor refers to as “ignorance of lost realm,” which has to do with things we select or choose to remain ignorant about. Most of us know a lot about a few things, and maybe a little bit about lots more things. Some of those things don’t really matter to us, so we don’t really try to learn more about them, or maybe they do matter but we don’t have the time or energy to become more knowledgeable about them, and we believe that not knowing about them isn’t going to impact us all that much. For instance, I don’t really know much about how an automobile engine works, and while it might be nice to have that knowledge it’s not going to impact me too badly if my car misbehaves, as I have a husband who knows more than I do about cars, and if it’s too much for him, we can take the car to a mechanic, who knows way more than either of us. While Robert Proctor recognized and defined those kinds of general ignorance, that’s not the kind of “not knowing” that interested him the most. He refers to this third kind as “ignorance by active construct,” which is a deliberate or culturally produced strategic ploy designed to keep people ignorant about something through manufactured uncertainty.
You might be wondering, what exactly does that mean? In a 2020 episode of the podcast “Ologies” (which I highly recommend, it’s a great podcast), host Alie Ward interviewed Dr. Robert Proctor, who described how he and a group of scientists at Harvard in the 1990s looked at how the tobacco industry had held back important studies that showed their products were negatively impacting the health of consumers. In his own words: “They knew that cigarettes cause cancer, and their whole goal was to create ignorance to stave off people learning the truth by creating doubt, by throwing up a smokescreen, by throwing sand in the gears.” He explained the clever way in which the tobacco industry emphasized uncertainty to create doubt. Back in 1958, they had formed an organization called the Tobacco Institute, a trade association whose entire reason for being was to put out “good news” about tobacco, throw doubt on scientific studies, and lobby congresspeople to derail any attempts to regulate tobacco sales. Proctor referred to it as “a giant misdirection campaign,” and this campaign included doctors who enthusiastically endorsed smoking. For example, some of them recommended menthol cigarettes for people who had breathing difficulties, and some physicians even said women should smoke while pregnant because it caused babies to be smaller and easier to deliver. They would also recommend that women smoke after giving birth, as it would help them lose the baby weight faster. I wish I was kidding about this, but unfortunately it did happen. The truth about the hazards of smoking didn’t really come out until two important 1993 reports—one from the Surgeon General about the devastating effects of secondhand smoke, and the other from the EPA, which declared cigarette smoke a Class A human carcinogen.
Finally, in 1998, the Tobacco Institute was dissolved as part of the government’s Tobacco Master Settlement Agreement, in which the top cigarette manufacturers agreed to stop most of their marketing practices and compensate states for the medical costs of smoking-related diseases. The general public now fully understood the truth about the dangers of smoking; however, the doubt manufactured by the tobacco industry in the latter half of the 20th century led to billions of dollars of profit for tobacco companies, at the cost of millions of lives.
Proctor has also looked at other industries, like chemical companies, the sugar industry, the lead industry. You name the industry, if it includes a product that might cause harm, there’s a trade association for it designed to cast doubt on that harm. And of course, this kind of strategic manipulation isn’t just about products—there’s lots of manufactured uncertainty out there about things like climate change, vaccines, land and water pollution, anything where the protection of public health and the environment might be contested for commercial, political, or ideological reasons.
According to French authors Janna Rose and Marcos Barros in their article about agnotology on The Conversation website, one of the most important aspects of this field of research is “revealing how people, usually powerful ones, use ignorance as a strategic tool to hide or divert attention from societal problems in which they have a vested interest.” As we’ve discovered, it’s largely corporations or other entities, including governments, that value profit and power over people’s physical, mental, or financial health. But sometimes even scientists can get involved in obscuring truth. The 2010 book “Merchants of Doubt” by Naomi Oreskes and Erik M. Conway chronicled the efforts of several prominent conservative scientists with connections to politics and industry to manipulate public perception of important scientific discoveries like the link between smoking and cancer, as well as human influence on climate change. They took part in well-funded and organized disinformation campaigns, and they were a big factor in why it took so long for the dangers of smoking to finally be accepted, and why so many people still believe that climate change is either not real or not impacted by humans, despite overwhelming evidence to the contrary.
(Music)
In a 2021 paper from the journal Environmental Health titled “The Science of Spin,” researchers uncovered 28 different tactics used by five different entities: the tobacco industry, the coal industry, the sugar industry, the agrochemical industry, and the Marshall Institute, a conservative think tank made up of scientists with links to conservative politicians. These tactics generated spin that was favorable to their industry while manufacturing doubt about any harm caused by them. Five of the tactics were used by all of the organizations, which led the researchers to believe those tactics were key features leading to this kind of public ignorance. Those five tactics are:
1. Attack the design of any study that supports the larger scientific consensus. They would look for any flaw, no matter how small, no matter if it was critical to the study’s outcome or not, and emphasize that flaw to say that the entire study was therefore flawed.
2. Gain support from influential individuals—politicians, industry icons, doctors, scientists, and journalists who are ideologically supportive and can influence the public.
3. Manipulate data by cherry-picking from legitimate studies or creating their own studies that are specifically designed to show the results they want (which is definitely NOT the scientific method).
4. Employ hyperbolic or absolutist language when describing legitimate scientific results, calling them “junk science” or “poor science.”
5. Influence government laws and regulations by gaining inappropriate proximity to regulatory bodies that make laws affecting these industries.
The researchers also found a lot of other tactics used by these industries, such as suppressing any information that incriminates them, finding alternative potential causes for a particular harm so they can point to those things as the culprit and deflect from their own responsibility, taking advantage of the general public’s scientific illiteracy, conducting targeted attacks on their opponents…the list goes on. I’ll post a link to this study so you can look at all of the ways there are to manufacture doubt.
This “manufactured uncertainty” is obviously extremely effective. There are several psychological reasons that people buy into manufactured doubt, including:
- Confirmation bias—which, as you all must know by now if you’ve listened to my podcast, is our tendency to pay much closer attention to information that confirms what we already believe to be the case, and that can make it easier for us to accept disinformation.
- Another reason is Cognitive load—which has to do with the amount of mental effort it takes to process information. If your brain perceives that it’s taking too much time and effort to process a message, accepting the information that leads to doubt may just feel easier than putting effort into investigating further.
- And then there’s Emotional appeal—messages that lead to doubt can evoke strong emotions, like fear and distrust, which can lead people to accept claims more easily.
- Then of course the manipulation of doubt can create a sense of community among people who buy into it, and that just reinforces the narrative and makes it harder for facts to break through. Matthew Facciani, who wrote the book “Misguided: Where Misinformation Starts, How it Spreads, and What We Can Do About It,” says that a big reason why someone might believe something that you can see is obviously false has to do with the need to maintain connections with their community. So it makes psychological sense for them to remain ignorant of the truth.
Social media companies like Facebook, YouTube, TikTok, Instagram and Telegram are a big reason why ignorance continues to flourish. Heidi Tworek, professor of history and public policy, wrote in a 2019 article titled “Social Media Platforms and the Upside of Ignorance” about a YouTube employee who discovered that alt-right videos (which are major sources of negative and potentially radicalizing mis- and disinformation) generated as much engagement as music, sports, and gaming videos, and were a huge factor in YouTube’s success. The findings were reported to the company, but nothing immediately happened—in fact, a YouTube executive said he didn’t recall seeing that information. However, after the January 6, 2021 attack on the Capitol, many far-right groups were banned from Facebook and YouTube, although they found workarounds like Telegram and Rumble to continue spreading their messages. But since the beginning of 2025, both Facebook and YouTube have relaxed their content moderation policies to allow content that would previously have been removed, and Meta, Facebook’s parent company, has gotten rid of fact-checking entirely. This allows mis- and disinformation to spread on these platforms with no guardrails. Tworek says that agnotology helps explain why social media platforms are more likely to accept these kinds of videos and posts: ignoring them is far more profitable than taking steps to regulate or remove them, because the companies have known for a long time that such content generates an enormous number of views AND an enormous amount of ad revenue. Better for these corporations to remain ignorant and continue to be unaccountable for the outcome.
The result of all this manufactured ignorance is either continued inaction or even backwards movement on important issues like climate change, public health initiatives, and social reform, and it’s a big factor in today’s political polarization. Think about this—well over 90% of climate scientists agree on the basics: the earth is warming, and humans are a big cause. In fact, newer studies have shown that between 97 and 99% of climate scientists are on board with that statement. But how often have you heard, ‘The science isn’t settled’? The same applies to vaccine safety. The vast majority of physicians and other healthcare providers believe that vaccines are safe and effective. Jake Scott, clinical associate professor at the Stanford School of Medicine, recently testified before Congress that there is an enormous evidence base for vaccine safety and efficacy. Yet the current head of HHS, Robert F. Kennedy Jr., continues to push the idea that we really don’t know, that we somehow need to find more evidence that proves vaccines don’t cause more harm than good. That’s not by any means a rational, neutral statement—it’s a deliberate strategy. By amplifying uncertainty, powerful interests slow down public action, and ignorance becomes a tool of delay.
(Music)
Now you know that agnotology is actually a field of study; researchers have spent decades looking at the ways people are deliberately manipulated to remain ignorant about the truth of important issues. Let’s look at how their research can help us use Agnotology as a Tool for Awareness.
- One of the things I tell my students is that when they hear claims and arguments that just don’t sit well with them, they may not initially be able to pinpoint what’s going on. But if you pick up on the idea that the messenger, whoever it is, is attempting to make you feel uneasy, uncertain, or doubtful about generally accepted, legitimate facts, then you need to ask yourself, “Who benefits from me not knowing the truth about this?”
- Do a little research to find out the original source of those claims; you may be able to find an obvious conflict of interest that makes them suspicious.
- Check whether uncertainty or doubt is being amplified disproportionately compared to the actual evidence.
- You might be able to recognize some of the tactics I talked about earlier (like cherry-picking data, attacking or vilifying scientists and legitimate research, or reframing the debate to make themselves seem less responsible for harm).
- And it’s always a good idea to learn about logical fallacies in reasoning, as most messages that are designed to manipulate use at least one fallacy, and a better understanding of fallacies can help you spot them in arguments.
- Being skeptical about the facts surrounding important issues can be a good thing, but you need to be able to distinguish between healthy skepticism and the kind of doubt that arises from manipulative tactics. I talked about skepticism back in Episode 19, so you might want to go back and refresh yourself on that concept.
- And I’d definitely be wary when corporations or other entities continue to claim “uncertainty” about an important issue long after a scientific consensus has been reached on that topic.
The authors of “The Science of Spin” also talk about steps that the powers-that-be ought to take to help the public recognize disinformation.
- The media, for example, should avoid presenting legitimate evidence alongside “industry-generated manufactured doubt” as if the two views are comparable—that’s NOT balanced journalism.
- Scientists should be very cognizant of possible conflicts of interest in their funding sources, and peer-reviewed journals should not publish any research unless it contains a disclosure of any conflicts of interest. This is not to say that all research funded by industry and big business must necessarily be biased or poorly done, but reviewers should know the funding sources and take that into consideration when determining what should be published.
- And where government is concerned, I’ll just quote the article, “…We can ask that the executive branch nominate qualified individuals to lead scientific agencies and advisory boards; that federal officials face punishment for tampering with scientific reports; and that administrations, regardless of their political party, avoid creating hostile work environments for scientific staff.”
- That plea was published in 2021. Unfortunately, in 2025 the current administration has not only failed to do any of the above, it is doing the exact opposite—putting highly unqualified individuals in positions of power, tampering with or entirely removing large numbers of data sets and scientific reports, and creating hostile work environments for government and university researchers. This particular administration’s messages to the public contain a sizeable percentage of manufactured ignorance—messages designed to spread doubt and fear and keep people confused. So, at this point I don’t believe we can trust that it has our best interests at heart when it comes to the truth.
We are our own best defense against the onslaught of disinformation. Agnotology helps us see ignorance as not just a lack of information, but also as a cleverly constructed phenomenon. This kind of ignorance isn’t neutral—it can be weaponized. Recognizing it is the first step toward resisting manipulation.
And that’s it for this episode. Don’t forget to check out the show notes, there’s lots of great information on this topic that I just didn’t have time to get into. And if you’ve subscribed to my podcast or you’ve recommended it to a friend, thank you. I hope you use the information in this episode to help you think it through.
(Music + sign-off.)