BACK STORY with DANA LEWIS

CONFRONTING AMERICAN RADICALIZATION

January 19, 2021 Dana Lewis Season 3 Episode 4

How do we confront radicalization on the internet, and then on the street?

On this Back Story with Dana Lewis, two experts on de-radicalization and confronting disinformation.

Arie Kruglanski is a Distinguished Professor of Psychology at the University of Maryland and co-author of The Three Pillars of Radicalization: Needs, Narratives, and Networks.

Ira E. Hyman, Jr., Ph.D., is a professor of psychology at Western Washington University.

And the philosophy of public discourse, with Dr. Todd Mei, Professor of Philosophy.

 


Speaker 1:

The narrative tells you how to do it: do it by fighting the deep state, fighting the Jews, fighting the Muslims, fighting the refugees. And there's a network that supports it, including high-powered people, leaders of the country, like the president of our country. Then you have a social movement, and that becomes very dangerous. And you've seen the culmination of that kind of process in the assault on the Capitol on January 6th.

Speaker 2:

Hi everyone, and welcome to Back Story. I'm Dana Lewis. The attack on the Capitol was evidence of organized radical groups threatening the state; make no mistake about it, there was nothing spontaneous about the attack. It had been talked about on the web for weeks. The groups include dangerous white supremacists with international links, driven by what some of them consider their spiritual leader: the outgoing American president, Donald Trump. Think about that. I mean, it's beyond bizarre to me: an American president calling radicals to Washington to storm the Capitol, to lynch the vice president and kill other lawmakers, because he didn't like the outcome of the election. He lost. He didn't openly call for a lynching, but he pointed those groups to the Capitol after firing them up and telling them they had to fight. He called them to Washington in the first place, and he knows those groups by name. Trump has given them a narrative, breeding their hate for immigrants, minorities, and a government they believe coddles those groups at the expense of what they falsely believe to be "true Americans." There were police in their ranks, and firemen, and army vets, wearing body armor and carrying bear spray and firearms and pipe bombs. The fact is, police and some military are deeply infiltrated by white supremacists, and the FBI knows it. And some of those white supremacist groups organize internationally. For example, in September over two dozen police officers were suspended in Germany after the discovery of 126 violent neo-Nazi images in WhatsApp chat groups. There's been a resurgence of white supremacy in that country's military and law enforcement, and a resurgence of civilian supporters, over recent years. The North Rhine-Westphalia officers shared images of, get this, swastikas,

Speaker 3:

refugees in gas chambers, and reportedly an image of a black man being shot. This is a country that led us into a world war; America fought the Nazis. So what on earth are we doing tolerating these groups within the America of today? What is wrong with people who indulge in this hatred? Well, the fact is, if America were a psychiatric patient, I suspect it would be deep in therapy right now. So on this Back Story: two psychology professors on the topic of radicalization and disinformation, and we talk to a philosopher about public discourse and just how ugly that's become.

Speaker 1:

Arie Kruglanski is a Distinguished Professor of Psychology at the University of Maryland and co-author of The Three Pillars of Radicalization. Arie, hello. Hello, good to be with you. Thank you. I don't want to give away the three pillars, but they are in the title: the first one is needs, the second is the narrative, and the third is the network. Let's start with needs. The need that underlies much of social activism, and violent extremism in particular, is the quest for significance. It's the human quest for dignity and respect, a universal need that can be accomplished in many different ways. All people have it. The way it is accomplished is determined by the two remaining Ns of the triumvirate: the narrative and the network. The narrative basically tells you how to satisfy the need. What do you need to do? Do you need to embark on a constructive career? Do you need to fight for your race, for your country, against the enemies that are taking over, that are promoting injustice to your group? And the third N is the network; it's the social movement. Its importance is that it validates the narrative. We are social beings. We need validation by people whom we respect and whose good opinion we seek. So the network, its leaders, people whom we admire and by whom we want to be admired in return, is very important. So the three Ns: once they come together, once the need is aroused by humiliation or an opportunity for glory, the narrative tells you how to do it: do it by fighting the deep state, fighting the Jews, fighting the Muslims, fighting the refugees. And there is a network that supports it, including high-powered people, leaders of the country, like the president of our country. Then you have a social movement that becomes very dangerous.
And you've seen the culmination of that kind of process in the assault on the Capitol on January 6th.

Speaker 3:

Was President Trump the author of the narrative, or did he simply pick up on it and understand what he was feeding?

Speaker 1:

He was not the author of the narrative. The narrative existed for many years before; the white supremacists, the neo-Nazi movement, the conspiracy theories have been thriving throughout the decade. There has been a huge uptick in far-right plots, by 320% on a recent count. But he accelerated that process by legitimizing it. Who, if not the highest person in the realm, the most powerful leader in the world, is credible as an endorser of a narrative? And so he joined the narrative, he encouraged its supporters, and the rest is history. We've seen the acceleration of the movement over the last four years, and in the last phase, following the elections, it entered a new, highly accelerated phase that finally led to the events of January 6th.

Speaker 3:

You draw some parallels with history, like Adolf Hitler, who, you know, claimed back in 1918 that Germany was in fact winning World War I, only to be betrayed by Jews and socialists, et cetera.

Speaker 1:

Yes. You know, Dana, it's always dangerous to draw historical analogies. However, it's also dangerous not to, and there is good justification for doing it, because even though historical circumstances are always very different, fundamental human nature does not change. Since Homo sapiens appeared on the scene, we have been endowed with pretty much the same nature, and that nature is vulnerable, susceptible to a process that culminates in extremism. And what particularly troubles me is that a fringe group like the Nazis in Germany, which started as a small group of thugs embracing the idea that there had been a knife in the back of the German nation from the Jews and socialists, over time gradually crept into the mainstream, through the putsch in Bavaria, in Munich, on November 8th, 1923. Slowly it gained momentum, to the extent that it finally overpowered the entire German nation. And we have seen the outcome of that: the Second World War, the Holocaust, and the rest of it. So what troubles me about current events in the United States is that this fringe movement now commands a large chunk of the American population. Seventy-four million people voted for Trump. Many believed in the conspiracy theories, that the elections were stolen. Clearly that narrative, that conspiracy theory, crept into the Republican party; a large chunk of the Republican party still insists that there were problems with the election and that it was not legitimate. So, you know, whereas not everybody in that crowd was violent, and some people were just protesting, the fact that so many millions of people supported this Stop the Steal narrative creates a huge recruitment pool.
They shared with the most violent demonstrators the idea that there was an injustice done to the American people, that there was a plot against America. So this creates a recruitment pool. These events are glorified as a great achievement. So, you know, the process of creeping into the mainstream is probably intensified by these events. How do you counter that? Do you just shut their network down and drive them underground, or do you reach out to these people? Do you need to reach out to them? You need to reach out. First of all, cool down the rhetoric, reduce the vindictiveness. On the one hand, you need to hold people accountable; on the other hand, it should not occupy center stage, and you should put in place a process to deradicalize the American population. It has to start in schools, communities, churches, the police. A large initiative needs to take place in which the whole society is involved. It's going to be a very difficult process to put that genie back into the bottle.

Speaker 3:

But there really isn't a call for that. There is a call for justice, a call for making people pay, putting them in jail, arresting them. There isn't a call for that dialogue.

Speaker 1:

Yes. Well, at least it's a minority call. What is encouraging is that the president-elect, Biden, seems to be of that mindset of reducing the heat of the rhetoric, reducing the vindictiveness, and becoming pragmatic about the problems that beset the country. Pragmatism, as opposed to ideology: dealing with vaccination, with the coronavirus, dealing with the economy. And hopefully, you know, he will enlist in that enterprise the cooperation of the Republican party, who alone can maybe put a stop to, or at least slow down, the process. We haven't seen much evidence for that; there is some. Biden, by the way, is a very good choice at this time and age, because he's a pragmatist. He has plenty of experience, he knows everybody, and he knows the ropes, with his experience in the Congress and in the White House. And he's a very likable person, unlike Hillary Clinton, who somehow was perceived in a negative way by many people; it's very difficult to dislike Joe Biden. So I think all of these elements bode well for the possibility of cooling it down, getting back to pragmatics, and cutting down the ideological heat that has been overtaking America.

Speaker 3:

Are you a fan of the fact that some of these social networks were pulled down, and that other people, including the president, were banned and lost their accounts or had them suspended?

Speaker 1:

Well, you know, it's a moral dilemma. Do you revoke people's freedom of speech? Is the speech that is being propagated in the category of hate speech? This is always a very thin line. So I think that banning Trump from speaking is a good thing at the end of the day, especially at this moment. But one has to be very careful with not being too much of a censor of the expression of opinions in the United States.

Speaker 3:

Arie Kruglanski, thank you so much.

Speaker 1:

Thank you very much, Dana. It's been a pleasure.

Speaker 3:

All right. Ira Hyman, Ph.D., is a professor of psychology at Western Washington University. Hi, Ira. Good morning, Dana. You've written a lot about disinformation campaigns, and it seems like you're living in that movie right now, because the American public has had their heads spun around by President Trump talking about false elections and cheated votes, and a lot of people don't know what to believe. Would you agree?

Speaker 4:

Yes, to a certain extent. I think that what happens is we get into an information ecosystem, and within that information ecosystem things hang together. There's a coherent and consistent set of information that people come to believe, and they tend to rely on certain news sources. So, saying people don't know what to believe: most people have made up their minds, and it has to do with the environment in which they have found themselves. It's not that they're comparing alternatives. If they're stuck in an information bubble that tells them the election was stolen, they're not seeing the other information, they're not being exposed to it, and when they are, it's dismissed. I think their critical thinking works against them here, in an odd way, when they're in that environment.

Speaker 3:

Well, I guess they need to have their heads spun around, maybe, and be a little more open to being better news consumers. But at this point, you know, people follow presidents, and many were convinced of false information from President Trump, in terms of not wearing masks, which was dangerous, as was "your vote has been stolen, march on the Capitol."

Speaker 4:

Yeah. The election disinformation, about the votes not being right and there being fraud and all of this, was a disinformation campaign with a political goal, and it has consequences. The COVID misinformation and disinformation campaigns are killing people. I think we can just bluntly say that. What some people have done, and Trump has been the single individual most responsible for sharing misinformation about COVID, is give people wrong information. We've told them it's not that dangerous, it's not communicated in this fashion, maybe you don't need to wear masks. All of these things are incorrect, and once people are in that environment, hearing that same information, it leads them to behave in ways that increase the spread, putting themselves and the people they care about at risk. So it has been responsible for, not all 400,000 deaths in the U.S., but a large proportion of them.

Speaker 3:

You wrote that people have been swimming in a sea of lies.

Speaker 4:

Yes, I think that's a good way to phrase it. And your comment a few moments ago, about maybe people needing to be better consumers of information: yes. However, there's this phenomenon in psychology referred to as the fundamental attribution error. When we look at a situation where somebody makes a decision or behaves in a certain way, we could say it's because of who they are, in this case the way they approach news and information. But we can also reflect on the fact that it's about the context, the environment, the system in which they're operating. And in this case, for many of these people, the system in which they're operating is what's really driving this, because they're constantly exposed to misinformation and misleading information, and they rarely get exposed to accurate information. In that environment, you can look at people and say, well, they shouldn't be there, but everybody they know is in that environment, and they've shut themselves off from other sorts of things, because that environment hangs together. So yes, people are somewhat responsible, but they're in a world that is more responsible. It's the people promoting the disinformation campaigns that we need to focus on, the people spreading the misinformation; it's disrupting that flow of disinformation that we need to focus on.

Speaker 3:

The immediate environment in which false beliefs are repeated is created by these algorithms. If you express an opinion, the algorithm will likely bring you more of your own opinion, and online peers, whom you've never met and probably never will, who agree with you.

Speaker 4:

That is a big part of the cycle, right? Once you start down this pathway, the algorithms that pretty much every internet company uses are going to continue to feed you more of the stuff you like, and, to keep you engaged, more extreme versions of some of the stuff you like. I know that some of the companies are working on this, and they're trying to tweak their algorithms to a certain extent. But let's be clear: I think you mentioned free speech before; this isn't an issue of free speech, because the algorithms are already choosing for you. They're already picking and choosing, from amongst all the stuff out there, what to present to you. And for that reason, even tweaking the algorithms a little bit would tone down the disinformation campaigns and increase your access and exposure to more accurate information.

Speaker 3:

Look, I tend to agree with you, but people who disagree with us would say it's censorship. And at a certain point it becomes dangerous: first you ban a president from a platform, and then who is next, and what is next?

Speaker 4:

Look, I'm not a constitutional scholar, and I'm not going to play one. But I think you have to recognize that the internet companies are already picking and choosing what to show you, because they're not going to show you everything. So in that sense, it's a question of what you want that algorithm to be based on, and to what extent you want accuracy and reliability to play into that outcome. On the second part of your comment, about deplatforming, moving some people off of the platforms: you don't just do it to anybody and everybody. And actually, in the data I've been looking at, there's some really nice work by Kate Starbird and her team at the University of Washington showing that a lot of the misinformation and disinformation campaigns go through just a few narrow nodes on the internet.

Speaker 3:

I love this, because I've kind of been overwhelmed doing stories about QAnon and people who are finding some pretty wild, cult-like things to follow on the internet. And you have said and written, quoting her, I think, that it comes down to, like the mob that moves down the street, a central core of people who really drive the trouble, and that could be a hundred people.

Speaker 4:

There are two aspects of that, it seems to me. One is trying to stop the wide spread of disinformation on the internet. And that wide spread of disinformation starts with a few people who post something, and then it gets shared widely from those few people. They may not share that many things, although Trump shared a lot personally, but they are both the people who share a lot of misinformation and conspiracy theories and the people with a wide sphere of influence, so that they touch a lot of other people. If you really just remove a couple of handfuls' worth of people, you can see pretty meaningful effects in terms of disrupting that starting point for a lot of the misinformation campaigns. Now, in response to your other point: yeah, there are some places on the internet where conspiracy and disinformation are the main part of what happens in those discussion groups, but they're not the main platforms on the internet. In order to get to those places, you actually have to go looking for them. And as you noted,

Speaker 3:

Which brings us to another question, which would be: is it better to have these huge companies that you can then pressure and try to get to police themselves, or do you break them up, which a lot of people are calling for? And then you have some real outliers in terms of who they're going to appeal to and how bizarre those forums may get.

Speaker 4:

I'm not going to touch that question.

Speaker 3:

The question of breaking up big tech. All right.

Speaker 4:

Yeah. I mean, I know my areas of expertise. I know a lot about the spread of disinformation, how people adopt it, and the fact that it has consequences. I do know that disrupting that spread can be effective, leading to a better set of information for people to rely upon. The actual pathways in terms of legislation, how it fits in with the constitutional structures in any given place, that's not my area.

Speaker 3:

After the assault on the Capitol, do you think that people who didn't want to pressure companies, and didn't want to go down this road of First Amendment debate, now say: if we don't do it, we're not going to have a country?

Speaker 4:

You know, I think maybe we've been coming to this, and it's hard to know for sure if you've turned the corner. I've seen numbers on the internet, reports of some studies people have done in the last week, showing a 73% drop in misinformation related to the election after the social media companies removed not just Trump but a few other people and some of the QAnon conspiracy groups. It's hard to attribute that directly to their actions, but it seems to be part of decreasing the spread of the misinformation. The reason I say it's hard is because things became kind of clear, everything about the election was certified, so maybe we would expect a drop-off in conversations about that anyway. But I am hopeful, and it feels a little bit hopeful to me for the first time in a while: the success of some of that, and the willingness to recognize that not everyone is a good-faith actor, that when they're not, maybe they shouldn't be treated as good-faith actors, so that if they're constantly spreading disinformation and they have a wide appeal, we need to address those sorts of things.

Speaker 3:

From a psychological point of view, would you say America is a pretty ill patient right now?

Speaker 4:

In terms of the adoption of misinformation, I think we have a problem with misinformation and disinformation campaigns broadly in the U.S., yes. And it's not just the election, it's not just COVID; it's the issue of whether or not there's climate change, and how to address those sorts of things. One of the features here is that the people who spread one set of disinformation and misinformation spread the other ones as well, and the people who adopt one get exposed to all the other ones as well, so that it's a nasty information environment right now.

Speaker 3:

As a media person, when I said be a good news consumer: you brought up global warming as an example. I can tell you that there was a long debate about it here, and the BBC decided they are no longer going to treat the debate as two-sided, that 99.99% of scientists around the world believe there's no question there's global warming, and they're not going to give equal time to the people who say there's not. So media has had to mature a lot. The old way, when I went to journalism school, was just giving two sides all the time and then letting the public figure it out. We've been pressured to go beyond that.

Speaker 4:

Well, exactly. The question is whether both sides are acting in good faith, whether both sides are relying on the actual state of the world and presenting the actual state of the world. There's the old bit that if you're a journalist and one person says it's a beautiful day outside and the other person says it's pouring down rain, your job isn't necessarily to give them both equal time, but to look out the window, right? And to prioritize the one who's giving you accurate information, perhaps. The other way to think about it: I saw this last week with the attack on the United States Capitol, when news companies had a choice about which voices to feature at that moment. And then again, in the debate about whether or not to impeach President Trump again: so many congresspeople spoke, and the news companies had a choice about which voices to present. So which ones should you prioritize? You're not going to show all of them.

Speaker 3:

You're going to show some of the dissenting

Speaker 4:

Voices, yes.

Speaker 3:

Equal time with those who are overwhelmingly voting to impeach?

Speaker 4:

Well, even when you're presenting the dissenting votes: should you present someone who is a fountain of lies, where everything they say is misinformation, or should you present someone who gives other reasons for their decision that aren't based on spreading further disinformation? I mean, what are your ethical obligations as a journalist? I think that is a critical question journalism has to address in this moment. Treating both sides equally is reasonable when both sides are equally good-faith actors.

Speaker 3:

Last question to you. I think the last paragraph of an article you wrote was kind of inspiring, in the sense that, yeah, people are locked down and going through a lot right now, and you said your suggestion would be: change your information feed for the next month.

Speaker 4:

And this is particularly for people who thought that everything I wrote in that piece was wrong.

Speaker 3:

You closed those comments, I will note, because you thought that some people would not be...

Speaker 4:

Receptive, right. Not receptive. And I generally leave comments open on things I write, and I'm usually willing to reply to people, and I will respond to people. In part that's because, as a white man, I don't get attacked as much as some other people do. But here, I don't need to give a platform to people who just want to spew more misinformation. And so, if you're that angry at me for saying that no, the election wasn't stolen; yes, COVID is dangerous; yes, there's climate change, then maybe what you should do is switch your information feed for a while. And this is something that worries me deeply when we're talking about this: many of the people who have gone down these pathways have lost touch with friends and family. I think we need to recognize that human component of it as well. They have now found a group where they feel comfortable, but it's a group that shares these beliefs that are not consistent with the state of the world, and because of that they're no longer necessarily in touch with friends, no longer in touch with family; they've lost some of their other connections. Maybe because they dropped them, maybe because their friends and family really just gave up on talking to them. I've heard stories like this; I've seen it happen to people I know on social media. So, you know, wouldn't it be lovely if changing their information flow for a while led them back to some connection with reality and allowed them to reconnect with friends and family? You're not going to break through that environment for one single individual who's completely inundated in it, flooded by that tidal wave of disinformation; it's going to take more than that. They're going to have to move back out of that flood of disinformation.

Speaker 3:

Yeah. I mean, I come from a family where political views were a blood sport at the dinner table, you know, but we still taught each other to be, not always, but generally respectful of other people's views at the table, and to actually try to listen to them. Right? Because otherwise, why are you holding the conversation at all? If you just go on the internet to give your opinion, then you're really not engaging with anybody; you're just engaging with yourself.

Speaker 4:

Yep. I would hope people would be able to do that again in the future.

Speaker 3:

Yeah. Well, great advice from you: change your information feed for a month and re-engage with people that maybe you don't agree with, and see how you feel at the end of the month, or six months, or however long it takes you. Right?

Speaker 4:

I would hope that we might see some progress that way.

Speaker 3:

Ira Hyman, professor of psychology at Western Washington University. It's really important to talk about where we are with information, and to get people to think and just reopen themselves, I think. It's great to hear you. Thank you so much.

Speaker 4:

It's a pleasure talking with you. Thank you.

Speaker 3:

Todd Mei joins us now from Nevada. He is a former professor of philosophy here in the UK; he was at the University of Kent, and he's now living in America, where he runs Philosophy2u: philosophical counseling, facilitation, and coaching in meaningful work and business ethics. Hi, Todd. Hi, Dana, pleasure to be here. Your head must be spinning, right? Because you left the UK in September, and you've landed in the middle of what was an election campaign, and America's just been spinning on its head ever since.

Speaker 5:

Yes, it's been sort of crazy. Obviously a lot of things have changed for both the US and the UK since September. And I think what stood out most when we first got back was that it varied state to state: the different practices, guidelines, and policies that were in place, or not in place, for social distancing and being safe. To give you an example on that point, we had to travel through Utah at one point by car, and we stopped off in a small town. I won't say which one, but nobody was wearing a mask; my wife and I were the only two people wearing masks. And when we went into a restaurant to get some food to take out, we got stares, probably for various reasons, but no doubt because we were wearing masks, and it felt very unsafe. Then just going across the border to Colorado, where we had to stay in a hotel, it was entirely different. There were signs up making sure that people were social distancing and wearing masks. So very different cultures and climates, depending on what state you go to. And my own point of view, anecdotally, is that I often wonder why there's such a problem in the United States outside of large cities, because unlike most European countries, there is so much geographical space. It seems like it should be so easy to social distance and keep to oneself, because of the way houses are located and the amount of property you have outside of big cities. It just seems like there should not be a problem with social distancing, but that's obviously not the case, given the infection rates in the United States.

Speaker 3:

Philosophically, since you're the philosopher: I mean, this was generated by a president who saw mask wearing as some great infringement on people's rights, or at least he sold it that way, believing it would bring him more votes.

Speaker 5:

Yes. And I suppose the problem, from my own perspective, is that there's a real lack of understanding and awareness of what philosophers like to call public reasoning, public debate. And it's exacerbated by the fact that most of the social media companies are aware of this kind of thing. In fact, I emailed one of the legal officers of Facebook at one point, I've forgotten his name now, to ask whether Facebook had an interest in this idea of public reasoning, and what it would mean practically for something like Facebook: that there would be clear rules for how the people on Facebook engage with one another. In a simple sense, there's etiquette; there are ways in which you respond, ways in which you speak or write to one another on these kinds of platforms. Very basic rules of engagement in conversation that many generations prior to ours were probably very much aware of. And there's just no interest in that from the social media companies.

Speaker 3:

There might be now, Todd, right? Because, I mean, for a long time, well, I think the First Amendment was an excuse. But even before the First Amendment free speech issues, it was simply that Facebook didn't want to police this stuff, and neither did Twitter. They didn't want to bring in the manpower and the woman power to go through all these horrible messages and messaging and groups of people. And now, suddenly, they've really been brought to the precipice. Right? I mean, would you agree that now you should probably be calling them back?

Speaker 5:

Yeah, maybe I'll try that after the podcast. So I think there is some kind of corporate social responsibility that has to take place, and they really have to focus on that. But there's also the other side. It's not just the companies, because you hear a lot of blame placed on the companies, on social media being the problem, and it is problematic, and I have to admit I'm a little cynical on that side. But it's also the responsibility of citizens. The idea of public reasoning and public debate is that you're very much aware of the kinds of claims you're making in the public sphere. A lot of people today, when they're out on social media or out in public, like to express their convictions very strongly, either way; it can be right or left. And philosophers are interested in the way in which a lot of our beliefs and attitudes don't have the kind of grounding and certainty that we often think they do. So we often teach philosophy students about what's technically called forms of skepticism. But it's not the skepticism where you just throw your hands up and say, well, there's nothing I can say or do, because it's very difficult for me to prove something. Rather, the role of skepticism, as it comes from ancient Greek and Hellenistic philosophy, is that you're very much aware of the weaknesses and fallibility of your reasoning. So you can't just go out there and say: this is true no matter what, that's fake news, or, mask wearing is a violation of my rights. No, there's always a kind of awareness that you might be speaking from a certain point of view.
And when you engage with another person, they're speaking from a certain point of view as well. It's not the idea that knowing comes from an incorrigible or certain point of view that has some kind of access to irrefutable knowledge. But what you get a lot in the public sphere is the lack of that, a lack of awareness of what knowledge is and how to justify beliefs. And so what replaces that are a lot of emotive comments, a lot of emotive reactions, and that tends to galvanize people. And I'm not against emotions; I think emotions have a very important role in our understanding of others. But when you have too much of that, you get what we have today.

Speaker 3:

How do you go about teaching, you know, humility and respect for other people's opinions in the current political climate in America?

Speaker 5:

Yeah, there have got to be a lot of approaches. I'm always a big fan of a liberal arts education. But the problem with that is that

Speaker 3:

You said the liberal word. Liberal arts. Oh my God.

Speaker 5:

Well, it's very different; it's not political. Technically, or originally, "liberal arts" means the liberating arts. So you engage in these forms of academic disciplines, like philosophy, history, and the imaginative arts, literature and so forth. And it's the combination of exposure to all of those that actually liberates your mind. It makes you more capable of engaging with one another. And what we have today is the lack of that capability of people to engage with one another. It's just people throwing up walls, being angry, condemning people for holding a point of view. Or you see someone and you have this reaction of: oh, that person must be a socialist, or that person must be a right-wing fanatic, that kind of thing. It's just too reactive. And the role of reasoning helps to space things out a bit. Now, there's the liberal arts education, but a lot of people don't have access to that, and I don't want this to be a kind of isolating or alienating view. I mean, it would be great if education were universal in that sense, but there are other things people can do on a daily, practical basis. It's making sure that when you engage with someone, you hold off on any quick reactions and you try to understand the point of view from which the other person is speaking. And there are certain kinds of questions and phrases that can help facilitate that, simply asking: I don't quite understand what your view is. Instead of antagonistic questions, they can be open-ended questions. And the other thing I've noticed in conversations with people who hold different views is that there tends to be a lack of patience and space. So I think the other thing is identifying those kinds of conversations where that patience is simply not there, and just removing oneself from them.
And that happens a lot on social media.

Speaker 3:

It's the whole digital world. I mean, it's interesting, some of the analysis where people talk about the fact that, you know, you used to interact with your neighborhood. Now you've probably closed the door, especially in a pandemic; we've closed the door to a lot of our neighbors and that kind of human interaction. And now we're in these chat rooms, digitally, and there's something about that environment that gets people not to listen, but to express opinion.

Speaker 5:

Yes, very problematic. And it comes down to self-restraint and discipline, and understanding how to be virtuous on social media, which is very difficult; it's very hard not to react. And then again, excuse me, it comes back to the social media companies. At the very least, and I don't know too much about the tech side, they've got to change the algorithms by which the only things that appear are things that agree with your views.

Speaker 3:

"I hate Trump," and I'm not saying that personally, but that will bring you a lot of people who hate Trump. Or "I hate Biden," and that will bring you a lot of people who are in that echo chamber. The technology works to keep you engaged, and it tries to keep you engaged by bringing you opinion that it thinks will agree with yours. And there are lots of things that are dangerous about it in a bigger scope, in terms of democracy. Do you think we're entering this period of kind of mob rule in democracy that Plato talked about? And maybe you can explain; from what I could understand of Plato, and you've spent a lifetime studying this stuff, he didn't seem to be a huge fan of democracy, in the sense that he felt it would degenerate into anarchy.

Speaker 5:

Yeah. So the comments on democracy are often taken from the Republic, which is a very interesting text, and there are a lot of different ways you can understand what's going on there. Let me just say, Plato is very egalitarian in many ways; he seems to advocate a form of feminism, or protofeminism, depending on how you read him, and the public philosopher Martha Nussbaum comments on that. But Plato had two worries about democracy, and I don't want to pigeonhole him by saying he's just outright antidemocratic, as someone like Karl Popper might say. His two concerns are: one, that the amount of freedom that occurs in a democracy, while it's a strength, can actually be its weakness; I'll come back to that in a moment. And the other weakness Plato worried about was that there could be a dominance of the emotions, and a lack of what I spoke about earlier, the public reasoning to help balance that out. So, going back to the first one, about having too much freedom: the best way to capture it is this idea that everyone seems to want more choice, and you often hear people, proponents of neoliberalism if I can call them that, saying that choice is good, the more choice the better. On the face of it, that sounds really nice. But what it presupposes, or presumes, is that the people who have the choice have either some kind of perfect rationality, or access to perfect information, or that they themselves are capable of reasoning about the choices. And that's not always the case. When you go to buy a used or new car, you might have a lot of information, but you don't have all the information to make the best choice possible. So choice can often cause a lot of problems, and we might make the wrong decisions.
And also, as Hannah Arendt comments, what you have in these kinds of situations is that there tends to be a lack of concern for the common good, a lack of concern for others. What tends to replace it is this form of just following the freedoms that arise in society, whether they're brought about by companies or whatever it might be. And so what Hannah Arendt says replaces this concern for the common good is a form of consumerism or behaviorism that simply follows the choices. We're not very discriminating about the things that are presented to us. And so, going back to Plato, what happens is you have an erosion of civil society. It's no longer focused on what is authentically or genuinely good for everyone, or for the city-state, but simply panders to our appetites.

Speaker 3:

And it doesn't help if you're making choices based on false information, such as the president telling you that the choice you made in an election didn't count, or was cheated. So you roll disinformation in there, and chaos becomes supercharged.

Speaker 5:

Yes. And we need not just institutions; individual citizens need to know the ways in which they can vet the information that's coming across their computers or their TV screens. And then there are the journalists and the media. One of the big differences I've noticed between the United States and the UK is that in the UK, whether you're listening to the BBC or Sky, for example, which tend to be just left and right of center, you can listen to either of them and they tend not to yell at you. A lot of the major programs, unless it's an op-ed piece, will let the news speak for itself. They won't comment on it and try to sway the audience one way or the other. Back in the States

Speaker 3:

I was watching Piers Morgan this week, so I'm not sure you're right.

Speaker 5:

Yeah, well, I suppose, you know, there are exceptions everywhere. If I can generalize, it seems on the whole a bit more civilized, whereas here, I just feel like you can turn on anything from Fox to CNN, and they're just yelling at you constantly.

Speaker 3:

A question for you, Todd. I mean, if you reach deep into your philosophical soul and roots, and you look at what's happening at the Capitol, and where America is headed, what do you draw on, to be positive or to be negative?

Speaker 5:

I think, and I think it's reflected in some of the comments you've made on social media, that we have to find a way to stop this divide of just thinking it's either us or them, and that's it. And there are different philosophical tools we can use to help initiate that. But let me just end on a practical note, and this is not my idea, I wish I could take credit for it. I was speaking with a former senior officer who used to work for the FDA here in the United States, and he recognized that, look, the reason why this is happening is largely due to the socioeconomic divide, and we have to find a way to bring citizens of different socioeconomic classes and races together. And he thought that mandatory national service, which could be either military or civil, would bring the different classes together, to work together on different projects. And I thought that was a wonderful idea. It would be in the service of something larger, you know, your nation, whether it's the United States or not. And it's those kinds of experiences that can open many doors and can expose people to things they've never been exposed to before. It can bring people together.

Speaker 3:

A tough sell, a tough sell in America. But Todd Mei, thank you so much. Good to talk to you. Thank you, Dana. Thank you for having me on.

Speaker 2:

That's our Back Story on radicalization in America. The Guardian newspaper here in the UK reported that as many as two thirds of US adults between the ages of 18 and 39 didn't know that 6 million Jews were killed during the Holocaust. Another 23% of respondents said they believed the Holocaust was either a myth, had been exaggerated, or were unsure. Time to roll out a reality check for those with radical views, or people who just don't know. Anyway, keep your sense of humor and look forward. This week Trump was out of the White House, and the pounding of the drum of division and disinformation has been replaced, to some extent, by some normalcy in Washington. I mean, with the suspension of his Twitter account, it feels less stressful. But online, we have to snuff out hatred. I'm Dana Lewis. Thanks for listening. Please share our podcast, and I'll talk to you again soon.