The Decentralists

Episode 4: Is Social Media the Biggest Threat to Independent Journalism?

October 29, 2020 · Mike Cholod, Henry Karpus & Chris Trottier · Season 1, Episode 4

What is the biggest threat to independent journalism both in Canada and internationally? We discuss this question and more with the executive director and spokesperson for the watchdog group, FRIENDS of Canadian Broadcasting, Daniel Bernhard. 

FRIENDS conducts leading-edge policy and opinion research on issues affecting Canadian media. This research demonstrates that millions of citizens care deeply about the future of Canadian media, journalism, and programming. Unfortunately, as social media continues to dominate how Canadians access news, independent journalism is threatened. In this episode, we ask Daniel:

How are data and journalism being weaponized by bad actors?
Why isn't social media doing more to fact-check the news?
Should we rely on governments to "fix" social media?

We answer these questions and more on the best Decentralists episode yet!

Henry Karpus: Hey everyone, it's Henry, Mike, and Chris of the Decentralists. And we're pretty excited today because we have a special guest, none other than Daniel Bernhard, the Executive Director and spokesperson for Friends of Canadian Broadcasting. Friends is dedicated to advancing Canada's rich culture and the healthy democracy it sustains. Daniel has spent his career working to advance the public interest through a Toronto-based firm he built to meet the strategic, operational, and software needs of charities, governments, and social purpose businesses. Daniel's a first-generation Canadian; his parents immigrated to Canada from Chile in the 1970s, when the country was ruled by a military dictatorship. So, needless to say, protecting democracy is absolutely fundamental to Daniel. Daniel, welcome to the Decentralists.

Daniel Bernhard: Hey, thank you for having me.

Henry Karpus: So, for those who are unaware, could you please explain: what is Friends of Canadian Broadcasting all about?

Daniel Bernhard: Yes, absolutely. So, Friends is a citizens' movement. We are not affiliated with any broadcaster or political party. We take no money from any corporation, foundation, or government. We are a group of active citizens who believe that if Canada is to maintain its independence, its ability to go its own way and have its own values reflected in its society, economy, and democracy, then we need to have our own conversation. And that means a strong and independent public and private media, and ensuring that there is adequate shelf space for Canada, on-air and online, so that we can have our own conversation, make our own decisions, and be a truly independent country.

Henry Karpus: Yes, and it's all about culture too, right?

Daniel Bernhard: Well, yes. I mean, culture is a word that I think is sort of misused in English Canada. People think that when you talk about culture, you mean the ballet or something like that. And that's not what we mean. We mean sensibilities and values and practices and norms, the type of things that we expect of each other. If you saw somebody bring a gun to a library in Canada, you would call the cops; in the US, in many places, that would be just a normal part of life. That's culture. I often tell people, if you want to know what culture is, go to a business meeting, and instead of shaking hands with someone, give them a kiss, and you'll see what culture means right there. So culture is a set of values and expectations. And in Canada we have a certain cultural history around things like decency and inclusivity, justice, solidarity, and those are very important. They keep our society together and peaceful, and we need to make sure that we protect the institutions that protect those norms.

Henry Karpus: That's fascinating.

Michael Cholod: If I could jump in here and say: what you just described sounds like a very Canadian organization.

Daniel Bernhard: Yes. It's like Tim Hortons and then us; we are right up there.

Michael Cholod: As long as we don't talk about hockey, then nobody gets all aggravated.

Daniel Bernhard: That's right. Yes.

Henry Karpus: Okay, so what is the biggest threat to independent journalism in Canada and the rest of the world, do you think?

Daniel Bernhard: Well, I think it's a kind of perfect storm of several different factors. The reliable media have been under a lot of pressure for a long time. They were under pressure first, before the internet, from corporate ownership and consolidation and cuts by private equity funds that bought up newspapers and just tried to squeeze them for every last dollar. Then it was the global satellite news complex that was forcing out smaller local voices, which are an important part of the mix. And now we've got Google and Facebook, basically, which just dominate the advertising market and are leaving everybody else in the dust. I think the important thing for your listeners to know, and I'm sure your listeners already know this, because they are your listeners, is that Google and Facebook's dominance in the ad market, and by extension over the entire information ecosystem of the world, is not natural. It's the result of a bunch of unfair advantages that governments have given them, whether it's the ability to operate without paying taxes, the ability to make a huge amount of money off of content that other people provide without having to pay them for it, the ability to... [Cross-talking 00:04:22].

Daniel Bernhard: ...user privacy for ad-targeting efficiency. Like, there's a whole bunch of them. And so these artificial, unnatural advantages have allowed these two companies, basically, to swallow the world. And they're not exactly the most responsible companies. So we're starting to see the consequences of what happens when you allow legitimate, reliable media to be displaced by a company that just has what they call content, and they don't really care what it is. They don't discriminate between a cat video, a news article, propaganda, racism, violence, and saying hi to your aunt. It's all just content. And that has consequences. We're seeing them play out around the world now, but especially in the United States, where it started and where it's most advanced.

Michael Cholod: Right. So, Daniel, I want to drill in on something there. I remember when we were preparing for this podcast, you talked about this idea of everything being called content, and that was really interesting to me. So maybe you can just explain why that is so important, that social media calls everything content?

Daniel Bernhard: Well, I think the term is really revealing. I mean, obviously there's a practical value. You need to describe what appears on social media.

Michael Cholod: Sure.

Daniel Bernhard: Content seems like a good enough word. But it's really revealing in that it is completely indiscriminate. They don't actually care what it is. Like, you never called it a content paper.

Michael Cholod: Right.

Daniel Bernhard: It was always a newspaper. Or when you went to a movie, you would specify whether it was a comedy, or a horror movie, or a children's movie, or whatever; there were genres and ratings and things like that.

Michael Cholod: Correct.

Daniel Bernhard: And what Facebook and YouTube have in common is that they don't actually care. They don't care what you watch. All they care about is that you watch. And so they will put anything in front of you that gets you to watch longer; they have no preference for what it is, as long as it retains your attention. And the problem with that is that what happens to retain our attention the most is the salacious, the scandalous, the aggressive, the sensationalist, et cetera.

Michael Cholod: Right.

Daniel Bernhard: And that's a result of human evolution; it's hardwired into our bodies. We can get into that if you want. But that happens to be what retains our attention, and so that's what we're seeing more and more of. And, as I said, they're crowding out the people who actually do care what you look at and put some time into it. This is exactly what we're seeing as the consequence.

Henry Karpus: And Daniel, it's obvious the reason they want us to stay watching, or reading, longer is because they can then target more ads, right?

Daniel Bernhard: Yes, it's a double whammy, right? The longer you watch, the more ads you see, that's obviously part of it. But also, the longer you watch, the more data you disclose about what type of ads you might be more likely to click on.

Michael Cholod: Very important point.

Daniel Bernhard: So they want you to do both. And the important thing is that basically everything that happens while you are watching is used against you: to target ads at you, and then to target videos at you, which just exist to target more ads at you. I mean, the advertising motive in news and television has always been a little bit controversial, but networks have been able to say, with a decent amount of credibility, look, we're providing good value here, and the advertising allows us to provide it for free. Facebook and Google can make no such excuse, because they don't actually care what it is you're watching. They just care that your eyeballs are stuck to the screen.

Chris Trottier: This is textbook surveillance capitalism, what you just described.

Daniel Bernhard: Yes, it is; it's a key part of that, for sure. It's just a mechanism for getting you to disclose your interests so that they can target ads at you. And that's why the term content is so generic and meaningless: because, like I said, they don't actually care. It makes no difference to them. As long as you are engaging and disclosing data, that's all that matters. And so whatever will keep you doing it, that's what they'll show you.

Michael Cholod: So, Daniel, then let's take this to the next level. Just yesterday in the US, I think it was the DOJ, or maybe a Senate subcommittee, charged Google with antitrust violations. And this relates: it's literally, I think, 19 days after they announced they were going to put a billion dollars into a kitty to pay legitimate journalism outfits for their content, on a new Google News kind of service they're going to do. So my first question is: do you consider what they do to journalism anti-competitive behavior? And the second part of the question is: if they're going to make the money and want to pay some of it back to journalism, is that an okay fix for this problem?

Daniel Bernhard: Well, to your first question, it is definitely, obviously anti-competitive behavior. And I'll tell you why. I mean, this is not even... it's not even remotely veiled. What they basically tell people is: give us all of your content for free, or we will disappear you from the internet. That's what they're telling them, right?

Michael Cholod: Happened to us.

Daniel Bernhard: Yes, exactly. And so if you are a digital business and you're trying to do business on the internet, and you basically have two ways of reaching people, Google and Facebook, then being disappeared from the internet is kind of a non-starter. And so you do it, right?

Michael Cholod: Right.

Daniel Bernhard: It's an offer you can't refuse, as the Godfather would say. So that is not a choice. And people go along with it because they have nowhere else to go. Hence, monopoly. It gets even worse, though. It's bad enough for someone who makes, I don't know, snowmobiles or whatever, but it's even worse for companies that are in the business of selling advertising against attention, which broadcasters and newspapers are, because in that case Google is not just a way for them to discover customers, it's also a direct competitor.

Michael Cholod: Right.

Henry Karpus: Oh yes.

Daniel Bernhard: And so that's even worse. It's a bad deal for anybody, but it's a really bad deal for journalists, entertainment companies, and anybody else who's producing information and entertainment and selling advertising against it. As for your second question, should they pay? Well, I think paying is one way around this. If, at the very least, you knew that you had nowhere else to go, but you got paid a fair fee and were able to benefit in tandem with them, that would definitely offset the cost of the problem. I think the issue is that Google has decided that it will give us a nice big round number, which is one billion, for the entire planet Earth, by the way. Not just for Canada, but for everybody. And I think they make that money in, like, a day. And they will do that of their own volition, and they will choose who benefits and who doesn't, who's activated and who is not. That's not really a good deal. Now, if there were a system in place where the government said: these companies have to pay a fair fee to content producers, and that's the trade-off that can compensate for the monopoly position; we're going to set the price, or force them to negotiate a fair price, and then we're going to enforce it, and it's going to apply to everybody. That would be something I would consider looking at, and our organization has actually championed something like that. But them just saying, here's how much we feel like paying, and we'll pay it to whoever we want, and we can stop paying whenever we want? That doesn't really do it for me.

Michael Cholod: So, let me ask you this then, Daniel. If you look at the way these services and these companies operate, clearly we all agree that they are monopolies, basically?

Daniel Bernhard: Yes.

Michael Cholod: And the problem now is that they're monopolies in basically every business, right? It's not just search. These guys do all the maps, they do autonomous vehicles; they literally compete with everybody. But when you look at something like the Apple News service, or this Google News service they're talking about putting this money into, what that is in effect doing is taking journalism, quote unquote, into the realm of the Spotifys of the world. Right? They sell the benefits of access to an audience: look, if you're part of Google News, we're going to pay for the content, just like Spotify and the music people do. They can offer the CBC access to not just Canadians. But the downside is that they then determine what you make. We had a conversation a while ago with a musician who said: guess how much I get paid for a million listens of one of my songs on Spotify.

Daniel Bernhard: Yes, 82 cents and a six pack, or something.

Michael Cholod: Yes, it is like a hundred dollars.

Henry Karpus: If that.

Michael Cholod: That used to be what, Henry, like a platinum record?

Henry Karpus: Oh, right. You used to... a million plays, or selling a million records, I mean, that was enough money for years.

Michael Cholod: Right. And now you make a hundred dollars. So I guess the idea is: are the internet, and this model of basically a scrolling, adjudicated newsfeed, just antithetical to the whole idea of independent journalism?

Daniel Bernhard: Yes. I mean, that's a very good question.

Michael Cholod: Can they co-exist?

Daniel Bernhard: Yes.

Michael Cholod: That, I guess, is my question.

Daniel Bernhard: I wonder. I mean, I think the internet and journalism can exist together, but the internet has certain tendencies that need to be watched. For example, if you were in the 1850s or something, and you were in hard times and you were poor, it made sense that you might want to work seven days a week. And you might want to take your kids to work with you to have them chip in, because times are hard. That is just a pressure of capitalism and poverty: the employer is happy to drive down wages, and people will undercut each other because they need the work, and that gives way too much power to the employers. Well, we recognized this tendency in capitalism and we implemented things like the minimum wage, which means you can't undercut somebody beyond a certain point.

Michael Cholod: That's very interesting.

Daniel Bernhard: You can't take a twelve-year-old into a mine; there are rules. That sort of defends us against some of the worst of these pressures. Now, the internet has pressures of its own. And those pressures are about monopoly, because the services are informed by data, and he who has the most data wins, and you cannot start up and suddenly make that up. If somebody tried to launch a full alternative to Facebook that worked in exactly the same way, that would be a difficult challenge, because you would be starting with Facebook circa 2006, and people expect it to look very different now. So the use of that data is pretty predatory; it crowds people out, and it's a positive feedback loop: once you have it, you get more and more, and everyone else goes away. So I think what we have is actually a really old story of one or two companies that are way too big because of whatever the dynamics in their industry are. And either we impose democratic power, where governments tell these companies, here are the limits, or we succumb to their private corporate power. And I don't really like that idea. So the important thing to recognize is that these companies are totalitarian, and I don't mean that in a Stalin kind of way. I mean it in the sense that they seek everything. Right?

Chris Trottier: They are profit driven.

Daniel Bernhard: Well, they're more than profit-driven. Amazon wants to have a store that sells you literally everything. Right?

Michael Cholod: Right.

Daniel Bernhard: And you've got to admire the ambition of that, along with being terrified by the hubris. Someone said: I want to have a store in which you will buy everything. Right? And Google has the same thing: you need to be able to find and do everything through them. And when you make a claim to everything, you're going to run up against other powers that control certain parts of things. One of those powers happens to be the elected government. And what Google and Facebook and Amazon are all trying to do, through little deals, lobbying, this billion-dollar pot, and a bunch of other little actions, is to basically stay outside of the law. They are trying to do everything in their power not to have to acknowledge, or submit themselves to, democratic authority. And we should wonder why. I mean, yes, they're greedy. That's true. But I think there's also a question to be asked about whether or not they'd be able to continue making money at the rate they're making it if they had to operate legally.

Michael Cholod: Right.

Daniel Bernhard: And if they're so afraid of the law, we should ask ourselves whether we should be afraid of them. Because the laws are not perfect, but they exist for a reason, and these companies are claiming the right to operate completely outside of them. I don't think that's a good idea. Even if you're a techno-optimist, even if you believe in the future of the internet and want to do all that stuff, that's great, I'm with you. But you cannot launch a competitive alternative to Google; it is not possible. If you believe in an innovation economy, a digital innovation economy, then we have to deal with that. These are just facts of life. And don't believe Google and Facebook's propaganda that says any action taken against them is an action against the future. That's just ridiculous.

Michael Cholod: It is ridiculous.

Daniel Bernhard: These guys are going up against democratic governing authority, and it's them versus the people. And the people need to get together. That's what Friends is doing. I would encourage your listeners to go to friends.ca and sign up for our newsletter. We'll give you opportunities to take action and participate, because at least in Canada, apart from you guys and a few other people who are providing information on this, there's not really any coordinated activism...

Michael Cholod: Correct.

Daniel Bernhard: ...or movement to stand up to this, and we need that, or we're going to get steamrolled.

Michael Cholod: Well, I was just going to say, Daniel, you've tied this into a really good point, which is that it needs to be ours, and by ours I mean individuals'. We need to get together to try to solve this problem. And I really respect your organization and what you're doing, because I think there's a piece of human nature that does not serve us well in circumstances like protecting journalism, or anything else, from these tech giants. And I truly do not believe it's in politicians' best interest to stop them either.

Daniel Bernhard: Oh, yes.

Michael Cholod: You know what I mean? We've talked about this a lot recently, Daniel. Right now, what you've got with Google and Facebook, and any kind of platform that has access to everybody's eyeballs, is that they can tell you it's exactly Daniel Bernhard who's viewing this. You mentioned answering to democratically elected authority; well, democracy itself is under question now, because we've seen lots of exposés. It's basically been proven that Facebook has manipulated 68 democratic elections, through Cambridge Analytica, over the last, what, 10 years.

Daniel Bernhard: At least.

Henry Karpus: Wait, there's one coming up.

Michael Cholod: I know, exactly. So you think about it, right? What politics becomes, rather than the traditional here's my party, here's my platform, let me explain what I believe and how I'm going to get it done, and then you choose, is: who has more money than the other guy? Who can buy more ads than the other person? If I've got a bigger war chest than you do, I'm winning, guaranteed. So do you think there's something wrong with us looking to our regulators and our governments to fix this problem?

Daniel Bernhard: No. I mean, yes and no.

Michael Cholod: I mean, we need their help, clearly. But do you think it's in their interest?

Daniel Bernhard: Yeah.

Michael Cholod: Right.

Daniel Bernhard: You're very right that politicians are kind of conflicted, because this is a big manipulation machine. And I think you also hit it on the head, or maybe I heard you hit it on the head: you said Facebook has influenced the elections. I don't think that's a hundred percent true. I think Facebook has allowed the elections to be manipulated. They provide the tools, but it's the same old unethical, partisan extremists who are doing their thing; they just have this amazing tool that they didn't have before. And the problem is that those partisan extremists happen to elect the leaders, and so the leaders are kind of beholden to these companies. That is a huge problem, and yes, it conflicts them, for sure. What I would say, though, and this is the basis of Friends (if any of your listeners are donors, monthly donors to Friends, you've heard me say this), is that these companies do have a lot of power, but there's one thing they can't do yet: they can't vote. So while the companies have great power, and give politicians great power to manipulate the electorate, et cetera, someone, a human being, still needs to go and cast a ballot. And if we tell the government that we will not stand for this, that we consider this to be corrupt and unethical and antithetical to democracy, basically a mini imperial project where these big companies come up against governments and pretend to be equal parties, then we won't like it, just like we wouldn't like it if they were in the pocket of the banks or some other big industry. And ultimately, if they have to choose between the manipulation machine and the votes, my hope is that they'll choose the votes, because that's ultimately the currency they understand.

Henry Karpus: Right?

Michael Cholod: Daniel, I have a question here about something that Google did a couple of years ago. I'm not sure if you remember this, but once upon a time Google owned an app called Google Reader, and it was the primary platform that people used to collect their RSS feeds. Then Google killed Google Reader, and by killing Google Reader they killed RSS. I'll give you a little background on why they did this: they wanted to push everybody from Google Reader onto Google Plus. That whole enterprise blew up in their face, though, because rather than moving to Google Plus, people moved to Facebook and Twitter to get their news. And that's why we have the situation we have today. How has the death of RSS and Google Reader affected independent journalism?

Daniel Bernhard: Wow, that's a great question. This is niche. This is like, what's your take on the fourth inning of game six of the 1972 World Series? But it's a good question. RSS was pioneered by Aaron Swartz, who was a real freedom fighter for an open internet, and kind of a martyr of the open internet, really, who unfortunately died several years ago now. And RSS allows you to control what you see.

Daniel Bernhard: It allows you to curate what you're interested in, and to have a feed that is chosen by you, from sources you've identified, from anywhere on the internet, basically, that wants to publish in this really simple format. RSS: Really Simple Syndication. And it gave people a lot of power over information and what they wanted to see. That is completely the opposite of what the social media companies want to do, which is to control what you see. When you search for something, they want to show you an answer based on what they think you're most susceptible to like, or to stick with, and all that kind of stuff. So RSS is a kind of emancipation paradigm that puts users in control.

Daniel Bernhard: Actually, I hate that word; I should take that back. It's a bad habit. You know, there are only two industries that call their customers "users": internet companies and drug dealers.

Michael Cholod: Oh, I love that one. I'm going to use that one.

Daniel Bernhard: So RSS puts people, readers, in control of what they see online, and allows them to have a direct relationship with publishers and [inaudible 00:27:04] connectedness, as opposed to the paradigm where the companies push stuff at you, where what they tell you is real is all there is, and what you find is what they want you to see. I don't mean that in a conspiracy way; it's just how they work. So the death of RSS as a paradigm is hugely consequential. I think it marks a big shift in what was available to people, and it's kind of a turning point between the idealistic early internet and the fatalistic internet we're in now, where people have no control, and the companies push at you, instead of you being able to pull from this great expanse of information what you find interesting, not what they think you should be interested in.
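
For listeners who have never used it: an RSS feed is just a small XML document that any site can publish and any reader can poll on its own schedule. Here is a minimal sketch in Python of the pull model Daniel is describing, using only the standard library; the feed URLs are hypothetical placeholders, not any particular source.

    # A sketch of the RSS "pull" model: the reader, not a platform, picks the
    # sources and asks them for updates. Feed URLs below are hypothetical.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEEDS = [
        "https://example.com/news/rss",     # any RSS 2.0 feed the reader chooses
        "https://example.org/culture/rss",
    ]

    def latest_headlines(feed_url, limit=5):
        """Fetch an RSS 2.0 feed and return up to `limit` (title, link) pairs."""
        with urllib.request.urlopen(feed_url) as resp:
            root = ET.fromstring(resp.read())    # <rss><channel><item>... structure
        items = root.findall("./channel/item")[:limit]
        return [(item.findtext("title"), item.findtext("link")) for item in items]

    for url in FEEDS:
        for title, link in latest_headlines(url):
            print(title, "->", link)

The point of the sketch is that no recommendation engine sits between reader and publisher: the list of sources, and the order they are read in, belongs entirely to the person running it.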

Henry Karpus: Right. So, that is great insight.

Michael Cholod: So, Daniel, let me ask you a question on this point. We talked about RSS and the ability for people to curate their newsfeeds, and I think that's a point that should literally be screamed from the mountaintops, because with the death of RSS, there's probably nobody over the age of 20, or under it for that matter, who knows what RSS is. So the question I have is this: what we've got now is a reality where you have to scream loudest to be heard, right? Because that's what we talked about, the sensationalism and stuff like this.

Daniel Bernhard: Yes.

Michael Cholod: So with this whole idea of content, you've got these people masquerading as legitimate journalists. What does treating legitimate journalism the same as something like QAnon do to the value of professionalism?

Daniel Bernhard: Well, it calls it into question, doesn't it? It basically says these guys are no different from one another, and it's over to you, right? I read an article this morning, and I try not to complain about the media, but every now and then I still get a bruise on my head from smacking it repeatedly against the wall. It was about Dr. Theresa Tam, Canada's Chief Public Health Officer, being asked about COVID-19 misinformation on Facebook. And the whole article was about what you need to do as an individual to be smarter, right? To check the sources before you share, that kind of thing. And to chalk this stuff up to individual responsibility is just insane. That's never been expected of anybody. We didn't get people to quit smoking that way. We had to put the stuff behind barriers, tax it, disincentivize it, put your future corpse on the package, and a whole bunch of other things. If we had just said, oh, smoking is your problem, we wouldn't have gotten anywhere. So the individual willpower thing makes me nuts. But if we are basically going to say that the platforms have no responsibility, that everything is equal and it's up to you, then the result is very predictable: we're going to have people who believe it and people who don't, and those people will continue to live in alternate realities, because there's absolutely nothing common across them. It's not like partisanship was invented by Facebook, but it was aggravated by Facebook in a way that's never happened before, because mass media, like it or hate it, trusted or not, was at least a sort of shared experience that people got and could respond from. So this idea that the CBC or the [inaudible 00:31:03] and Uncle Bill are basically equally credible? It's just nuts.

Michael Cholod: Right.

Daniel Bernhard: Quite frankly, it's just nuts, and we can't just let that continue. The effects are really predictable, and they're playing out all around us. So yes, I think the effect on journalism is this: journalists had a trusted status, even if you didn't like them, because most of them have a professional code, but also because they're legally liable for what they publish. If a journalist published a letter on the front page calling you an axe murderer, or something like that, you could sue them and you'd win. That's a natural check that keeps them telling the truth. So they run things by lawyers, they try to verify that things are accurate; they have real incentives to make sure they don't just trample on people willy-nilly. Facebook just circumvents that whole process, and for them to say it's up to you is really unfortunate, but that's exactly what they're doing. Look at what Facebook's response to fake news and propaganda is: they invest in all of these media literacy activities. They have a partnership with the Canadian Journalism Foundation, which I think is really regrettable. They got Peter Mansbridge to come on TV and talk about how you should check before you share, stop and look before you act, and stuff like that. Again, it's just personal responsibility, which is exactly what the tobacco companies did. That killed people, and this does too. Look at the guy who shot those two people in Kenosha, Wisconsin. He was responding to a Facebook event, and not one that he looked up; it was recommended to him, pushed to him. He showed up and he killed two people. So if we're wondering what the consequences of this are, we don't have to look very far. It's not some far-flung conspiracy; this information is quite literally killing people. And with COVID-19, it's even more than that. How this company can claim such power over our minds [inaudible 00:33:28] no responsibility or liability, it just beggars belief. But what's even more amazing is not that the companies claim to be exempt from the standards of law and decency; it's that governments seem to believe them and accept the case. This is something that Canadians, and decent people around the world, should be outraged about. And I would again encourage you to come and join us, because that's what we're doing at Friends. We're trying to stand up for people's ability to be decent, and to say: I don't want to live in a society that does this, and if you want my vote, you're going to have to show me that you understand.

Chris Trottier: Yes, and to echo your point, Daniel, Facebook's own studies say that people are radicalized after Facebook recommends to them a radical group hosted on Facebook.

Daniel Bernhard: Yes. If I remember the numbers, the Wall Street Journal reported this summer that Facebook's own research says that 64%, I think it was, of the joins of extremist pages, including white nationalist stuff, everything considered extremist, 64% of the joins of those pages were attributable to Facebook's own recommendations. In other words, for the Kenosha guy becoming part of that group, showing up to that event, and killing two people, there's about a two-thirds chance he would not have been there had Facebook not recommended it to him. Right? If QAnon has three to five million followers, that means something like two to three million of those people would not have been there, and would not be radicalized, had Facebook not recommended that they get involved. That's crazy. And this is one of the big differences between Facebook circa 2006 and what we have now. If your feed were just a chronological list of everything your friends posted, that would be one thing.
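
As a quick back-of-the-envelope check on those figures (the 64% share and the three-to-five-million follower range are as quoted in the episode, not independently verified), the arithmetic does land where Daniel puts it:

    # 64% of joins attributed to Facebook's own recommendations (as quoted),
    # applied to an assumed follower range of 3 to 5 million.
    recommended_share = 0.64
    for followers in (3_000_000, 5_000_000):
        via_recs = followers * recommended_share
        print(f"{followers:,} followers -> ~{via_recs:,.0f} joined via recommendation")
    # Prints ~1,920,000 and ~3,200,000: roughly the "two to three million" cited.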

 Henry Karpus: Yeah.

Daniel Bernhard: But it's not. It's curated, it's recommended. What you see is not in your control, and what you don't see is not available for you to scrutinize. So they are recommending things to you, actively encouraging you to participate, to share, to become involved, and... oops, a couple of people died. Mark Zuckerberg called that an operational mistake. A double murder.

 Henry Karpus: Wow.

Daniel Bernhard: And, he called it an operational mistake.

Chris Trottier: Should they be accountable? Like, should that be something that they get taken to court for?

Daniel Bernhard: I think so. I mean, if you look at that group, Buzzfeed has done some very good reporting on this. Buzzfeed's main reporter on Facebook and misinformation [inaudible 00:36:16] is a guy from Toronto called Craig Silverman, and he actually still works here. He's a fantastic journalist, and what he has shown is this: they went into that group, and that group had posts saying things like, I totally plan to kill looters tonight; I'm going to drop them as soon as they come out of the stores. That kind of stuff. This is not coded language. It's pretty straightforward. Facebook claims they can understand, in a live video happening in real time, whether the person in the video has suicidal ideations; they did make this claim. But they can't understand when someone writes, in plain English, I'm going to kill people? How is that even remotely believable? So either they're lying to advertisers about what they actually know about us, in which case it's the largest fraud perpetrated in all of human history, or they absolutely do know all these things about us, and they somehow don't believe that "I'm going to kill people" is something they should report to law enforcement. You cannot have it both ways. And I think that if we believe their narrative, they understand what's going on and they are fully culpable. Friends has actually published some research on this, just last month. If you go to friends.ca you'll see on the homepage an article about platform liability. You should check it out.

Michael Cholod: Saw that.

Daniel Bernhard: If you want to see the whole report, it's friends.ca/platformforharm, all one word, platform for harm. And you can read the analysis. The analysis basically says: if you know about it and you promote it anyway, you are liable under existing Canadian law. The real question is why the cops haven't shown up yet, because the case seems pretty clear-cut.

Chris Trottier: Now, somebody who's not necessarily following these stories might say: well, why doesn't a blue check on somebody's Facebook or Twitter profile make them more credible?

Henry Karpus: And what's a blue check, Chris?

Chris Trottier: A blue check is something that in the past was used to verify a profile, but over time it has kind of given certain users, I wouldn't say authority, but a sheen of authority.

Daniel Bernhard: Yes. I guess it depends: if you think the biggest problem on Facebook or Twitter is identity theft, then maybe the blue check helps. But me knowing that it's actually Donald Trump saying all that ridiculous stuff doesn't make him any more believable. It just means I know it's him.

Henry Karpus: Doesn’t make it any better.

Chris Trottier: Exactly.

Daniel Bernhard: So yes, it's like I said earlier about partisanship not having been invented by these guys: this is just the perfect tool for those kinds of hacks. It's the same thing. The people who are doing this have been doing it for a long time; they just didn't have the means to do it like this before. So yes, Donald Trump's statements are a perfect example. That's almost too easy, but it's an extreme example that proves the general point: just because we know it's him doesn't mean anything he's saying is true, or credible, or legal, or safe.

Chris Trottier: Correct.

Daniel Bernhard: So no, I don't think identity theft is the biggest problem. I mean, yes, there are bots, and there's JoeAnderson3946_2412, and obviously those people aren't real, and that's an issue. But often they're actually amplifying what a real, verifiable human being has said. It just so happens that what they've said is awful, and potentially wrong, and sometimes illegal.

Chris Trottier: So Daniel, that leads to the next obvious question, all about content moderation: who should do it, and can it even be done by anyone? It certainly was done with newspapers in the past. What's your view on that?

Daniel Bernhard: Yes, I think that's a great question. I think it comes down to recommendation. If Facebook and Google want to do the algorithmic curation thing and make choices about what you see, because it's in their commercial interest, of course, then they should be responsible for those choices. And that means they have to comply with local laws. As far as I'm concerned, the content moderation standards should not be dictated by a government, or a minister, or a bureaucrat; they should be set by a judge. We already have rules: you can't go on national TV and call me a paedophile, for example, because it's not true. Which, I assure you, it is not.

Michael Cholod: Right.

Daniel Bernhard: The laws online should be the same. And so the companies should have to do whatever they need to do to make sure they're in compliance with the law. If they're not, they should get hit with penalties. And if their business model means they get hit with penalty after penalty, that's their problem. If they instead wanted to be just a passive message board that carried whatever, that would be a little bit different. However, the law in Canada, and you'll see this if you come to our report, the law in Canada around publisher's responsibility is actually super interesting. It comes from a case, I think from the 1800s, in the UK, where someone posted something defamatory about somebody on the notice board of a golf club.

Michael Cholod: Okay.

Henry Karpus: Really?

Daniel Bernhard: Yes. And the guy who was the subject of this defamation complained to the golf club and said, this is not true; can you take it down? And they didn't take it down. This went to court, and it basically set the standard for today, which is: if you know that something is untrue, or have good reason to believe it could be untrue, then once you're notified about it, you become responsible for it. So they could reasonably say, look, we didn't know, we didn't put it up, and that's all fine. But now I've told you, and so you have to take it down. So even if it were just chronological posts, even if they were making no editorial decisions, they would still be responsible under Canadian law once notified. But now it's even worse, because they are making editorial decisions. They are making choices, and they're making choices based on what's in the content. So they're doing more than just publishing. They're recommending, you know, [cross-talking 00:46:37].

Daniel Bernhard: They're pushing. When people saw the Christchurch mosque murder video on Facebook, it wasn't because they went looking for it.

Chris Trottier: Right.

Henry Karpus: Exactly.

Michael Cholod: Yes.

Daniel Bernhard: And so for them to say, well, we're just making information available, that would be one thing, but that's not what they're doing. They're recommending it to people. And at that point, I think the responsibility question is a no-brainer, because they know what it is in advance, and they say: you know what, I think Mike would like this, or I think so-and-so would like this.

Michael Cholod: Correct.

Daniel Bernhard: Which means they know. And just one more point: when Zuckerberg goes to Congress, he has said under oath, on multiple occasions, that we find 90% of the ISIS content before a human user ever sees it, and we take it down.

Michael Cholod: There's no way.

Daniel Bernhard: Well, who knows? I mean, I can't verify it, but who knows.

Michael Cholod: Right.

Daniel Bernhard: But let's just choose to believe him, which may not be a great idea. Let's choose to believe him, because he was speaking under oath. If that's the case, then let's follow the logic. What happens when you click post? What are they saying? That when you click post, the content does not actually become available right away; we check it first... [Cross-talking 00:43:50].

Daniel Bernhard: ...to see if it meets certain standards. And if it doesn't fail any of our tests, then we publish it. Now, that check may happen in a quarter of a second, but it happens. And if they admit to that, then it means that in theory, well, not in theory, by their own admission, they are technically approving everything that appears on the platform.
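
The logic Daniel is walking through, scan first and publish only what passes, can be sketched as a small gatekeeping function. This is an illustration of the argument, not Facebook's actual pipeline; flags_content() is a hypothetical stand-in for whatever automated checks a platform runs.

    # "Check, then publish": if every post passes through a gate like this,
    # the operator has in effect approved whatever comes out the other side.
    def flags_content(post: str) -> bool:
        """Hypothetical automated screen; True means the post fails a test."""
        banned_phrases = ["i'm going to kill"]    # toy rule, for illustration only
        return any(phrase in post.lower() for phrase in banned_phrases)

    def publish(post: str) -> None:
        print("now visible to users:", post)

    def submit(post: str) -> bool:
        """The gate: nothing becomes visible until the check has run."""
        if flags_content(post):
            return False        # blocked before any user sees it
        publish(post)           # by construction, everything published has passed
        return True

    submit("Saying hi to my aunt!")        # passes the screen, gets published
    submit("I'm going to kill people.")    # caught by the screen, never shown

On Daniel's reading, even the quarter-second version of this gate is an approval step, which is why the admission matters.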

Henry Karpus: Right, wow.

Chris Trottier: Yes, Daniel, and since you brought up the Christchurch mosque shootings: the way I found out about that was that I logged on to Facebook one day, and that's what I saw. I saw the live shootings on my Facebook feed.

Henry Karpus: Oh my God. No way.

Chris Trottier: You know, nobody in the media reported it to me. It was just somebody who put it there. I saw what was going on, and after that, I'll be honest, it traumatized me. I remember telling Mike about this, and it was the catalyst for me finally deleting my Facebook account.

Henry Karpus: Wow, did you even realize what you were seeing at first, Chris? You couldn't have.

Chris Trottier: Well, at first I thought it was just from a movie or something like that. But as I kept watching, it just seemed so surreal, because I couldn't believe anybody would record this live, and I couldn't believe that Facebook would expose it to me. Thank goodness it was me and not my daughter who saw it. And basically that's when I realized all this stuff about them blocking terrorist content is pure propaganda.

Daniel Bernhard: Or, even if they do catch 90%, what's in the other 10%? But you're right. You did not ask to see that, and what you got was a guy murdering 50 people on live video. And not only did you get a guy murdering 50 people on live video, you got a guy who, in many respects, did this so that it would trend on Facebook.

Michael Cholod: Exactly.

Daniel Bernhard: The UK government has been very clear about this: of the last several major terrorist attacks in the UK, almost all have had a deliberate social media component, in that they seem designed to trend. So what did the Christchurch guy do? He mounted the camera on the gun in a way that made it look like one of those first-person shooter video games. There's a whole YouTube sub-community around those things, with comments on them and everything, and this guy did a real-life version as a kind of winking joke to a community of people on YouTube, then streamed it live and got all the notoriety that emerged from that. So you've got to ask yourself: if they're looking for notoriety, and they're designing the attacks for it, is that a factor in even motivating them to consider doing it? I don't think Facebook is responsible for all ethnic hate, but they're definitely providing a sweetener to people who want to make themselves known, because they do a very good job of that.

Henry Karpus: So Daniel, how do we fix this?

Michael Cholod: Yes, I was just going to ask that. Because, Daniel, we've had this unbelievably great discussion, and it's opened my eyes a lot. I have to admit that I default towards having people exercise control themselves, taking some responsibility. But we've talked about politics, and whether the politicians can fix it. Henry, in the introduction, talked about your parents leaving Chile because of an autocratic regime, where they controlled the message. So you don't, ideally, want the government controlling it, you don't want a corporate interest controlling it, and the majority of people will just absolve themselves of the responsibility. So how do we fix it? How do we get back to enlightened dialogue, to legitimate viewpoints espoused by journalists and backed up by legal alternatives, in an age where everybody expects everything twenty-four-seven, at their fingertips?

Daniel Bernhard: Yes. I mean, I would say we have to have reasonable expectations. We're not going to clean up the internet a hundred percent; we're not going to get rid of bad people. The internet's like a city: it has some dangerous and unsavory parts, people want to go there, and it's hard to stop them, but we try to keep those parts limited. And I think the first thing to do is to ask: who makes money off this? Let's look at who is making money off it.

Michael Cholod: [Inaudible 00:49:16].

Daniel Bernhard: And start there, and try to make it less profitable to do this kind of stuff. If every time the Christchurch video showed up on someone's feed, Mark Zuckerberg went to jail and somebody had to pay a hundred million dollars, that would be a good reason to stop doing it, or to figure out a way to stop doing it. And I think the biggest tragedy of all of this is that in a country like Canada, a lot of this stuff is already illegal. Broadcasting a mass murder in public is already a crime; there's no doubt in my mind about that. And if you want to test that, imagine if, instead of seeing that video when you turned on Facebook, you saw it when you turned on CTV. We all know what the answer would be, right? People would be in jail, they'd lose their license; it would be a really bad scene. So, therefore, they don't do it. So what I would say is: figure out who makes money off this, and let's start imposing penalties that are commensurate with the size and profitability of the companies that handle this material. Facebook in Germany now faces some pretty harsh rules: if identifiable hate speech is left up on the platform for more than 24 hours after they're notified, they face fines of up to 50 million euros per infraction.

Henry Karpus: Wow.

Daniel Bernhard: And so Germany has 5% of the world's Facebook users, I believe. And they now have over 15% of the content moderators.

Michael Cholod: Right.

Daniel Bernhard: Because it's just not worth it.

Henry Karpus: That's interesting.

Daniel Bernhard: It's not worth it. So there, you're speaking the language these companies understand, and I would say that's the first place we can start. To your question, Mike: is this going to resuscitate legitimate journalism on its own? No. But what it will do is put a damper on the forces that are driving it out of business, by making them known: hey, Facebook was charged with violating the law again. That makes it less savory for people to participate in, and it will hopefully incentivize alternatives that can operate legally and therefore more profitably. Ultimately, what we're talking about, all of these exemptions from the law, are artificial competitive advantages.

Michael Cholod: Correct.

Daniel Bernhard: And we need to end them. We need to end them, and then we'll see where the marketplace really lands. All these people who are like, free market, free society, whatever: I'm like, how is this free, man?

Michael Cholod: Totally.

Daniel Bernhard: One side of the ledger has to do all of these things, which are expensive, and the other side does none of them. Let's have a fair fight and see how it plays out.

Henry Karpus: And in that world, Facebook's scale, which currently is its biggest commercial advantage, I think becomes a big disadvantage.

Michael Cholod: That's what I was just, going to say.

Daniel Bernhard: It becomes ungovernable, and then it's on them to figure it out. Honestly, I do believe in private enterprise, and I do believe that the government shouldn't be telling people what to say. But there are laws, there are judges; we've dealt with this for a long time in the past, and when there are infractions, there should be consequences. Then let them figure out how to operate a compliant business; that's not for me to say. But when they break the rules, the penalties should hurt. I think that's where we have to start, and that's what Platform for Harm, our research paper, proposes. I'd say again: go to friends.ca/platformforharm and read the whole analysis, and while you're at friends.ca, sign up for our newsletter, because these consequences are only going to come when people tell politicians this is unacceptable. I mean, you talked about your daughter being able to see this video. Imagine if, in order to get to school, your daughter had to walk through a chicken fight and a fight club and a meth den.

Michael Cholod: Right.

Daniel Bernhard: And all of this stuff, just to get somewhere legitimate. You would be screaming that it's unsafe and that we need to protect our kids.

Henry Karpus: Yes.

Daniel Bernhard: And this is the same thing. If the price of connecting with your friends and being online is that you have to wade through all of this crap, is that a reasonable expectation for a 12-year-old? I mean, is it really? I don't think it is. So we need to make sure that our laws and standards are applied, and that violations are punished in a way that is proportionate to the harm, and to the size and profitability of the company doing it. That, I think, will send the right message that we will not tolerate this. Our first principle has to be: this is unacceptable. We have not heard that yet.

Michael Cholod: Right.

Daniel Bernhard: We hear a bunch of shoulder-shrugging and shucks, that's too bad, and wouldn't it be great if they could be better? How about: this is unacceptable. Wouldn't that be nice?

Michael Cholod: Right. Well, if you think about it, Daniel, I make this analogy to people: what you've got to think about with social media, and this is the kind of tack that we take on this thing, is that they should almost have to drop the word social. To me, one of the first places to start is to make them have to say Facebook Broadcasting Network, because realistically it's just television in a different form, except, to the point we've discussed on this podcast so far, television without controls. But because it's a media, quote unquote, created by people, they say it's social. And part of the challenge, one of the things that got everybody to this point, is that people started using these platforms as a method of connecting with people. That's what Facebook used to say, right? Connecting people. But nobody on this podcast, nobody listening to this podcast, needs to sign up to some network of two billion people to communicate with their grandmother.

Henry Karpus: And it was a legitimate idea.

Michael Cholod: It was a legitimate idea, where essentially it was just a more media-friendly version of what the photo-sharing sites used to be, like Flickr and Picasa. But then, when all of a sudden people started using it not just to share their photo albums, but to communicate with their friends and family, they built their networks essentially on a television set. You're not going to buy an ad at the Super Bowl to tell your wife to bring home milk. And so I kind of feel like [inaudible 00:56:06].

Henry Karpus: That would be awesome, by the way.

Michael Cholod: Could you imagine? Like, I think that's a dynamic people don't understand. Right? So many people say, "I left Facebook and I regretted it, because I lost my connection to all my friends and stuff." That, to me, is the problem with something like a television network, which is, I think, what these things are: the content is created not just by your friends and family, but by your friends' friends. Right? Which is the danger. Right? That's how it cascades, because I don't have any friends who would send me right-wing propaganda. Okay.

Henry Karpus: Yes, exactly. But then they might. People anger-share, right? They do that a lot. They say, "Look at this stuff." I mean, that's a real thing. Just, anyways.

Chris Trottier: It is.

Henry Karpus: I know that's not your point. I apologize.

Michael Cholod: All I'm getting at is that people anger-share, which is true. But they anger-share stuff that they don't even really look at. You know what I mean? They just get something from somebody, they don't even know who that person is, and they press a button and say, there you go, get that. Okay. And so, to me, I think what we need is a new dynamic where you kind of have to start again. You know, start building your network again, start building connections with real people again.

Henry Karpus: Yes.

Michael Cholod: You know, and if you choose at some point to expose yourself to a wider kind of net or a newsfeed, you can do it by choice. And if you don't like it anymore, you can turn it off.

Henry Karpus: Yes, you're in control.

Chris Trottier: Yes.

Michael Cholod: To me, that's the dynamic. The nut of this whole issue, for me, is that people no longer have, or feel that they have, any control over what they say, see, or do online.

Henry Karpus: And Mike, this is why Peer Social by Manyone is going to completely change the dynamic.

Michael Cholod: Well, we just want to change the dynamic. Doesn't it make sense that you should be the arbiter of what you see when you first open your newsfeed in the morning, rather than it being somebody shooting people?

Henry Karpus: And you control who becomes a friend; you have to allow that to happen.

Michael Cholod: Right.

Henry Karpus: You're not forced to choose anyone.

Daniel Bernhard: Yes, I mean, I think you're entirely, entirely right. Like, anything people can do to break this paradigm, you know, to connect in other ways, is really helpful. But I will come back to the example of the cigarette. I think individual willpower is necessary, but it's not enough, and it's not enough on a collective level. These products are designed to be addictive. I mean, did you guys watch the documentary The Social Dilemma that came out recently? It's a documentary in which the people who created the main social media products, from Gmail to YouTube to Pinterest to Facebook, the engineers and designers who really pioneered this stuff, all express their regrets. And the number one thing they point to is that it is addictive, that you don't really have control. I mean, like my phone, I'm looking at it right now. Okay, it's awesome, don't worry. But I see a little green notification pulsing, just like that, right? There's a whole team of behavioral psychologists who have studied what shade of green, how fast the pulsing...

Henry Karpus: Is that right?

Daniel Bernhard: ...to try and make me go and do this. And that's just built into my evolution: I'm told now that someone wants me, that something needs me. That's something that's really hard for us to overcome. And so I think we need to do what we can, and we need to create alternatives for people, but ultimately we also need to get the cigarette vending machine out of the preschool cafeteria. And that's not just a line, and it's not my line. Jim Balsillie, the former co-CEO of Research In Motion, the maker of the BlackBerry, and now the founder of the Centre for Digital Rights in Canada, is the one who said that, so I have to give Jim the full credit. But he's right. We need to get the cigarette machine out of the primary school cafeteria. These things are designed to be addictive, and they work. And we need to act with that in mind, because we're talking about kids getting addicted to this stuff. We're talking about us getting addicted to this stuff. Like, ask yourselves, guys: if I took your phone away for two hours, do you think you would be totally cool with it? You wouldn't sweat, or have a little bit of a nervous thing?

Michael Cholod: Are you kidding me? I can't even, you know, not have my hands on it when I wake up.

Daniel Bernhard: Exactly. So, you know, I'm no better. We're all addicted to this, and we're addicted by design, and we're addicted because someone makes money from us being addicted, right? And so we can try our best, and we can refrain, and we do, and some people make good progress, but it's not enough. It wasn't enough with tobacco, and it won't be enough with this. We need something more than that.

Michael Cholod: Awesome. Daniel, this has been a fantastic, fascinating discussion. I want to personally thank you for basically fighting the good fight. I want to thank you for the best artwork I have on my walls, the wanted posters. I think that's a great idea. I also want to point out that I got my first one, a little eight-and-a-half by eleven, by donating to Friends of Canadian Broadcasting. And that's not to toot my own horn; I'm trying to get people out there to know that you can get yourself a cool Mark Zuckerberg wanted poster, that type of thing. So I really want to thank you for this discussion. I mean, I think we could go on for another two hours on this thing, and the work you do day to day is fantastic. I think our listeners are going to have an unbelievable episode here. And, you know, I think we need to plan part two at some point.

Henry Karpus: You know, Daniel, that was fantastic. This wasn't just interesting; it was riveting. So really, thank you so much.

Daniel Bernhard: Well, thanks to you for having me, and also for doing the show. I mean, creating forums for this type of discussion is really important, and creating an audience of people who've come to understand these things is really important, because the powers that be, and I'm not trying to be a conspiracy guy here, but you know, the bulk of power and money basically doesn't have an interest in this conversation happening. So the type of regulatory measures we're talking about, the type of individual willpower we're talking about, the type of attitude where voters come to understand these things, so that when a politician knocks on the door they say, "What's up with this stuff? What are you going to do about this?" Your podcast is a really important ingredient in that. So you guys are really making a contribution, and it's been a pleasure to discuss this with you. Thanks for having me.