Insights@Questrom Podcast

SCOTUS: Exploring the Future of Content Moderation, Internet Law, and Free Speech

Boston University Questrom School of Business Season 2 Episode 6

Unlock the mysteries of online speech regulation with Kabrina Chang and Marshall Van Alstyne as they dissect the debates at the heart of Moody v. NetChoice and NetChoice v. Paxton. This conversation cuts through the legal jargon, promising to enlighten you on the intricacies of Section 230 and its pivotal role in the growth of social networks. Delve into the very fabric of digital communication law, as we peel back the layers of content moderation and explore the protection it affords to both social media companies and their users.

Imagine a world where every tweet, comment, or post you see is unfiltered—this episode contemplates such a scenario as we scrutinize the possible outcomes of a high court ruling concerning free speech on social media platforms. Our guests, armed with years of expertise, help us navigate the choppy waters of private enterprise versus public discourse, dissecting the responsibilities platforms hold as modern public squares. The session also reveals how adopting decentralized marketplaces and user-selected filters could empower listeners and revitalize the marketplace of ideas.

As we sail towards the horizon of this thought-provoking journey, we confront the looming question: Should social media be regulated as public utilities or should they remain private entities with editorial control? This episode doesn't just analyze the potential legal ramifications; it also evaluates the economic principles that could redefine governance in the digital realm. Our experts, Chang and Van Alstyne, provide an enlightening perspective on the delicate balancing act that lies ahead, ensuring the integrity of our online communities while fostering a robust environment for free speech. Join us in this riveting discussion that promises to shape your understanding of today's digital society.

J.P. Matychak:

Hello everyone and welcome to another episode of the Insights@Questrom podcast. I'm J.P. Matychak. Joining me, as always, is my co-host, Shannon Light. Shannon, how are you doing?

Shannon Light:

I'm good, thank you.

J.P. Matychak:

Excellent. Well, I know we're both excited for this topic. It's incredibly interesting, with wide-reaching implications, and incredibly timely, which is always good. So today's topic is an important one in the sense that it can have a lot of implications for online speech.

J.P. Matychak:

So earlier this week, the United States Supreme Court heard oral arguments in the cases of Moody v. NetChoice and NetChoice v. Paxton. The two cases stem from disputes over Republican-backed laws in Florida and Texas, passed back in 2021, that looked to restrict social media companies from moderating content on their platforms. NetChoice is one of several tech groups that represent some of the world's largest platforms, like Facebook and X. They filed lawsuits against the states, citing these restrictions as violations of their First Amendment rights. So we're joined by two great guests to help us make sense of these two cases and the potential impact of the potential SCOTUS decisions. Kabrina Chang is Clinical Associate Professor of Markets, Public Policy and Law. Professor Chang's research focuses on employment matters, in particular social media and how it impacts employment and management decisions, and on corporate social advocacy. Her work has been published in academic journals, in news outlets such as the New York Times, Bloomberg, and the Boston Globe, and in magazines such as Forbes and Harvard Review. Kabrina, welcome back to the show.

Kabrina Chang:

Thank you for having me.

J.P. Matychak:

So also joining us is Marshall Van Alstyne, Allen and Kelli Questrom Professor in Information Systems. He is one of the world's foremost experts on network business models and co-author of the international bestseller Platform Revolution. He conducts research on information economics, covering such topics as the economics of speech markets, platform economics, intellectual property, social effects of technology, and productivity effects of information. He's been a major contributor to the theory of two-sided networks, taught worldwide, and to the theory of platforms as inverted firms, applied in antitrust law. More recently, Marshall was the recipient of a $550,000 grant from the National Science Foundation to study misinformation and technology-aided societal structures to decrease the adverse impacts of fake news.

Marshall Van Alstyne:

Thanks, J.P. Pleasure to join you.

J.P. Matychak:

Excellent. So I want to start by talking a little bit about a foundation, if we could, to level-set on a topic that came up in a cursory way in the oral arguments. It may not apply immediately here, but I think it's important to level-set where we currently are with the statute. I want to talk about Section 230 of the Communications Decency Act. Kabrina, can we start with you? Just explain to us: what is Section 230, from a legal standpoint, from a statute standpoint?

Kabrina Chang:

Sure. So Section 230, as you said, is part of the Communications Decency Act. The CDA was passed in the mid-90s, essentially to combat child pornography online, and while Congress was debating what the wording of the CDA would be, at the same time there was this case happening called Stratton Oakmont v. Prodigy. If our listeners have ever seen The Wolf of Wall Street, that's Stratton Oakmont; it's a story about Stratton Oakmont. So this was back when there was Prodigy, and Prodigy had a chat room, a financial services investment chat room, and people had posted comments on that chat room about Stratton Oakmont that Stratton Oakmont thought were defamatory, like that they were frauds and criminals, which, you know, we later learned a lot about them. So Stratton Oakmont sued, saying, Prodigy, you're liable for this. And Prodigy defended, saying, I'm not liable for this; we didn't post it, and it's third-party content. This was before 230 was around.

Kabrina Chang:

So this was all happening at the same time, in a short period of time, and the court said, well, hold on, Prodigy. Actually, you're more than just a mere platform. Your user agreement says that you reserve the right to take down posts that are harassing or insulting, so you are exercising editorial control. If you are getting the benefit of editorial control, you're going to get the risk of editorial control, and that risk is that you're going to be liable for defamation. So that case came down, and legislators were, you know, worried that we would never grow this social media industry if every platform had to be worried about being liable for everything that was said on their platform.

Kabrina Chang:

So the CDA was written, again all in a very short period of time. When the CDA was passed, the ACLU and a bunch of library groups sued, saying the CDA violated the First Amendment because you can't tell people what they can and cannot post. So the CDA was essentially gutted, except it was amended to include Section 230. It was gutted of all of the restrictions regarding pornography and obscenity and things like that, but the liability protection for social media was the one thing that was preserved, essentially in reaction to the Stratton Oakmont case. So it has a little bit of a dramatic history, but 230 is really the only thing that has survived from that.

Shannon Light:

And Marshall, you've done quite a bit of research lately that touches on Section 230. What can you tell us about the impact it's had on these platforms and the companies that own them?

Marshall Van Alstyne:

Well, the impact comes from the two portions of Section 230; there are two different components. One of them is that platforms are not liable for what users post. That's the first portion. But they're also not liable for their own editorial decisions.

Marshall Van Alstyne:

That's allowed organizations like TikTok, like Facebook, like Instagram, to grow almost without bound, because, in some sense, whatever happens on there, they're unrestricted and not liable for it. And you can also appreciate the magnitude of the task: how would you moderate 500 million messages a day? That's almost impossible. It's clear that this has enabled the internet economies we know of today, where content is generated by users and the platforms are not liable for it. But at the same time, that also means we get a lot of information pollution and some things that they don't necessarily want to propagate. You're probably familiar, of course, with the whistleblower testimony of Frances Haugen before Congress, where perhaps Facebook was promoting speech that you wouldn't want to promote. So it's another interesting question. They've been shielded, then, from their own editorial choices in a way that a newspaper would not be.

J.P. Matychak:

Very interesting. Okay, so that's a good foundation to take the conversation to these two particular cases. So let's shift gears a little bit. And, Marshall, I want to start with you, because I think it's important to lay the foundation as to what was happening in the world when these two laws were quickly pulled together and passed. Could you talk a little bit about the circumstances that brought about these two laws in Florida and Texas, and the specifics of these laws, before we shift to the actual arguments in these cases?

Marshall Van Alstyne:

Well, I will say you're asking the economist about the laws. I would defer to Kabrina on the law there.

Marshall Van Alstyne:

But to give you a little bit of context, a lot of this happened in the wake of the insurrection and of Trump getting deplatformed. To give you more specifics: in Texas, the law is basically trying to ban viewpoint discrimination. A lot of folks on the right feel that conservative voices have been suppressed, that they have been censored. In fact, as some evidence, once Elon Musk took over Twitter, they released the Twitter files, which showed that there had been some suppression of conservative speech. In particular, there was the New York Post story around Hunter Biden's laptop; it had been suppressed on Twitter and Facebook as a potential Russian disinformation campaign.

Marshall Van Alstyne:

Similarly, in Florida, after Trump got deplatformed, the law was written in such a way as to make it extremely difficult for platforms to deplatform politicians. The law is written such that you can't deplatform either journalistic enterprises or folks running for office. So, in effect, if you want to misbehave, run for office and you can't be deplatformed. Now here's the interesting element. The upshot, what's now before the Supreme Court, is this interesting question: should platforms be treated as common carriers, like AT&T, which has to carry everything and for which it's not liable, or should they be treated more like publishers, which make editorial choices and therefore bear the consequences of those editorial choices? At the moment, they have editorial choices but no liability. So there's the question again: should we treat them as common carriers or as publishers? That's the core of the issue.

J.P. Matychak:

So let's now fast forward to these two cases. We have NetChoice, the group that's representing a number of different tech companies, filing lawsuits against the attorneys general of Florida and Texas, citing these laws as unconstitutional, in violation of First Amendment rights. Can you talk to us, Kabrina, a little bit about the central arguments in these cases, particularly around this free speech issue? And yeah, I'm going to play a clip, but I want to do it after we chat first.

Kabrina Chang:

So both parties are making First Amendment claims, the states and NetChoice.

Kabrina Chang:

Essentially, the states are saying that social media is a public square, and as a public square, like Boston Common or a public street, you cannot censor information based on political viewpoint, among other things. Interestingly, there was a case from 2017 in the US Supreme Court called Packingham v. North Carolina, in which North Carolina passed a law restricting registered sex offenders from accessing social media. The Court addressed the First Amendment, but only insofar as saying the North Carolina law was written so broadly that it's unconstitutional; it's not narrowly tailored to achieve the goal they wanted to achieve, which everyone agreed was a good goal. However, there's some interesting language in the Packingham case that says it's not just a First Amendment right to speak on social media, it's a First Amendment right to access it, because most people in 2017 got their news, communicated, learned, and participated in the marketplace of ideas there, even if it's texting ridiculous things or tweeting whatever kind of extreme viewpoint you might have. So that case is out there, and it's a pretty important case for this.

Kabrina Chang:

So the states are saying social media is essentially a First Amendment space. The companies, however, are saying, no, we're not; we are private companies. And NetChoice would take issue with Marshall calling it censorship. It's not censorship, it's editorial control, because only the government can censor. So the companies are saying, we are private businesses, we can exert editorial control. And, P.S., if you're passing a law that tells us we have to post things, that's a violation of our First Amendment rights. It's called compelled publication; you cannot compel speech. We are private entities with First Amendment rights, and forcing us to say things through our business is a violation of our First Amendment rights.

J.P. Matychak:

So I want to touch on this public square thing and I want to go to a clip from Justice Jackson.

Justice Jackson:

Back for a minute on the private versus public distinction. I mean, I think we agree that the government couldn't make editorial judgments about who can speak and what they can say in the public square, but what do you do with the fact that now, today, the internet is the public square? And I appreciate that these companies are private companies. But if the speech now is occurring in this environment, why wouldn't the same concerns about censorship apply?

NetChoice:

So, two reasons, Your Honor. One is, I really do think that censorship is only something the government can do to you, and if it's not the government, you really shouldn't label it censorship; it's just a category mistake. But here's the second thing. You would worry about this if websites, like the cable company in Turner, had some sort of bottleneck control where they could limit your ability to go to some other website and engage in speech. So if the way websites worked was somehow that, if you signed up for Facebook, then Facebook could limit you to only 19 other websites and Facebook could dictate which 20 websites you saw, then this would be a lot more like Turner.

J.P. Matychak:

So directly to your point, right? Marshall, your thoughts on this, and from both of you, actually, on this claim that it's not censorship, it's just a category mistake.

Marshall Van Alstyne:

He's relabeling a rose here, swapping one word for another. It's still a form of censorship whether it's done by a government or a private institution, so I wouldn't accept that argument at all. The other thing we have to be careful about is that in so many of these cases you're always encouraged to do counter-speech. In an earlier era, what would you have done? You would have set up your own printing press to reach a separate audience.

Marshall Van Alstyne:

The problem is, in this internet economy we're now dealing with network effects and other really strong monopolistic-style markets. So take, for example, the market power facing anyone trying to set up a social network: try to set up a competing social network now to reach another audience. Another element that's somewhat different: influencers bring their own audiences. If the platform interposes itself between you and your audience, that's a form of censorship. So it's not fair for him to reclassify it as a category error when, in fact, folks are simply reaching their own followers. And the challenge the platforms have in making that argument is that, yes, they're exercising their own First Amendment rights, but they're exercising their First Amendment rights over your First Amendment rights. That's the challenge: you still ought to be able to reach the audience that you brought to the platform.

Shannon Light:

So let's continue this conversation on free speech and listen to two clips, the first from Chief Justice Roberts and the second from Justice Kavanaugh, talking about the First Amendment.

Chief Justice Roberts:

So you began your presentation talking about concern about the power, the market power, and the ability of the social media platforms to control what people do. And your response to that is going to be exercising the power of the state to control what goes on on the social media platforms. And I wonder, since we're talking about the First Amendment, whether our first concern should be with the state regulating what we have called the modern public square.

Justice Kavanaugh:

In your opening remarks, you said, quote, the design of the First Amendment is to prevent suppression of speech, end quote. And you left out what I understand to be three key words in the First Amendment when you describe the First Amendment: by the government. Do you agree? By the government is what the First Amendment is targeting.

State of Florida:

I do agree with that, your Honor, but I don't agree that there is no First Amendment interest in allowing the people's representatives to promote the free exchange of ideas. This court has recognized that as a legitimate First Amendment interest.

Shannon Light:

So, Kabrina, based on what we're hearing, how might the High Court's ruling reshape the legal landscape of free speech and online speech?

Kabrina Chang:

That's a great question and really difficult to answer, because if the states win, and social media platforms are a public square that has to be treated like government property, a quasi-government space, what does that mean for other businesses? It's a slippery slope. Where do you stop? And by social media, those of us on this podcast might be thinking of Facebook and WeChat and Instagram and X, but social media has a definition about creating a user profile, being able to post things, being able to communicate. So that could be a lot of things. That could be some retail websites where you can post reviews and interact with other people who posted reviews.

Kabrina Chang:

So it's a significantly slippery slope when you think about what they're talking about when they say social media. That's one of the problems with the Florida law: it is so broad that the justices are wondering if it includes Uber, and can Uber not pick up a customer because of what they think the customer's political views might be? So that is something that really would have to be very specifically worded and narrowly tailored. If NetChoice were to win, and they're private companies, you do have this strange reality of a very small handful of companies controlling what we see and, to a great extent, what we do and how we feel, and that is a really unsettling feeling.

Marshall Van Alstyne:

So let me jump in with a thought on that. I think Kabrina has identified a genuine problem, and I think what's lost in this debate is the voice of the listener. All too often this is positioned as the free speech rights of the speakers versus the free speech rights of the platforms. But the listeners also have a right. It goes all the way back to Frederick Douglass saying slavery wouldn't stand if people could actually talk about it. If listeners can hear contrarian voices, they can endorse them or reject them; they can hear what they want. Then you get different outcomes, and the listener's voice has been left out of this.

Marshall Van Alstyne:

One of the things Chief Justice John Roberts was talking about in your recording was the market power of the platforms and using the state to intervene to correct that market power. The problem is both are wrong. We don't want the state to control our speech, and we don't want private enterprise to control our speech. So what I would argue as an economist is that we need to get to a point where we can create decentralized marketplaces where no one is in control. Let me give you one stepping stone toward a possible solution.

Marshall Van Alstyne:

It partly involves jurisprudence, partly legislation, but imagine you, as a user, had the right to choose any filter that you wanted. You could choose BBC or Consumer Reports or Breitbart or Fox News, as you wanted. Then the infrastructure could be protected, but the users choose the filters and the algorithms that they want, and you could create a genuine marketplace of ideas where the users are choosing those things, and not the government and not Mark Zuckerberg and not Elon Musk. You've got a balance of choices between the speakers and the listeners, creating a truer marketplace. So I think that might be one way, and again, I haven't heard this as part of the discussion, but we need to elevate the rights of listeners and create a marketplace on top of these platforms. We can go into some of the details of how we do that, but I think that's a better approach to the problem.

J.P. Matychak:

So I want to get to what I think is an underlying theme within this whole public entity, public utility, common carrier piece that kept coming up. Let me play a quick clip from Justice Gorsuch and his questioning of the state solicitor from Florida.

Justice Gorsuch:

You've analogized to common carriers and telegraphs in particular. Why is that an apt analogy here, do you think?

State of Florida:

I think it's an apt analogy, Your Honor, because the principal function of a social media site is to enable communication; it's enabling willing speakers and willing listeners to talk to each other. It's true that the posts are more public, but I don't think that Verizon would gain any greater right to censor simply because it was a conference call. I don't think that UPS or FedEx would gain a greater right to censor books because it was a truckload of books as opposed to one book. And so the analogy is indeed apt. And there's been talk of market power. Market power is not an element, I think, of traditional common carrier regulation, and indeed some entities that are regulated as common carriers, like cell phone providers, operate in a fairly competitive market.

J.P. Matychak:

So I think this gets a little bit to what you're talking about, Marshall, this whole notion of: are these news agencies? And look, I'm not a legal scholar, and I'm not an expert in misinformation and platforms. But as I listened to these arguments and heard this public utility versus private entity debate, I started to get confused a little bit and to question my own thinking. Because I started thinking: well, if they really see themselves as editors, are we who post on there all sort of freelance journalists, getting to post our stuff while they get to say what we see and what we don't see, taking that editorial control? Or are they more like a facilitating platform for us to share these ideas? You really can see the arguments on both sides of this issue. So: thoughts, as you listened to this piece of the argument, on the public utility versus the private company?

Kabrina Chang:

I don't think it's either of the above, and I think your confusion, your struggle, is a legitimate struggle, right? You know, the New York Times or Breitbart, they have editors. Mainstream news media has fact-checkers, and they have standards and everything else. TV has the FCC, so Janet Jackson gets fined.

Kabrina Chang:

Social media is none of the above, and I really am interested to hear Marshall's take on this. My take is that you don't have Verizon or the New York Times with engineers manipulating the platform every day. Every time we look at these websites, they are manipulating what I see. So it's not like me using Verizon to call J.P., where Verizon is neutral. This technology is not neutral. That's not what's happening on Instagram and Twitter. They are manipulating us every single time. They have engineers using neuroscience to keep us addicted, to keep us scrolling, to hit us with dopamine to keep us on there, because that's how they make money. The revenue model incentivizes us to stay on there, and divisiveness drives engagement. Engagement drives ad revenue. So it is not a neutral platform; it is not a highway; it is not a phone line. I think that is a misleading argument that ignores the very business model of social media as it currently exists, which is what I think these laws are, in some way, potentially trying to address.

J.P. Matychak:

You can talk about "you're censoring me" and this and that, but I think that's core to some of this argument: as it is currently, you do have this model, but is that the model that should be?

Kabrina Chang:

Even the laws in Texas and Florida do nothing about the science behind the manipulation; they address just the content. So I just don't think it's an apples-to-apples comparison. And it is a legal argument that has been going around and around in other cases, where plaintiffs are trying to impose liability on social media companies for physical injury they've sustained from content online that has moved into real life.

Marshall Van Alstyne:

Let me jump in with two quick thoughts. The first is just picking up on the arguments of the attorney here. They're wrong about some of these arguments. They say there's no monopoly power, but if you go back in the history of AT&T, they were broken up because they had monopoly power, and that was one of the reasons we got the common carrier rules: you want everyone to be able to have access. Kabrina is also completely right that these are not neutral conduits. It's not like a phone line; it's not like a telegraph channel. They are indeed manipulating for private gain. There's a wonderful phrase describing what they do: if it's enraging, it's engaging. And they use machine learning algorithms to promote the most enraging, most engaging content. They are allowing individual users to light fires so they can pour on gasoline and sell ads while people watch the neighborhood burn. So it's a rough model, and they are in fact engaged in policing content. But back up a moment. We also do need somebody to police the content.

Marshall Van Alstyne:

We need them to take down terrorist recruiting, sex trafficking, pro-suicide content. We need that kind of thing; somebody needs to take responsibility for it. As an economist, I like to reposition this as a different kind of problem. I would call it a pollution problem, because all of these are externalities: damage that occurs off-platform. It's insurrections that occur off-platform. It's lynchings that occur off-platform. It's the loss of herd immunity that occurs off-platform.

Marshall Van Alstyne:

When Zuckerberg spoke before Congress, he said, and I quote, we didn't take a broad enough view of our responsibility. That's clearly an example of an externality. Now here's my explanation for why we're at such an impasse, and the good news is it then leads to some solutions. My view is that we're at such an impasse because we do need content moderation, but the design of Section 230 is such that no one is responsible for the pollution problem and no one can be held accountable for it. If you're in print and broadcast, you are liable for your editorial decisions. In social media, you are not: you're not liable for the users' content, and you're not liable for your editorial decisions. So we have two proposals to try to fix this kind of problem, one of them legislative, one of them for the courts. The first is to make this decentralized, so it's not one individual party. I don't want Elon Musk or Mark Zuckerberg or the head of TikTok choosing with their machine learning algorithms what I get. That's why I want a true marketplace in listener-decided algorithms and listener-decided filters. They could be open source ones that you yourself could modify, and then you'd get a true exchange. What you then need to do is to reattach the liability to the editorial decisions. So if we separate out the infrastructure, of course we grant it complete immunity, the same way the common carriers have it now, but we reattach the liability to the editorial decisions, so that if there are lies and defamation happening, someone can in fact be held liable. The technologist then objects, of course.

Marshall Van Alstyne:

How do you deal with the 500-million-messages-a-day problem? Once you see it as a pollution problem, it's actually really easy: you hold them to a flow rate of accountability. If a factory is putting out dioxin, you hold it to a flow rate; you don't check every molecule.

Marshall Van Alstyne:

If a doctor checks your cholesterol, he or she takes a blood sample; they don't take every drop of blood. That's not possible. The way you do it is to hold them to a flow rate of pollution, and then you can deal with 500 million, a billion messages daily. If you reattach liability to editorial decisions, then we have someone who is responsible for the pollution problem, and we can in fact hold them accountable. So I think this combination, in which we preserve the absence of liability for the infrastructure, as with common carriers, create a marketplace on top where anyone can have the filters they want, but attach liability for editorial decisions to those filters on a flow-rate basis, can actually help solve the problem. But it's a combination of court decisions and legislative decisions to get us to a marketplace that becomes self-cleaning, as opposed to having the government or Elon Musk do it.

Shannon Light:

And I think this is a good time to listen to an exchange that Justices Alito and Kagan have with the attorney representing NetChoice.

Justice Alito:

Does the Florida law cover Gmail?

NetChoice:

The Florida law, I think, by its terms could cover Gmail.

Justice Alito:

All right. So does Gmail have a First Amendment right to delete, let's say, Tucker Carlson's or Rachel Maddow's Gmail accounts if they don't agree with his or her viewpoints?

NetChoice:

They might be able to do that, Your Honor. I mean, that's obviously not something that has been the square focus of this litigation, but the lower courts could address it.

Justice Alito:

If they don't, then how are we going to judge whether this law satisfies the requirements of either Salerno or overbreadth?

NetChoice:

So, you know, again, I think it's the plainly legitimate sweep test, which is not synonymous with overbreadth. But in all events, since this statute applies to Gmail, if it applies at all, because it's part of Google, which qualifies over the threshold, and it doesn't apply to competing email services that provide identical services, that alone is enough to make every application of this statute unconstitutional.

Justice Kagan:

I mean, how could that be?

NetChoice:

Go ahead.

Justice Kagan:

How could that be, Mr Clement? It's not unconstitutional to distinguish on the basis of bigness, right?

NetChoice:

It is when you're regulating expressive activity. That's what this court said in Minneapolis Star. So the statute in Minneapolis Star was unconstitutional in all its applications.

Justice Kagan:

You're saying, if there were no issue here of that, that this is really a subterfuge, that they were trying to get at a certain kind of media company because of their views, and the only issue was that it's not worth it to regulate a lot of small sites, you know, we only want to go after the big sites that actually have many millions of users. You think that's a First Amendment violation?

NetChoice:

I do. The way you're asking the question suggests you think that's a harder case than the one I actually have before you.

Justice Kagan:

I think it's a little bit of an impossible case to say you can't go after big companies under the First Amendment.

Shannon Light:

So, on that, Marshall, how do you see this classification question impacting the way social media platforms moderate content?

Marshall Van Alstyne:

I'll be candid: as an academic, I have never liked threshold tests of that sort, because you'll get really exotic behavior on either side of the boundary. So I think the bigness test alone is not a good test. What we would want would be the equivalent of market power tests, of consumer welfare tests. Is harm occurring? Is there anticompetitive behavior? Is there suppression of speech? Those, I think, are better ways to look at the problem than bigness alone. So I think that argument is not a legitimate argument. Kabrina?

Kabrina Chang:

Yeah, and in much of First Amendment jurisprudence, size doesn't necessarily matter. In fact, in other First Amendment cases, like Citizens United, the Supreme Court was very clear in its majority opinion that a business has First Amendment rights just like we human beings have First Amendment rights. And if we're talking about money, as in Citizens United, the fact that they have more money than me is not part of the First Amendment analysis.

J.P. Matychak:

Okay, so I want to talk now a little bit about the options before the court. One recurring theme throughout the oral arguments was just how broad these laws are. I want to play two clips, one from Justice Sotomayor and the other from Justice Barrett.

Justice Sotomayor:

This is such an odd case for our usual jurisprudence. It seems like your law is covering just about every social media platform on the Internet, and we have amici who are not traditional social media platforms, like smartphones and others, who have submitted amicus briefs telling us that readings of this law could cover them. This is so, so broad. It's covering almost everything.

Justice Barrett:

So Florida's law, so far as I can understand it, is very broad and we're talking about the classic social media platforms, but it looks to me like it could cover Uber.

Justice Barrett:

It looks to me like it could cover Google search engines, Amazon Web Services, and all of those things would look very different. And Justice Sotomayor brought up Etsy. Etsy now has a "recommended for you" feed, right, but it also just has shops for handmade goods. It looks a lot more like a brick-and-mortar marketplace or flea market than a place for hosting speech. So if this is a facial challenge, and Florida's law indeed is broad enough to cover a lot of this conduct, which is farther away from expression than the standard social media platforms, why didn't you then, in your brief, defend it by pointing out, look, there's all this other stuff that's perfectly fine that Florida covers? We don't want, you know, some person who wants to sell their goods on Etsy to be suppressed because it's, you know, handmade goods that express a political view.

J.P. Matychak:

So let's start. I have two questions. One: what is a facial challenge? And two: let's talk through the options that the justices have with these two particular cases.

Kabrina Chang:

Sure, and I think they are related. When you're challenging a law as unconstitutional, it generally takes one of two approaches: a facial challenge or an as-applied challenge. An as-applied challenge would be something like: hey, we're Instagram, and your law is unconstitutional as it applies to our speech, and here's our speech; all this other stuff might be fine, but these sections, as they apply to me, are unconstitutional. And usually with a challenge like that, because if I'm Instagram and I'm saying it's unconstitutional as applied to me, I have some evidence and I have a record. If you're doing a facial challenge, what that means is: you wrote this law, and this whole thing is unconstitutional. It could be sections of it, too, but looking at this law as written, it's unconstitutional. So you are challenging the law as it's written, not as it's actually applied to a person. And that's fine; it's just a slightly different kind of legal argument, and it's a different evidentiary record, as is often the case with facial challenges.

Kabrina Chang:

Well, in this case, I think what the justices are saying, in terms of a facial challenge, is: we don't have a lot of evidence about how these laws actually play out, because you're just challenging them as written.

Kabrina Chang:

They were just written and passed. We have no application to an individual person and how it impacted them in their life or their business. And so, listening to the oral arguments, several of the justices were a little bit concerned that they didn't have a tremendous evidentiary record on which to base a decision. So, leading into the second question about how this could play out: one option is that they remand it. They send it back to Texas and back to Florida and say, build up an evidentiary record, because we don't have enough here to make a good decision on whether or not this law as written is unconstitutional. I think also what they're getting at with Florida is that, when you're talking about the First Amendment, the law has to be what's called narrowly tailored. It has to address the speech that you mean and nothing more.

Kabrina Chang:

It has to be narrowly tailored to do just what you want it to do. And I think what Justice Kagan was saying was that this is not; this is overbroad.

Kabrina Chang:

They kept mentioning overbroad. So they could remand it, and it sounded like some of the justices wanted it remanded so they could get a better evidentiary record. The other thing they could do, like most appellate courts, is overturn the decision of the lower court for a variety of reasons. They could also make a decision and then remand it back down and say, have another hearing consistent with the decision we just issued and the reasons we think you got it wrong the first time.

J.P. Matychak:

As I understand it, these cases are before the Court because injunctions were sought by the groups, by NetChoice, and the injunctions were granted until the case could be heard. So if this goes back down to build more evidence, do the injunctions stay? Do they vacate the injunctions and allow these laws to go into effect so that they can get the evidence into the record? How do you see that potentially playing out?

Kabrina Chang:

Well, they could still keep going with the facial challenge. One of the elements of getting a preliminary injunction is showing a likelihood of success on the merits of the underlying case. So there's something there; they have to show other things, but there's a likelihood of success, so it's not a ton of evidence. If they were to send it back down, I would think that they would keep the injunction in place and ask for more development.

Marshall Van Alstyne:

Okay, one or two other quick thoughts. You can appreciate how hard this problem is. The Texas and the Florida laws are relatively similar, and it's interesting that the Fifth Circuit and the Eleventh Circuit came out in opposite directions, one upholding the Texas law and one rejecting the Florida law. So the circuit courts actually came to different decisions, and they're having a really hard time. There are the categorical issues here: speech on Etsy, speech on Uber, speech on these other things. Are you doing viewpoint discrimination? Does this include pro-ISIS statements? Does it include pro-Israel statements? Do you really want states or governments making those choices? Those are going to be challenges they'll face under First Amendment law that are really very hard to meet. But then there are the broader questions of common carrier versus needing some sort of editorial control to remove the truly illegal content, and who then has the right to do that. This is going to be a tough set of choices for them.

J.P. Matychak:

Yeah, and I think that's a good segue. Shannon, you had a final question you wanted to ask? Well, quasi-final, because I have one more.

Shannon Light:

If these laws are allowed to stand, I imagine the decisions could have far-reaching impacts on the role of social media platforms in society. Can you maybe lay those out for us?

Marshall Van Alstyne:

Well, the consequences of one or another decision are immense. You can play it out in a couple of different dimensions, and I'll be honest: as an economist, I do not feel comfortable calling how the Supreme Court is going to rule one way or the other, so I don't want to make any predictions on that. But if you were to fork the decision: suppose, for example, platforms are ruled to be common carriers. Then they're going to have to carry pretty much everything. And interestingly enough, there could be a number of responses.

Marshall Van Alstyne:

Someone at Stanford had a marvelous observation. They said, okay, one thing you could do is simply cut off service in Texas and Florida and see what happens. Or you could give them the firehose of porn and spam and pro-ISIS and pro-suicide content that they've asked for, because that's literally what the law would require the platforms to carry. At the other extreme, if platforms are granted their rights to edit the speech, then they would continue pretty much unimpeded as before. It would be interesting to see; I would be surprised if it were no caveats whatsoever. Although there was another interesting pair of cases earlier, Gonzalez v. Google and Twitter v. Taamneh, where the platforms were allowed to remain immune from accountability for terrorist recruiting that had taken place on their platforms. So under those decisions, I imagine they might still be able to continue, because that's in some ways even more harmful than some of the political speech we've heard about so far.

Kabrina Chang:

And I wanted to add to what Marshall was saying: if the decision turns on whether or not social media platforms are a public square, the government can still regulate speech. The government regulates speech all the time. We can't incite imminent lawlessness; there are hate crimes; commercial speech is regulated. So it's not that it would be without regulation; it would just be the regulation we already have, which some people are not happy with. I mean, the Westboro Baptist Church protesting at funerals of veterans, there are very few people who are happy to see that, but that's what our First Amendment allows. So it's not like there would be no regulation. There would be some, but it is not what Facebook and YouTube currently do in taking down the torture and pornography and that kind of content.

J.P. Matychak:

And it seems like that was something that was brought up on occasion during the oral arguments, this whole notion that there are probably some areas of content where we can all agree we don't want that out there. But it goes back to this other thing: when it's not narrowly defined, how do you enforce it?

Kabrina Chang:

And that's the part, well, to Marshall's point: who's "we," exactly? We say we can all agree, but someone out there is posting that content. And with Marshall's idea that we get to pick our own filters, who is "we"?

J.P. Matychak:

Yeah. So Marshall has abstained from taking a guess as an economist. You, as a lawyer and a legal expert, where do you think this falls? The likelihood of being remanded or dismissed?

Kabrina Chang:

I listened to much of the oral argument, and the questions were all over the place. I could see no consistency with traditional conservative versus liberal ideology, and, you know, with most of the justices' decisions it's a mix; it's not really an ideological divide on many of their cases. But I couldn't figure it out. It would not surprise me if it were remanded. I don't know if that's what they'll do, but I wouldn't be surprised. That's as close as I'm going to get.

Marshall Van Alstyne:

I don't want to call what I think the Supreme Court will do, but I can say what I would prefer to see happen; that I'm entirely willing to take a stand on. Again, I would like to bear down on a couple of economic principles. I don't want monopoly control of speech, and a lot of social networks have undue market power, so I'd rather see decentralized markets where no one has control. I don't want to see pollution problems in speech where no one is held accountable for lies and defamation. We need to restore some accountability for the things that have happened.

Marshall Van Alstyne:

So, again, some of the proposals I would like to see put in place, and they may be legislative as well as judicial, would be to preserve immunity for the infrastructure, as we have for the common carriers, but to restore some of the liability for editorial decisions, and to do so, as we said, on a flow-rate basis, which makes it easy to handle at scale. Then you get a true marketplace that is decentralized: listeners can get the information they want, you can get clean information, and no particular party is controlling it, while we also clean up the pollution at the same time. So we solve the monopoly problem and the pollution problem together. That would be the economist's answer to this rather difficult challenge.

J.P. Matychak:

It certainly is. It's incredibly complex, and it's making us all think about what we say, how we say it, where we say it, and what the underlying motivations are for how these platforms operate. So I want to thank Marshall and Kabrina for joining us today. Thank you both so much. This was great. I certainly learned a lot, and hopefully our listeners did as well, and Shannon as well.

Marshall Van Alstyne:

It's a real pleasure. Thanks for having us.

Kabrina Chang:

Thanks for having me.

J.P. Matychak:

Well, that's going to wrap things up for this episode of the Insights@Questrom podcast. I'd like to thank again our guests: Kabrina Chang, Clinical Associate Professor of Markets, Public Policy and Law, and Marshall Van Alstyne, Allen and Kelli Questrom Professor in Information Systems. Remember, for additional information on this show, our previous shows, and additional insights from Questrom faculty on the world of business, visit insights.bu.edu. For my co-host, Shannon Light, I'm J.P. Matychak. So long.