The Screen Lawyer Podcast

AI’s Role in 2023 Writers’ Strike: New WGA Contract #114

October 11, 2023 · Pete Salsich III · Season 1, Episode 14

Did you know that generative AI engines played a key role in extended negotiations during the recent WGA Writers’ strike?

In this episode, Pete Salsich explores the factors involved, the newly negotiated WGA contract, and the future of the industry.

Original Theme Song composed by Brent Johnson of Coolfire Studios.
Podcast sponsored by Capes Sokol.

Learn more about THE SCREEN LAWYER™ at TheScreenLawyer.com.

Follow THE SCREEN LAWYER™ on social media:

Facebook: https://www.facebook.com/TheScreenLawyer
YouTube: https://www.youtube.com/@TheScreenLawyer
Twitter: https://twitter.com/TheScreenLawyer
Instagram: https://instagram.com/TheScreenLawyer

The Screen Lawyer’s hair by Shelby Rippy, Idle Hands Grooming Company.

Transcript

On this week's episode of The Screen Lawyer Podcast, I'm going to dig in a little bit to the brand new agreement reached between the Writers Guild and the major studios, particularly as it relates to artificial intelligence. There are some really important provisions that are actually quite groundbreaking in the whole context of labor relations, but particularly as it relates to creative works. Stick around.

Hey there. Welcome to The Screen Lawyer Podcast. I'm Pete Salsich, The Screen Lawyer, and today I'm going to spend a little time digging into the new agreement reached between the Writers Guild and the major studios. As many of you know, the Writers Guild was on strike for 148 days, one of the longest strikes in its history, and a really important strike in a lot of ways. For one thing, it's an example of how unions work in collective bargaining environments; sometimes they do go on strike for important reasons, and people can have all sorts of discussions about the validity of that. But the reality is that in the entertainment industry in particular, these are very, very important mechanisms that establish key rights for the creative people who work in these industries. The writers were arguing about a variety of things in the course of a contract negotiation, and it's pretty typical for unions and management to disagree about financial terms. But here, I think one of the key things that drove this whole thing was the sheer explosion of artificial intelligence.

Think about the timing we're dealing with here. This strike happened earlier this year. These agreements are for three-year terms, and the prior agreement was coming to an end. Negotiations always start well before the agreement ends, with the hope that a new agreement will simply replace the old one. Well, of course, late last year began the explosion of ChatGPT and generative AI appearing in lots of different environments very, very quickly, and there are lots of people concerned about, well, is AI going to take my job, that sort of thing. Right. There have been lots of discussions about that here at The Screen Lawyer; we've been tracking the implications AI has for copyright law primarily. The Copyright Office came out relatively strongly early in the year with a clear ruling explaining that only works of human authorship are entitled to copyright protection. So a work generated primarily by a generative AI like ChatGPT doesn't have any copyright protection; it essentially goes straight into the public domain because there was no human author. Now, there are variations. If a human adds significant work to it, uses it only as a starting tool and then rewrites and finishes it, then there may be more than enough human authorship to make the work protectable by copyright. But that's the continuum. And early on in the negotiations for the Writers Guild agreement, before the strike, the Writers Guild was proposing certain terms, which we mentioned on a podcast episode a while ago, that would deal with these issues. Well, now the agreement is out, and the Guild got essentially all of those terms, which is pretty important. And let's take a step back, too, because I think this is something that's worth noting.
It's not so much a copyright issue as a tech issue. In most industries, overwhelmingly, the workers don't have any say whatsoever about the inclusion of new tech in the workplace, whether it's robotics or, in this case, AI or other things. Management just says, hey, we're using new tech, and workers adapt to it. That's pretty much the case, and then you're fighting over whether the tech applies this way or that way. But here, as the tech is being developed, the workers were given essentially the ability to sit at the table. Now, I'm going to get into in a minute some reasons why the studios ultimately needed this to be here now rather than wait. But it was interesting to include the workers in this emerging tech environment, because in a way, it hasn't even really landed yet. Interestingly, in the particular part of the WGA agreement that addresses AI, the opening statement is basically an acknowledgment by both parties that this technology is still being developed. It hasn't been fully worked out, and it even reflects a recognition that the courts haven't weighed in yet. So they've built in a twice-a-year review process where representatives of the WGA and the studios will evaluate what the courts are saying and where the tech is heading, and they reserve the right to modify certain provisions, not, I think, to change the main principles, but to adjust as the tech itself adjusts and as courts rule on copyright infringement issues, things like that. If we see new legislation, a new version of the DMCA, for example, that governs AI, that would perhaps impact the WGA agreement in the future. I think it'll certainly impact the next agreement three years from now, but the precedent has been established: AI is on the table, human authorship is protected, and that, I don't think, is going to change. And I think that's pretty important.

Now, there are a couple of baseline principles here, and in a way it can almost be summarized as simply as this: AI cannot replace the human writer, and AI cannot be used to minimize the wages, the revenue, or the compensation of writers. You can't pull AI in and therefore say the writer did less and, under the formula of the agreement, the writer therefore gets paid less. That's not permissible. Pretty profound principles. AI cannot replace the human writer, and it cannot be used to reduce their wages. I think that could have implications far beyond the entertainment industry, and it'll be interesting to watch. But those are important things to keep in mind here, because as we've been talking about all along, this is still being worked out. Courts have not weighed in on the ultimate fair use issue about scraping images and works from the Internet and elsewhere to train the large language models. We talked about some lawsuits filed earlier this year, the Sarah Silverman lawsuit being one of them, that are going to test this very issue on fair use principles. We'll be following that closely and talking about it more. But interestingly enough, the WGA agreement actually has a provision that says nothing in this agreement prevents the writers from later claiming, and suing to challenge, the use of their material for training. So even if the writer writes works that are works for hire for the studio, if the writer's contributions are scraped into a large language model to train AI, it's possible that that writer may have a cause of action.
And just being part of this agreement doesn't keep them from that lawsuit. That's a very important right to have been reserved for the writers here, and I think it's a recognition that everybody is still waiting to see what happens. But there's another reason I think the studios ultimately had to give in on this issue, and it's a basic copyright principle. We've talked multiple times about how the entertainment industry, certainly the screen industry and the music industry as well, literally depends, the entire economic structure depends, on there being one owner of the copyright in the entire work, with no gaps. So all of the people who contribute to that work sign work-for-hire agreements. Footage that gets pulled in, music that gets licensed, every single act that goes into the creation, and every single output of those acts, is covered by a work for hire, an assignment, or a license, whatever it takes to make sure that there is one copyright owner. Well, if the studios refused to acknowledge that human authorship is primary here and allowed AI to take a more active role in the creation, it's quite possible that they would not end up owning the copyright in the screenplay, for example, which means somebody else could make a movie based on the exact same screenplay, because there's no copyright in non-human authorship. AI-generated work doesn't have copyright protection. Similarly, AI, ChatGPT, is not a person, so AI can't assign any rights it might have in any work. There's no way for AI-generated work, if it isn't considered human work, to become part of the copyright owned in a film or television show by the studio or the network. So ultimately, at the end of the day, I think the studios had to agree to these provisions so they could protect their own copyrights. I don't know if that's going to apply in other industries in quite the same way, but it's pretty interesting. It's not necessarily a complete victory for the writers; I think it's a recognition by everybody involved that the industry thrives and survives based on copyright law, and copyright law has made it very clear that AI doesn't get copyright protection.

So those are some of the big takeaways. Now let's break this down a little further, starting with a couple of defined terms. If you're interested in getting into this more, feel free: the WGA has the entire memorandum of agreement available for download, along with a great summary, at wgacontract2023.org. You can find all of this information there. But I want to focus on a couple of key things that matter. There are a few key terms: source material is one, and literary material is another. These terms get used in this year's agreement because they're fully understood to have meaning from past agreements. Literary material is basically what a writer produces; it is the output of the writer's craft. If a writer is hired to write a new screenplay and there is no previous source material, then the whole screenplay is literary material of the writer. But if the writer is writing a new screenplay based on an already existing book or something like that, then there is source material, and the source material is that book.
The reason these things matter is the complex compensation and credit system set up under the WGA agreement: in other words, you get paid more for writing entirely new material than you do for adapting source material. There's also material that is provided to the writer by the studio, whether it's source material or something else. For example, if a writer rewrites somebody else's original work, the second writer gets credit, but that credit is only partial, because the first writer who wrote the other material gets some credit as well. And this all matters because residuals, payments, second-step opportunities, separate rights, all of these other defined terms, depend on credit. You may think, well, credit is cool, right, because you see your name on a screen, but it's not really vanity that drives it. The entire economic system is based on how you are credited. And as I've seen over the years, once you get a credit at a certain level, you fight like crazy not to ever go below that level. If you were once a producer, you don't want to go back to being a co-producer. If you're an executive producer, you want to stay at the executive producer level. For writers, writing credit is a little more directly tied to a union agreement and to specific wages; minimums, residuals, things like that all depend on credit. So credit is very, very important.

This new agreement says that AI-generated material is not source material. Let's say the studio initially uses ChatGPT to develop the basis of a script, the basis of a story, and then gives that to a writer. It can do that. But that AI-generated material is not source material and is not considered material provided by the studio to the writer, so when the writer then rewrites it, puts their own human touch on it, and turns it into copyright-protected material, the writer gets credit from the start for the whole thing. The entire output is considered literary material from the writer, and that goes into the formula. Very, very important. Now let's flip it. That's what happens if the studio gives AI-generated material to the writer; the reverse is also interesting. The studio is not allowed to compel a writer to use AI, to use ChatGPT. You cannot be required to do so. Some writers may find it offensive, may not work that way, who knows, but they cannot be required to do it. If they want to use it, though, if they want to use ChatGPT to get started, they can, provided they get the consent of the studio. In other words, this all has to be disclosed and agreed upon ahead of time, and then they also have to put enough of their own human rewrite, their own human authorship, into it so that it gets copyright protection. The basic copyright rules still apply; the contract doesn't change those. It just makes very clear the steps that both the writers and the studios are going to take to ensure that the materials created by the writers are eligible for copyright protection, and also that the writer gets credited for the work even when it started through an AI engine. I think that's really important and powerful, and it goes back to those principles: we're not trying to replace the human workers, and we're not going to use AI to minimize what you would otherwise get for creating this. The other thing is that the studios have to disclose AI-generated material to the writer.
So let's say the studio gives the writer source material or, you know, different pieces and ideas put together, not necessarily a preexisting book, and some of those materials were created by AI. The studios are required to disclose that to the writer, so there's no dispute about it. And interestingly, as I said before, the Writers Guild reserved the right to later challenge any exploitation of the writer's material by a large language model, meaning its use to train the AI. That question is going to get decided eventually, but assuming there's a finding in any way, shape, or form that the large language models' use of other people's copyrighted information is not fair use and therefore must be licensed, then this reserved right is going to be very, very important. So I think it's significant that the writers reserved that and were able to do so.

So what does this mean going forward? Well, first of all, the writers are back, right? The first thing we're going to notice is late-night comedy television, because those shows go on the air and the writers are cranking things out immediately. It's going to take a little longer for us to see the impact on television and film, because the actors are still on strike. SAG-AFTRA is still on strike, and there are AI implications at issue in that strike as well, more so around the idea of essentially replacing humans, especially extras, with AI-generated versions. There are, you know, proposals that the extras could all show up one day, get paid for a one-day shoot, be recorded and filmed, and then be changed into different costumes, put in different scenes, and used again and again. Obviously, that is very much artificial intelligence replacing a human worker and being used to reduce human wages. So again, the studios are essentially the same group on one side, whether they're facing the actors or the writers, and I suspect there's going to be some type of similar AI protection for the actors. Interestingly, though, the studios don't have the same need to protect copyright in the actors' images, because those aren't part of the copyright. Those have more to do with actors' rights to get residuals for things they appear in and, frankly, simply the ability to even get hired in the first place, as well as things we've already seen examples of: deepfake versions of celebrities showing up in very realistic settings, used to sell products, to promote things, maybe to promote ideas that are very contrary to what the celebrity believes in. That technology is already out there. It's already happened. I suspect that, at least in the context of movies and television, there will be provisions in those union agreements that at least keep the big signatories from exploiting those things in the wrong way. I hope that's the case; frankly, at this point I'd be pretty surprised if there wasn't analogous language in that agreement when it gets signed, hopefully soon. But we're seeing the writers back, at least in the places where they can get back to work right away, and it'll be interesting to see. I'm sure we'll see late-night jokes about the writers, and some late-night or daytime hosts, see Drew Barrymore, having to deal with the fact that they didn't support the writers at the time, or didn't appear to support the writers at the time. That's going to be interesting to watch play out.
But those are the types of things that tend to come out after the end of a strike anyway. I'm more interested in seeing where this leads in the AI world. In an earlier episode, when this was all beginning to play out, I suggested that where we're headed eventually is some version of a new Digital Millennium Copyright Act, an amendment adding a new provision, or maybe something entirely new, but some sort of legislation that helps establish standards. Going back to the DMCA era and the Napster cases, the early music file-sharing cases: part of the challenge against those companies was the argument that the new platform's main purpose, really its only valuable purpose, was to permit copyright infringement. Sharing files you didn't have the right to, making a digital copy and sending it to others, was copyright infringement; that really wasn't in dispute. The argument was more about the Napster platform in and of itself. Did it have some independent value, some independent use? And the argument was that it really didn't. But what came out of that era with the DMCA was the recognition that platforms like YouTube, and now Facebook and others, certainly are capable of being used to enable copyright infringement. There's no question lots of copyright infringement happens on YouTube every second, but YouTube has a lot of other purposes, and the DMCA was set up to recognize that, as long as platforms like YouTube had a notice-and-takedown process, and it put the onus on the copyright owner to find the infringement. Of course, that led to the development of all sorts of algorithms and bots to do just that. It led to new technology. Once the policy decided this is your burden, that group said, okay, we need to develop tech so we can spot infringement. And as long as we then notify the YouTubes of the world and the material comes down, YouTube can continue to exist. I think we're going to have some similar analysis underlying these AI models. Do they exist primarily to enable copyright infringement, or is there some way to satisfy all parties and allow the tech to exist, but build in something like a notice-and-takedown provision? I don't know if it's going to play out the same way, because part of the problem with using hundreds of thousands of images or texts or whatever to train the model is that there really isn't any way for the individual rights holder to know whether their work was accessed. That's why in the Sarah Silverman lawsuit the claim is essentially, we believe our work was used because the model can produce summaries that are so accurate. Ultimately, though, there's going to be a proof issue there, and that's going to be a challenge. I don't think it's necessarily exactly the same as the Napster cases, but it's that underlying thought process: where do we want to place the burden here, on the training models, on the rights holders, or somewhere else, to allow the tech to exist without destroying human productivity and without changing copyright law? It's going to be interesting to watch. And in the meantime, we're going to have contracts that don't wait for the courts. Right. That's what we have here. We have an industry that needed to get back to work. We have studios that need to protect copyright, and AI-generated material doesn't get copyright protection, so they kind of had to give on that somehow.
The writers needed to make sure they weren't going to be diminished, reduced, or cut out altogether because of this new tool, and they've reached an accommodation that I think is workable. We'll see. It's going to get ... it's going to continue to evolve. But I think it's a pretty profound first step, and I'm happy to see how it turned out. That's where we are at the moment. We're going to continue to follow this, and we're going to continue to follow SAG-AFTRA; hopefully that resolution is not too far down the road, and we'll get to dig in and analyze it. But in the meantime, I'm looking forward to something new on my screen. I'm kind of exhausted. I shouldn't say that; it's not like I've watched everything. There are a million things. But you know what's currently on my screen right now? I'm catching up with shows that were made three or four years ago, partly because it's kind of fun to start something that has multiple seasons, right? If I know I like it, I'm going to have a whole bunch of episodes still to watch. So one my wife and I have just started watching is “Physical” on Apple TV. I don't know if anybody's watched it. It's kind of wild, got a lot going on, but I think it's very well written and really interesting. A lot of it is set back in 1981, the early eighties, a time which I remember, so that's kind of interesting too. But that's what's on my screen right now. In terms of my pure end-of-the-day energy, I just sit back and enjoy really, really good writing and good acting as well.

One last point. There's a lot else in the WGA agreement that's worth talking about. There's some recognition that streaming revenues are different and need to build in some new performance-based residuals. When a streaming show really hits, the writers should benefit, and that now exists in a way it didn't before. There are some other provisions in here too. I think it's fair to call this a victory for the WGA on balance and a good step forward, and hopefully we'll see the same with SAG-AFTRA before too long. It'll be really interesting to watch whether this has impacts in other industries as well. So if you enjoy what you're hearing today, if you enjoy this podcast, find and follow us wherever you get your podcasts. And if you're watching on our YouTube channel at TheScreenLawyer.com, hit that like and subscribe button so you'll be sure to get every episode of The Screen Lawyer Podcast. Take care.