How AI Will Ruin Your Favorite Show
Hollywood studios want to use AI to steal actors’ faces and writers’ scripts. If they win this strike, it’ll ruin what you watch — and destroy tens of thousands of jobs.
By Jordan Zakarin, More Perfect Union
If the WGA and SAG-AFTRA strikes were simply about how to more fairly split up the billions of dollars generated by the entertainment industry each year, there’s a chance that the unions and studios would have reached an agreement weeks ago. Profits, even under Hollywood accounting, are solid and tangible; negotiations over money may turn hostile, but they always end with some certainty for each side.
Instead, it is an uncertain future that has kept production in stasis. The work stoppages rumble on (135 days for the writers, 62 for the actors) because what's at stake is the future of the industry itself, and the danger that the next strike will be broken by artificial intelligence capable of producing cheap simulacra stolen from the very artists on the picket line.
We looked at how AI could soon be used in the creative process and the consequences of its adoption through the lens of Law & Order, the long-running legal and police procedural franchise.
Produced by Dick Wolf, the shows in the franchise are each miniature versions of an entire production ecosystem, from casting and location scouting to writing and acting. The franchise has also produced well over 1,000 hours of episodic, tightly formatted television since it premiered 34 years ago, and as former Law & Order: SVU and Criminal Intent showrunner Warren Leight told us, that makes it a prime target for an AI facsimile.
“I have to imagine somebody is thinking, ‘There's 500 episodes of SVU, how hard would it be these days with computing power where it is to feed those scripts in and understand, oh, at the end of the first act, we need a turnaround at the end?’” Leight said. “How hard would it be to get what would be a mediocre script and then to give it to the showrunner?”
Programs like ChatGPT can already generate short scripts, and more advanced models can produce longer pieces with more surface-level polish. What they cannot do is supply the human element at the center of the art and productions people have watched for more than a century: the life experiences, the changing cultural mores, the unexpected decisions made by creative people.
“Sometimes when you've had a trauma, the circumstances around it change your behavior even in a seemingly arbitrary way,” Leight said. “But a computer's not gonna dig that out of their lived experience. They're not gonna dig something outta their traumatic childhood. They're not gonna dig something out of the breakup they went through. They're not going to dig a betrayal that occurred to them, that this character needs to go through in order to understand.”
Producers are already scanning actors’ faces and bodies for digital enhancement and reproduction, putting their likeness rights at risk. Without regulation, a producer could do whatever they want with somebody’s body or face and pay them absolutely nothing for the privilege. It’s terrifying for not only lead actors with speaking parts, but also background actors who have already begun to be replaced in crowds by digital figures.
The ripple effect of all this AI work would be enormous, impacting almost everyone working on a production.
“When you think of a day's work, it could take away your pay, but then it also takes away the pay of the costumer, of the makeup artist, of the lighting technician,” says Allison Siko, an actress who has played the recurring role of Kathleen Stabler on SVU and Organized Crime. “Because if everything can be done in the computer with say, one animator, or maybe you have another editor and a few other people, you're taking the jobs of at least 20 to 50 people and completely cutting them out of the equation.”
It took Hollywood studios by surprise when the WGA included an expansive regulation of generative artificial intelligence in its contract proposal earlier this year. The executives shouldn't have been so shocked: the technology is in its nascent stages but evolving quickly, and both writers and actors are reminded daily of the cost of failing to anticipate a new technology that studios could use to increase their share of the profits.
During the 2007 writers’ strike, streaming was in its infancy, with little impact on the distribution of movies and TV shows. The production contracts and residuals for streaming are now so minuscule that even returning to cable rates has been a challenge. Both unions are determined to avoid that fate with AI, and are thus seeking firm rules guiding how the technology will be used in the creative process as it comes of age.
Thus far, the studios, through the AMPTP, have offered to always credit writers and not to use AI to reduce their pay for work they've done. But crediting a human writer could give AI-generated material a backdoor to copyright protection, which courts have ruled purely machine-generated work cannot receive. There are also disputes over what exactly the AMPTP is willing to concede to actors, who are arguing for the right to give informed consent (and receive royalties) when studios want to use their images.