The perils of artificial intelligence to the entertainment industry came to San Diego Comic-Con on Saturday, with SAG-AFTRA national executive director Duncan Crabtree-Ireland joining a panel of voice actors organized by NAVA, the National Association of Voice Actors, to discuss the specific hazards AI is already posing to the profession.
“We’ve got to reject the idea that this is just something that’s going to happen to us and we can’t say anything about it,” Crabtree-Ireland said at the outset of the panel, addressing whether AI could devastate the entertainment industry. “I think it definitely could; the question is whether we’re going to let that happen.”
Along with Crabtree-Ireland and moderator and NAVA board member Linsay Rousseau (“God of War: Ragnarok”), the panel, which played to a standing-room-only audience, consisted of Ashly Burch (“Mythic Quest”), Cissy Jones (“The Owl House”), SAG-AFTRA negotiating committee member Zeke Alton (“Call of Duty”) and NAVA president and founder Tim Friedlander (“Record of Ragnarok”).
At issue for the panel was the growing certainty that without explicit contractual and statutory protections in place, AI could not only effectively replace the vast majority of work for voice actors but also manipulate their voices to create content without their express consent.
“As a human voice actor, I can walk into a room and get a script that says something that I didn’t agree to say or something that I would never say. I personally have that ability to walk out of that room,” Friedlander said. With AI cloning the voices of actors, however, “We’ve lost control over what our voice could possibly say,” he said.
Without referring to the movie by name, Crabtree-Ireland likened the issue to “a story of a small mermaid and sea witch that literally steals that mermaid’s voice.” As the audience laughed in recognition of Disney’s “The Little Mermaid,” Crabtree-Ireland continued, “I remember seeing that for the first time and thinking how horrifying is it that this sea witch steals the voice of this person and then uses it for whatever. That is exactly what we’re talking about.”
“As an actor, you need to know what they’re going to do with some digital version of you that they’re creating using AI, not in general,” Crabtree-Ireland continued.
Jones said that, because “there’s no stuffing this genie back in the bottle,” for the past 18 months, NAVA has been creating “a framework to work within AI in an ethical manner.” The main points are that voice actors need to give active, informed consent for their voices to be used by AI (in contrast to what she called far more expansive “passive consent” that is buried in contractual language); that voice actors need to have control over how their voices are used; and that they need to be fairly compensated for that use.
Jones added that she’s in the early stages of building a company that would provide “the first actor-first, ethical use of AI with voice over.”
The sense of urgency to find a resolution was brought up several times during the panel.
“With the pace of this technology, by the time we figure out the abuses that we know are going to occur, it’s too late,” Alton said. One of the greatest concerns voiced was how AI would specifically foreclose working opportunities for up-and-coming voice actors trying to get their start in the field with smaller roles in animation and video games — roles that AI could easily replace. The panelists also said it’s common for contracts to include broad language allowing studios to own the actor’s work “in perpetuity” and for use “in any technology currently existing or to be developed.”
For actors just starting out, that kind of contract language could mean “your first job could potentially be your last job if you don’t have protection,” said Friedlander.
Added Burch, “I don’t want the next generation of voice actors to not have the potential to build a life that we’ve been able to build in this business.”
To that end, Friedlander said that NAVA is working with the European Union to get voice protection written into the AI Act, which is currently working its way through the European Parliament. He also said that NAVA has been working with sites that host video game modders — who use coding to modify the appearance of game avatars, sometimes to resemble real people and real voices — to take down voice mods that have been obtained without the actor’s consent.
“In the last three months, we’ve gotten possibly 6,000 to 7,000 audio files removed from certain websites,” he said.
Several panelists also emphasized the potentially existential threat AI poses to all kinds of workers well beyond the entertainment industry.
“Artists, coders, journalists, lawyers, this kind of technology is going to touch every single form of labor in the world,” Burch said.
Added Crabtree-Ireland: “All of us could stand up to the abusive use of technology, to really say, ‘We’re going to say what can be done with our bodies, our voices, our faces, our likenesses’ — and we must do that.”