AI concerns stall contract negotiations between game companies and actors
Jul 8, 2024

Sarah Parvini of AP explains how AI cloning and consent are at the heart of the talks between video game producers and SAG-AFTRA.

Big-budget video game producers and the Screen Actors Guild-American Federation of Television and Radio Artists, the union representing voice-over actors and motion-capture stunt workers, have been negotiating a new labor contract since last September. And union leaders say those talks have stalled due to concerns over generative artificial intelligence.

(Note: Several Marketplace employees are also represented by SAG-AFTRA under a different contract.)

Marketplace’s Lily Jamali spoke with Associated Press reporter Sarah Parvini, who recently wrote about the negotiations. She explained how consent is a key concern.

The following is an edited transcript of their conversation.

Sarah Parvini: What voice actors as well as motion-capture artists and stunt performers who fall under this interactive contract are worried about is this idea that without consent, a previous performance can be used to feed generative AI, for example, and your voice or your likeness can then be replicated without you knowing about it. And chief among the concerns, if that happens, is your voice can be used not only to say something that you might not morally agree with, but also that you are competing against yourself. You know, if I go and do a recording of, let’s say, a side character in a game, and then later on they want to add a few more lines to that character, if I am not performing with informed consent and that studio, let’s say, is using generative AI, they could use my first recording that I did to then create subsequent recordings without having to pay me.

Lily Jamali: That’s really interesting. So the other thing I noted in your article was you write that union leaders don’t consider themselves “anti-AI.” There’s more nuance there, it sounds like?

Parvini: Absolutely, it’s not the idea of [closing] Pandora’s box, if you will. Everyone knows that AI is going to be used, and there are positive ways to use it, according to a lot of the folks who are working in the industry. If there’s some background noise that you want to get rid of that you didn’t want there in the first place, you can use AI to clean that up. If you’re OK with your voice being used in an AI type of situation, and you want to enter a contract like that where you get paid for the reuse of that likeness, for example, it sounds like they’re fine with that. But it really just comes down to: Are you being asked for consent before that’s done?

Jamali: So what do the video game developers have to say about all of this? What’s their position?

Parvini: Well, the spokesperson for some of these very, very big companies has said, really continuously each time I’ve talked to them, that they are negotiating in good faith. They say that they have made good progress in other parts of the negotiations, and that they are essentially confident that they will be able to come to a deal.

Jamali: Do they seem open to this argument about informed consent, which you just laid out for us?

Parvini: According to the folks who are in the negotiating room on the SAG-AFTRA side, no. It sounds like they are willing to perhaps provide some of these protections for voice actors, but not the other people who are protected under this contract — so the stunt workers, the performance capture artists. And SAG-AFTRA’s position on that is, “Well, we can’t just protect some people and not others.”

Jamali: So AI, of course — and a lot of us remember this — was a very contentious issue when actors and writers were negotiating with studios last year. How much of an issue has AI been for the voice-acting industry so far?

Parvini: So far, I would say it’s not necessarily at a place where it’s, you know, every voice actor is being replaced by generative AI, right? It really is a type of situation where you are seeing it pop up, and these conversations are happening in the industry itself. For example, there are voice actors who have seen essentially a generated version of their voice used in fan mods — so these types of fan creations of a game that alter the game a little bit, but they can be used in an explicit way, for lack of a better way of putting it. You know, people’s voices are being used perhaps in sexual situations that they would never want to voice themselves, and so it feels to them like a violation. And I think another concern is that for folks who are just starting out as voice actors, this could be particularly damaging if it becomes the norm. So the idea that your first job could be your last job, because they can just take that recording and feed it into a system that reproduces your voice, is a very real one among folks who not only are just starting out, but who have been around for a while, and therefore worry for those who are just starting out.

Jamali: You also spoke to at least one actor for this article, and I was just curious if you could talk a little bit about what they told you when it comes to just the quality of the craft as an art. What do they have to say about how AI might alter that?

Parvini: Yeah, absolutely. At the heart of it, I would say for many performers, is that this is an art, just as acting for film and TV is an art. And so when it comes to bringing a character to life, whether that is a character who only has two lines, or a character who is the protagonist of a story, it comes with coloring it with your lived experience, coloring it with choices that you make on where is my inflection point in this sentence, and what is my goal or my motivation as this character? And so in speaking with some of these voice actors, I have heard sort of a common refrain of, well, this is an art, and a computer cannot replicate art in this type of way. It will be emotionally shallow because AI doesn’t feel, and so when you are mining your own experiences and mining your own feelings to bring a performance to life, to bring a character to life, it will inherently have more depth. At least, that’s the argument that they are making.

More on this

Sarah mentioned the recent use of generative AI in “fan mods” — software created by fans to alter or enhance how a video game functions.

Last year, fans of the popular game Skyrim released mods that used AI-generated voice clones of the game’s actors; some of those mods were explicitly pornographic.

Many of the actors involved demanded the mods be removed after they learned their voices had been used that way without their consent.

As we mentioned in our interview, union leaders say they’re not “anti-AI,” according to Sarah’s reporting. SAG-AFTRA reached a deal earlier this year with one AI voice technology company, Replica Studios, which allows studios and union actors to create or work on digital replicas through Replica.


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer