Studios Want to Own Actor Likenesses Forever. SAG-AFTRA Is Fighting to Stop It.

Abhishek Gautam · 7 min read

Quick summary

AI can now clone any actor's face, voice, and performance. Studios are negotiating to own those clones as permanent intellectual property. SAG-AFTRA is pushing back with AB 1836, the Tilly Tax, and CAAVault. Here is what the 2026 negotiations actually mean.

In 2026, a studio can take a single day of filmed footage of an actor and use AI to generate that actor in any scene, any age, any language, indefinitely. The voice can be cloned from an audiobook recording. The face can be transposed onto a body double. The performance can be adjusted in post-production without the actor being present or informed.

The technology exists and is being used. The legal fight over who owns it — and who profits from it — is happening right now.

What SAG-AFTRA Agreed to in 2023 — and What Studios Want Now

When SAG-AFTRA ended its 2023 strike with a new contract, AI protections were central to the agreement. The baseline rules:

- Studios cannot create or reuse a digital replica of a performer without explicit, informed consent.
- Consent must specify the project and use case: a digital clone created for one film cannot be repurposed for a future, unrelated project without renegotiating with the performer.
- Name, image, and likeness (NIL) rights remain with the performer, not the studio.

This framework has held for two years. The 2026 negotiations are where studios are pushing to expand beyond it.

What the studios want in 2026: the ability to create a digital likeness once — with consent and compensation — and then own that likeness as intellectual property that can be licensed, sublicensed, and used indefinitely across future productions without recurring fees.

In plain terms: pay an actor once, own their face forever.

What California Law Already Prohibits

California AB 1836, signed into law in September 2024, provides baseline protections for deceased performers. It bans the commercial use of digital replicas of dead actors, musicians, or athletes in film, television, or video games without consent from the estate.

Estates retain control for 70 years after the performer's death. Violations carry penalties of $10,000 or actual damages, whichever is greater.
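The damages rule is simple enough to state as a one-line function. A minimal sketch (the function name is mine; the figures are from the statute as described above):

```python
def ab1836_penalty(actual_damages: float) -> float:
    """AB 1836 damages: the statutory minimum of $10,000
    or actual damages, whichever is greater."""
    STATUTORY_MINIMUM = 10_000
    return max(STATUTORY_MINIMUM, actual_damages)
```

So an unauthorized use causing $3,000 in provable harm still triggers the $10,000 floor, while a use causing $250,000 in harm is liable for the full amount.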

This law was passed specifically because studios had already begun using AI to resurrect deceased performers without consent. The most prominent examples included voice clones derived from archival recordings and face swaps used in film trailers.

Living performers are not covered by AB 1836. Their protections come from the SAG-AFTRA contract, their individual agreements, and the general right of publicity that exists in California and most US states.

The Tilly Tax: SAG-AFTRA's Counter-Proposal

SAG-AFTRA has proposed what negotiators are calling the "Tilly Tax" — a royalty that studios must pay the guild any time they use an AI-generated actor that is not a real human performer.

The proposal has two goals. First, it creates a financial disincentive for studios to replace human actors with synthetic performers wholesale. Second, it creates a revenue stream that flows back to the guild and is distributed to working performers, including those whose livelihoods are most threatened by AI replacement.
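Mechanically, the proposal amounts to a per-use levy that accumulates in a guild-controlled pool. The actual rate and distribution rules have not been made public, so everything in this sketch — the 5% rate, the class names, the fee base — is illustrative, not the guild's proposal:

```python
from dataclasses import dataclass

@dataclass
class GuildRoyaltyPool:
    """Hypothetical model of a per-use royalty on synthetic performers.

    The real proposal's rate and distribution mechanics are not public;
    the 5% rate here is an assumption for illustration only.
    """
    RATE_PCT: float = 5.0   # assumed: 5% of the production fee per use
    balance: float = 0.0    # accumulates for distribution to performers

    def record_use(self, production_fee: float) -> float:
        """Levy the royalty for one use of a synthetic performer."""
        royalty = production_fee * self.RATE_PCT / 100
        self.balance += royalty
        return royalty
```

The design point is the second goal above: the levy does not go to any individual performer (there is no real performer behind a synthetic actor), so it pools at the guild level for redistribution.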

The studios are resisting. Their argument is that AI-generated synthetic performers — entirely fabricated characters with no real-world counterpart — are not analogous to digital replicas of real actors and should not carry union fees.

What CAA Is Doing: CAAVault

The Creative Artists Agency, one of Hollywood's most powerful talent agencies, has launched a service called CAAVault. It inverts the studio-ownership model entirely.

CAAVault lets living actors — and sports figures represented by CAA — create their own official digital clone. The clone is created, owned, and controlled by the talent, not the studio. CAA manages licensing on the talent's behalf. Studios that want to use the clone pay a licensing fee that goes to the talent, who retains full veto rights over use cases they find objectionable.

This is the actor-as-IP model rather than the studio-owns-IP model. Its viability depends on whether talent has enough leverage to insist on these terms. At the top tier, that leverage almost certainly exists. For the middle tier of working actors (television character actors, voice actors, stunt performers), the leverage is far weaker.

The Voice Industry: Already in Motion

While the film industry negotiates, the voice cloning market has moved faster. ElevenLabs has licensed the voices of Burt Reynolds, Judy Garland, James Dean, and Laurence Olivier from their estates. Matthew McConaughey partnered with ElevenLabs to clone his voice and invested in the company.

These are consensual deals with clear ownership structures. The danger is not the top-tier deals but what happens at the bottom of the market — voice actors doing video game work, audiobooks, and commercial narration, who have far less negotiating power and whose voices are easier to clone without detection.

What This Means for the Industry

The 2026 SAG-AFTRA negotiations will likely produce a framework that protects A-list talent effectively while leaving middle-tier performers exposed. This is the historical pattern in entertainment labour disputes: the most prominent performers get contractual protections, and the rest get whatever trickles down.

The deeper issue is that AI likeness cloning fundamentally changes the economics of casting. Once a studio can clone a well-known face and voice, the marginal cost of using that talent in additional productions approaches zero. The initial payment becomes a one-time licensing fee for unlimited future use.

For developers building tools in this space — AI-generated video, voice synthesis, digital human platforms — the regulatory environment is hardening. California AB 1836 will likely be expanded to cover living performers. SAG-AFTRA protections will become more explicit in contracts. Any product that enables studio-scale likeness cloning without consent verification will face legal exposure.
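For a concrete sense of what "consent verification" means in a pipeline, here is a minimal sketch modeling the 2023 contract rule described earlier: consent is scoped to a specific project and use case, so a replica cleared for one film is not cleared for another. All names here (ConsentRecord, ConsentRegistry) are hypothetical, not an existing API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConsentRecord:
    performer_id: str
    project: str   # consent is project-specific under the 2023 contract
    use_case: str  # e.g. "digital replica in principal photography"

class ConsentRegistry:
    """Hypothetical consent gate for a likeness-cloning pipeline.

    A generation request only proceeds if an exact (performer,
    project, use case) grant exists; reuse on a new project fails
    and forces renegotiation with the performer.
    """
    def __init__(self) -> None:
        self._grants: set[tuple[str, str, str]] = set()

    def grant(self, record: ConsentRecord) -> None:
        self._grants.add((record.performer_id, record.project, record.use_case))

    def is_permitted(self, performer_id: str, project: str, use_case: str) -> bool:
        return (performer_id, project, use_case) in self._grants
```

The deliberate choice is the exact-match check: broad "any future use" grants are precisely what the union framework rejects, so the gate has no wildcard path.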

The technology is not going to slow down. The legal framework is going to have to catch up to it.

