The Oscars Draw a Human Authorship Line Around AI

May 3, 2026

The Academy has approved rules for the 99th Oscars that require human performance and human-authored writing — without banning AI tools outright.

The Academy of Motion Picture Arts and Sciences has approved its rules and campaign regulations for the 99th Oscars, and buried in the technical language is a line that creative industries have been waiting for someone to draw: acting nominations require roles demonstrably performed by humans with their consent, and eligible screenplays must be human-authored.

The rules stop well short of banning AI. They do not say a film cannot use AI-generated visual effects, AI-assisted editing, or AI tools in production. What they say, more precisely, is that the specific categories that recognize human craft — performance and original writing — now require evidence that a human actually did that work.

What the rules actually say

For acting categories, the Academy's new rules establish that a nominated performance must be demonstrably the work of a human performer who consented to that use of their likeness and labor. That matters because the practical threat was never robots winning awards — it was studios submitting performances reconstructed from archival footage, trained on a deceased actor's catalog, or synthesized wholesale from a digital model, without meaningful human creative input in the performance itself.

For screenplay categories, eligible scripts must be human-authored. The Academy has not specified a hard percentage or audit process in the published rules, but it has reserved the right to ask for information about AI use and the degree of human contribution. That soft power — the ability to ask — is itself a compliance mechanism, because the cost of a disputed nomination is reputational as well as procedural.

The Academy is also giving itself the authority to inquire about AI use across submitted work more broadly, not just in these categories. That positions the organization less as a technology regulator and more as a transparency enforcer: disclose what you used, demonstrate where the human contribution lies, or face scrutiny.

Why this matters beyond Hollywood

The film industry does not set software policy, but it does set cultural precedent. When the Oscars define what counts as a human performance, they are doing something that most industries have been unable to do cleanly: establishing that authorship and consent are separate from the question of whether AI tools were involved.

That distinction is important. The rules do not say AI assistance disqualifies a work. They say AI substitution in the credited creative act does. A screenwriter using AI to check structure, punch up dialogue, or explore alternate scenes is still the author of the screenplay. A studio submitting a fully generated script with a human name in the credits is not.

This is the line that creative professionals have been arguing for, and it maps cleanly onto questions that builders of AI tools are going to face in every professional domain: not whether AI was used, but who the work ultimately belongs to, whether the people whose creative material trained the system gave meaningful consent, and how you verify any of this after the fact.

The product signal: provenance, consent, and disclosure

For builders, the Academy's framework points toward a cluster of features that are going to matter more as AI tools become embedded in creative and professional work:

- Provenance: a record of which parts of a work were human-made and which were AI-assisted.
- Consent: documentation that the people whose likeness, voice, or creative material the work drew on agreed to that use.
- Disclosure: a straightforward way to report AI involvement when a credentialing body, guild, or employer asks.

None of this requires tools to become less capable. It requires them to be more legible. The Academy is drawing a line that other credentialing bodies, guilds, courts, and employers are watching. The first products that make human authorship easy to document alongside AI assistance will have a durable advantage as that line spreads.

For SunMarc App Labs, the practical takeaway is simple: as we build utilities, creative tools, and future AI-assisted products, the interface should not only help people produce work. It should help them understand what was human-made, what was AI-assisted, and what can be trusted later.
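As a concrete illustration of what "legible" could mean in practice, here is a minimal sketch of a per-work provenance record. All field names and the schema itself are hypothetical; neither the Academy's rules nor any standard prescribes this structure.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ProvenanceRecord:
    """Hypothetical provenance log for one piece of creative work:
    who did the credited creative act, which AI tools assisted, and
    whether consent for any likeness or material use is on file."""
    work_id: str
    human_contributors: list[str]  # people credited with the creative act
    ai_tools_used: list[str]       # disclosed assistive tools, if any
    consent_on_file: bool          # likeness/material use was consented to
    human_authored: bool           # the credited creative act was human work

# Example: a screenplay written by a human who used an AI structure checker.
record = ProvenanceRecord(
    work_id="screenplay-001",
    human_contributors=["Jane Writer"],
    ai_tools_used=["structure-checker"],
    consent_on_file=True,
    human_authored=True,
)
print(json.dumps(asdict(record), indent=2))
```

The point of a structure like this is not enforcement but documentation: it separates the question "was AI used?" from "who authored the work?", which is exactly the distinction the Academy's rules draw.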
