A small woodland-creature movie called Critterz is aiming to premiere at Cannes later this month. It was made in about nine months on a budget under $30 million, with OpenAI's tools woven into the production pipeline. If it lands the festival slot, it will be the first AI-assisted animated feature to play Cannes.
It's worth taking seriously for one reason. The production isn't a tech demo. It's a Vertigo Films and Native Foreign project, written by James Lamont and Jon Foster (Paddington in Peru), with human voice actors and human artists driving the shot list. AI is used as production infrastructure, not as the show.
What's actually new here
Animated features have a stable economic shape: roughly $100–200 million budgets and three-year schedules. Critterz, at the numbers its producers have stated, sits at about a third of the time and a fifth of the budget of a comparable studio film. That gap is the whole story.
The savings don't come from replacing animators with prompts. They come from compressing the parts of a pipeline where the same shot gets touched dozens of times: paint-overs, continuity passes, look development, variant exploration. Native Foreign's claim is that AI shortens those passes from weeks to hours, while a human team owns the storyboard, the performance direction, and the final cut.
That's a different claim than "AI made the movie." It's closer to "AI ate the busywork inside the movie." The first is a press release. The second is what the rest of the industry will copy if it works.
The workflow, as far as we can tell
The public description looks like a human-first pipeline with three AI insertion points. First, artists draw character and environment sketches by hand, which are fed into OpenAI's image and video models for look exploration and shot synthesis. Second, once a look is locked, AI generates motion studies and shot variants from those references, with human supervisors picking and revising. Third, continuity: the slowest part of any animation pipeline is keeping a character on-model across thousands of shots, and AI is used to push continuity fixes through a sequence in batches.
There's a useful detail buried in the project history. Native Foreign released "Critterz: Remastered" earlier in 2026 using OpenAI's Sora, before Sora was discontinued in March. That short was effectively a workflow proof. The studio was building the pipeline before the feature went into production. The lesson isn't about Sora specifically. It's that anyone trying this needs a model-agnostic pipeline, because the model under the hood will change two or three times before any feature ships.
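The model-agnostic point can be made concrete. A minimal sketch, with every name hypothetical and no connection to Native Foreign's actual tooling: the pipeline codes against a thin interface, so swapping the underlying model means writing one new adapter rather than rewriting every shot script.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class ShotRequest:
    """One unit of work: a reference image plus an instruction."""
    reference_path: str
    instruction: str


class VideoModel(ABC):
    """The only surface the pipeline touches.
    A vendor change is one new subclass, nothing else."""
    @abstractmethod
    def render(self, request: ShotRequest) -> bytes: ...


class StubModel(VideoModel):
    """Placeholder backend for testing the pipeline itself."""
    def render(self, request: ShotRequest) -> bytes:
        return f"rendered:{request.instruction}".encode()


def run_batch(model: VideoModel, requests: list[ShotRequest]) -> list[bytes]:
    # The batch runner never names a vendor; continuity passes
    # survive a model swap untouched.
    return [model.render(r) for r in requests]


shots = [ShotRequest("sketch_01.png", "keep fur pattern on-model")]
outputs = run_batch(StubModel(), shots)
```

The design choice is the ordinary adapter pattern; the production lesson from the Sora episode is simply that the adapter boundary has to exist before the model underneath it disappears.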
Why it matters even if Critterz is mediocre
There's a real chance the film is fine and not great. First-of-kind productions usually are. That doesn't change what the project demonstrates.
A nine-month, sub-$30 million animated feature with name screenwriters and a real festival path is the case study every smaller studio has been waiting for. If it screens at Cannes, two things follow within twelve months. Every regional animation house with a script but no $150 million budget will start pitching its own version of the pipeline. And every major studio will quietly run the same math on a back-catalog project and decide whether to greenlight an AI-assisted feature internally.
Critterz moves the conversation from "can a short look good" to "can a feature ship on time, on budget, and not fall apart in the third act." Those are the only questions distributors care about.
What to watch when it screens
Three things will tell you whether the workflow is real or marketing. First, character consistency across the runtime. A two-minute short can fudge it. Ninety minutes cannot. Second, action and physics in the back half. The hardest shots in any animated film come in the climax, and that's where AI pipelines tend to break. Third, per-shot variance in style. AI pipelines leak: a different texture here, a different lighting model there. Watch for shots that feel like they belong to a different movie.
If the film clears all three, the production model is real and copyable. If it doesn't, the pipeline is still useful, just not yet for features.
What to do about it
If you're working on anything longer than a short, watch the Cannes coverage carefully and ignore the trailer. Trailers are cherry-picked. Screening reactions, especially from animators, will tell you what the workflow can actually deliver. The right takeaway for a small studio isn't "use these specific tools." It's that a human-led pipeline with AI as production infrastructure, the same shape behind Promvie's orchestration, is now a defensible way to make a feature; you don't need a Disney-sized team to attempt one.
Critterz is a test. The interesting part isn't whether OpenAI ships a movie. It's whether the rest of the industry decides to copy the pipeline by autumn.