Peyote Jack: Ohio! - REVIEW & INTERVIEW
HFF: Why did you choose to make a film with the help of AI?
HD: Technology gave birth to Cinema. From the first moving image, to sound, to color, to CGI, and now AI - all of these technical leaps have created new possibilities for filmmakers to tell new types of stories.
With the release of Veo 3 earlier this year, I felt that generative moving imagery as a technical innovation had reached an inflection point where it could finally be used to create Hollywood-level imagery at a fraction of the budget.
But what excited me most was pondering the types of imagery AI could create that simply couldn’t have been created before - whether imagery that was impractical for budgetary reasons, or the “we don’t know what we don’t yet know” kind - things we can’t even dream of yet.
And so rather than lashing out at the idea of “AI” as many creatives have, I wanted to instead learn to work with this new medium in the form of a short film to fully understand the consequences of this new technology, as well as the possibilities.
HFF: Have you already completed similar projects before?
HD: I began my career in screenwriting, and have always loved the ability of CGI to help you tell a story that is “familiar, yet different.” I then began learning CGI myself to help bring my scripts to life, first with Cinema 4D and eventually Unreal Engine.
When generative imagery first became consumer-facing, I was both excited and slightly terrified at the implications, but I began tinkering with the possibilities in early 2022 on TikTok.
Instead of making what we now call “slop,” I’ve always been keen on using these tools to tell a story, and at the time, TikTok was a good outlet to do so in bite-sized sketches that the tech of the time could handle. It’s funny, though, because all of those filmic experiments look like ancient relics now, just three years later.
Anyway, having directed professional 2D cel animation before without any AI, I thought it would be a good litmus test to make a 100% AI film that nobody could tell was made with AI.
HFF: How do you work on a film made with AI? The script, the settings, the colors, the voice dubbing - tell us more.
HD: The most challenging part of working with AI is that it is essentially mediumless. But it’s also what makes it so exciting. To explain:
If traditional filmmaking is akin to Newtonian Physics, then generative AI filmmaking is akin to Quantum Physics. There are just so many similarities.
The first similarity is that with generative AI, there is no physical act of creation. There is no blocking, no gaffing; not even the physical act of moving a Wacom pen in Toon Boom or your mouse in Unreal.
Like Quantum Physics, everything is a thought experiment. You can’t physically will anything into existence. You just have to patiently iterate until you get to where you want to be.
Because of this unique aspect, this new process is very spontaneous. Much like the “many worlds” of quantum physics, the first frame can go in an infinite number of directions, even with the “quantum entanglement” of the last-frame approach.
So when making a film intended to look like 2D Cel art, a ton of thought was given to specific mediums. Ensuring that the backgrounds looked like they were made with watercolor and gouache was a massive challenge. Thousands of images were created, and multiple composites made to achieve the shadow directions, reflections, etc.
Then, the characters were “animated” with Google Veo on chroma key backgrounds, then composited over the backgrounds. For more of the specifics, I go into detail with each of the different photos below.
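To make the compositing step concrete, here is a minimal sketch of keying a green-screened character frame (such as one exported from Veo) and alpha-blending it over a painted background. This is not the film’s actual pipeline; the green-screen assumption, the OpenCV-based keying, the thresholds, and the file names are all illustrative.

```python
# Minimal chroma-key composite: key a green-screened character frame and
# blend it over a painted background. File names and thresholds are
# illustrative assumptions, not settings from the film.
import cv2
import numpy as np

def chroma_key_composite(char_path: str, bg_path: str, out_path: str) -> None:
    char = cv2.imread(char_path)                        # character frame on green screen
    bg = cv2.imread(bg_path)                            # watercolor/gouache-style background
    bg = cv2.resize(bg, (char.shape[1], char.shape[0]))

    # Key out the green screen in HSV space; the hue/saturation range is a rough guess to tune.
    hsv = cv2.cvtColor(char, cv2.COLOR_BGR2HSV)
    green = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))
    alpha = cv2.GaussianBlur(255 - green, (5, 5), 0) / 255.0    # soft matte for the character

    # Alpha-composite: character where the matte is opaque, background elsewhere.
    alpha3 = np.dstack([alpha] * 3)
    out = (alpha3 * char + (1.0 - alpha3) * bg).astype(np.uint8)
    cv2.imwrite(out_path, out)

chroma_key_composite("jack_frame_0001.png", "bg_desert.png", "composite_0001.png")
```

In practice a step like this would run per frame, followed by the shadow and reflection passes mentioned above.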
But I will say that, of all the parts of the process where AI is helpful, the script - or story - is still the most important aspect: it ensures that all of this spectacle works toward luring the viewer in, letting them suspend their disbelief, and ultimately delivering catharsis.
HFF: Tell us about this character of Jack - how was he born?
HD: While hiking in Joshua Tree years ago, I saw a jackrabbit and got this idea for a southwestern Alice in Wonderland, where the rabbit is a jackalope instead. And instead of Alice falling down the rabbit hole, maybe the jackalope eats peyote and falls into a wormhole of his own consciousness.
I played with this idea with a 3D short I made in Unreal in early 2024, and although I loved the cactus character, it felt kinda limited.
Then, when exploring the idea of making a film with AI, I was thinking about the tendency for AI to "hallucinate," and landed on the idea of Jack being public domain, since he’s a rubber hose character anyway.
All of these themes are explored in the subsequent writing I’ve done so far. Whether it’s released as a series and this short is the pilot, or it’ll all become a feature film, is still up in the air.
HFF: How long did it take you to make this film?
HD: Four months. I started on June 1st, 2025, and after working on it all day, every day, I finished at the end of September.
To be honest, so much of that time was spent learning this new process: ComfyUI, open-source models, etc.
I imagine the next episode would probably take me three weeks now, since I already have the character design, background styles, etc.
The worldbuilding is a huge part of the process.
HFF: There is a kind of mise en abyme, with Jack exploited by AI in a movie you made with AI. What message do you want to convey?
HD: Exactly! I’m so glad you picked up on that.
Like many filmmakers, my biggest fear with AI is that it will eliminate the need for a creator.
When consumption determines creation, there is a great danger of a recursive loop forming, where what is watched determines what is made.
The algorithm creating the algorithm, in a sense.
Expressing this fear from the standpoint of a “public domain” character with no legal rights on one side, and a technocrat on the other spouting an absurdist blend of “social media strategist buzzwords” and “Gen Z slang,” felt like a good way to let kinetic humor reveal the potential horror.
4 - "Background art was generated with midjourney as photo realistic images, then using them as control nets, they were style transfered into 2D cel style animation backgrounds (with many, many iterations to get to final pixel backgrounds.)"
HFF: How do you imagine the future of animated cinema?
HD: My greatest hope is that AI ushers in an “era of the Auteur” for all forms of Cinema.
HFF: Are you nostalgic for the traditional animation cinema of the 1930s?
HD: Of course! There was something kinda dark about early rubber hose animation. It was all for adults initially, and so it explored darker themes specific to its era.
Underneath the slapstick, those cartoons often reflected the spirit of the Great Depression, and expressed the fantasy of bending reality, when life itself was inflexible.
And so in this short, Peyote Jack finds himself staring into the face of a new iteration of the Industrial Age, where workers aren’t just exploited but completely discarded.
HFF: Do you think AI is simply a tool in the service of humans?
HD: “Everything has been figured out, except how to live.”
HFF: Have you participated in public screenings with your film? Do you plan to release it?
HD: Peyote Jack’s world premiere will be on Nov 2 in Tokyo, in competition for Best AI Animated Film at the AI Filmfest Japan. I’m beyond excited about it and can’t wait to attend!
After it premieres in Tokyo, I’ll decide when to release it more widely.
HFF: Do you have other projects in the works?
HD: I’m currently working on a short that blends Unreal Engine with AI, which I hope to announce soon.
HFF: Do you plan to work only with AI?
HD: No. I can’t wait to collaborate with other talented humans again.
Thank you, Hoyt!
Peyote Jack: Ohio! is in competition at the 20th edition of the Hallucinea Film Festival. Results will be announced on October 29, 2025.
By Hallucinea Film Festival