Are video games changing the way we produce films, and how is AI impacting this development?
- rebeccaf73
- Apr 14
- 6 min read
In discussion with Ashley Keeler, Head of StudioT3D and CTO, Target3D.

This is a big topic, but first let’s track back and look at some history!
We all recognise the iconic video game names, and it’s fascinating to compare that-was-then with this-is-now. If you examine the origins of pixel art games, for which we still have a nostalgic soft spot, and then trace the journey from the 2D sprites of Pong or Donkey Kong to The Last of Us or Red Dead Redemption, you realise the progress is mind-blowing.
The leaps the gaming industry has made over the last 40 to 50 years are incredible - a significant part of that realism and advancement has come from motion capture.
Motion capture has introduced a new level of realism to in-game animations and cinematics, allowing players to engage with characters on a deeper emotional level. Technology that was once exclusive to high-end film studios has now been integrated seamlessly into game production, helping developers create lifelike characters that respond in realistic ways.

I was a huge Red Dead Redemption 2 fan. I absolutely devoured that game in 2018 (or whenever it was). I think I completed it in six weeks! It was one of the first times I actually became totally obsessed with a game.
Why? Because I developed an emotional connection with the protagonist - that game really pulls at your heartstrings. The integration of sophisticated facial expression rigs and the realism of the animation allows for deep emotional engagement with in-game characters.
This ties into the idea that in-game assets are now essentially film-ready. The assets we develop for games today can be used on LED stages for live-action shooting, or they can be repurposed for film, TV, and even advertising.
For example, you could have a Red Dead Redemption 2 character appear in a cross-channel marketing campaign. The resolution, rigging, and overall quality of today’s game assets are as good on a PS5 console as they are in a film studio working on the next Marvel blockbuster.
I think we’ll start seeing a blend of reality between virtual characters, players in games, and real-world content. In some cases, it’s already interchangeable.
I was on set recently for an advertising shoot for an upcoming AAA game, which used Unreal Engine assets and environments on an LED wall with actors in the foreground - quite literally blending the virtual and physical worlds.
Remember when Epic Games released The Matrix Awakens, the public Unreal Engine demo featuring Keanu Reeves - they created a digital Keanu versus the real Keanu. Could you tell which was which? [Spoiler: you could]... but this ability to switch between virtual and physical, and then share assets across marketing campaigns, in-game content, and traditional content production - that is really game-changing.

When characters in a game build emotional connections—whether through trust, love, or affection—that engagement can extend beyond the game itself. It can translate to a brand identity and the subsequent marketing.
This idea that audiences will form deep emotional bonds with digital characters is likely to reshape the entertainment industry over the next few years, and as we close the final few per cent towards true photorealism, I can only anticipate those emotional connections growing stronger.
Tailored Gaming Experiences
That’s one direction: content creators allowing us, the audience, to build stronger relationships with characters that exist across games, films, platforms and formats.
But what about user feedback? How do we further evolve that engagement?
Now we’re starting to get to the point where we can gather engagement metrics. Those metrics can start to adapt games to make them more or less tailored to the user - sometimes in quite a scary way, luring people in the way casinos do: get rid of those pesky windows and clocks and keep you playing at all costs.
There are wearables that can provide biometric feedback from the user to tailor the game experience accordingly. For instance, the game might want to keep you in a particular state of hyper-excitement or fear, while haptic feedback suits (like the Teslasuit demoed with Half-Life: Alyx) let you feel the gunfire and explosions (from the safety of your lounge).
We’re beginning to receive that kind of nuanced data from the player, allowing the game to adjust dynamically - making it less scary, more scary, slowing the pace, or speeding it up. I can see a world where this user-specific tailoring, coupled with genAI for NPC interactions, gets us close to Charlie Brooker’s 2016 Black Mirror episode "Playtest".
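To make that feedback loop concrete, here’s a minimal sketch in Python. Everything in it is hypothetical - the `HeartRateMonitor` class stands in for a real wearable SDK, and the BPM thresholds are invented - but it shows the basic shape of a game nudging its intensity up or down from a biometric signal.

```python
import random
import time

# Hypothetical biometric feedback loop: keep the player inside a target
# arousal band by raising or lowering the game's intensity.
# HeartRateMonitor and the thresholds are illustrative assumptions,
# not a real wearable API.

TARGET_BPM = (95, 120)  # desired "hyper-excitement" band

class HeartRateMonitor:
    """Stand-in for a wearable SDK; returns a simulated heart rate."""
    def read_bpm(self) -> float:
        return random.gauss(105, 15)

def adjust_intensity(intensity: float, bpm: float) -> float:
    low, high = TARGET_BPM
    if bpm < low:        # player too calm: ramp up enemies, music, haptics
        intensity = min(1.0, intensity + 0.05)
    elif bpm > high:     # player overwhelmed: ease off and slow the pace
        intensity = max(0.0, intensity - 0.05)
    return intensity

def game_loop(ticks: int = 10) -> None:
    monitor = HeartRateMonitor()
    intensity = 0.5
    for _ in range(ticks):
        bpm = monitor.read_bpm()
        intensity = adjust_intensity(intensity, bpm)
        print(f"bpm={bpm:5.1f} -> intensity={intensity:.2f}")
        time.sleep(0.1)

if __name__ == "__main__":
    game_loop()
```

In a real pipeline the monitor would stream data from an actual device, and "intensity" would drive enemy spawns, music, lighting or haptics rather than a printout.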
GenAI and Game Iterations
Iterative dev cycles have caused rage among gamers in the past. I’m thinking of Cyberpunk 2077 or No Man’s Sky, where studios tried to get an MVP (minimum viable product) out the door and then develop it over the years. That approach has often been criticised because players expect something epic right from the start.
But if we consider long-tail games, Grand Theft Auto V is a great example - it feels like it’s been around forever, and its online world and mods are ever-expanding.
Now, we’re at a point where games are no longer just games: they are evolving worlds. Through the use of generative AI and feedback loops, they can literally adapt and develop over time.

No Man’s Sky was always intended to be a game that would evolve and become more intricate. There was a shift in perception, and now we’re in a time where people are starting to embrace how games can grow, evolve, and adapt. And with Grand Theft Auto VI on the horizon, we’re seeing these worlds becoming more photorealistic and dynamic than ever.
Right now, these technologies aren’t just for games - they’re used in virtual production filmmaking and traditional VFX pipelines. It’s becoming common to see films made inside game engines. Productions like Love, Death & Robots showcase how entire narratives can be created within a game engine.
At this point, the line between game engines, animation, and film production is almost nonexistent.
It reminds me of the emergence of machinima - the trend of using video games to create films. I’m thinking of Rooster Teeth’s Red vs. Blue, which was made with games from the Halo series.
Creative and ingenious individuals realised they could use screen captures from games to cut together their own stories. That approach is only growing, especially for indie filmmakers. You could definitely location-scout inside Red Dead Redemption. You can literally explore the game as a photographer or videographer, rendering viewpoints and building a whole library of plates to use on a virtual production wall. Suddenly, your film is set in the Midwest.
AI Going Forward

AI in games is advancing rapidly. We’re seeing procedural generation used to create new landscapes and personalise gameplay. Instead of selecting from pre-written responses, AI-driven interactions can allow you to have meaningful, dynamic conversations with in-game characters - much like talking to an AI assistant. That custom horse name you’ve chosen? It’s now being used by the NPCs rather than catch-all dialogue.
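As a toy illustration - not any particular studio’s pipeline - here’s how an LLM-backed NPC might weave the player’s own choices into its lines. `generate_reply` is a deliberate placeholder for whatever text-generation model a game might call; no specific vendor API is assumed.

```python
# Illustrative sketch of AI-driven NPC dialogue that conditions generated
# lines on player-specific details (like a custom horse name) instead of
# falling back on catch-all responses.

def generate_reply(prompt: str) -> str:
    """Placeholder for a call to a language model."""
    return f"[model reply conditioned on: {prompt[:60]}...]"

def npc_line(npc_name: str, player_state: dict, player_utterance: str) -> str:
    # Build a prompt that grounds the NPC in world lore *and* the
    # player's own choices, so dialogue references them naturally.
    prompt = (
        f"You are {npc_name}, a rancher in a western frontier town. "
        f"The player's horse is named {player_state['horse_name']} and "
        f"they just finished the quest '{player_state['last_quest']}'. "
        f"Player says: \"{player_utterance}\". Reply in character, "
        f"one or two sentences."
    )
    return generate_reply(prompt)

state = {"horse_name": "Buttercup", "last_quest": "The Cattle Drive"}
print(npc_line("Old Tom", state, "Morning! Seen anything strange out east?"))
```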
AI can also be used for in-game behaviours, making experiences feel more personal to each player. For example, the game might adapt to focus on a particular part of the gameplay that you’re especially engaged with. Procedural generation can create environments on the fly, ensuring no two players have the exact same experience.
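And as a minimal illustration of that "no two players the same" point, a seeded generator can derive a unique but reproducible world from each player’s identity. The smoothing scheme below is a deliberately crude stand-in for real techniques like Perlin or Simplex noise.

```python
import hashlib
import random

# Toy seeded procedural generation: each player ID deterministically
# yields its own terrain, so worlds differ between players but are
# reproducible on every visit.

def heightmap_for_player(player_id: str, size: int = 8) -> list[list[float]]:
    seed = int(hashlib.sha256(player_id.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    grid = [[rng.random() for _ in range(size)] for _ in range(size)]
    # One smoothing pass: average each cell with its neighbours so the
    # terrain has gentle slopes instead of pure static.
    smoothed = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            cells = [grid[j][i]
                     for j in range(max(0, y - 1), min(size, y + 2))
                     for i in range(max(0, x - 1), min(size, x + 2))]
            smoothed[y][x] = sum(cells) / len(cells)
    return smoothed

# The same ID always produces the same world; different IDs diverge.
print(heightmap_for_player("player_42")[0][:4])
```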
We’re also seeing AI used in animation and motion capture. The amazing performances of mocap and voice actors can now be turned into libraries of expressions and dialogue, allowing for more natural and responsive character interactions. Imagine meeting your favourite in-game character and having an effortless, free-flowing conversation with them.
This is incredibly exciting, though it also raises ethical concerns about the impact on actors and creatives. Protecting their contributions is crucial. If I could engage in a free-flowing conversation with a character in a game, that would be an incredible experience - but it’s the actors that we want to meet at Comic Con, not the LLM.
And that’s where generative AI and procedural generation in games are heading. It’s not just about creating bigger worlds—it’s about making those worlds feel alive and dynamic - and bringing us all along for the ride.
For me, that’s the most exciting part.
Stay tuned for further discussions with Ashley!
To see all the different technologies we house in our studio, visit www.studiot3d.com, or contact the studio team at info@studiot3d.com.