Creating a fictional future: Keiken use mocap & game engines
Keiken, meaning experience in Japanese, is a hybrid practice. Through an intersection of moving image, new-media installation, virtual and augmented reality and gamified performance, Keiken test-drive impending futures in the realm of the “phygital”: physical and digital. Most recently, the collective collaborated with George Jasper Stone, a London-based CGI artist, on Feel My Metaverse.
Feel My Metaverse uses game engines to build a fictional future, aiming to create stories that viewers can collectively believe in. Keiken’s first venture into cinematic film is set in a future where the climate crisis has rendered Earth uninhabitable, and follows the daily lives of three characters across multiple realities – Pome Sector (a corporate wellness world), 068 (a roleplaying VR world), and Base Reality, or what we currently know as Earth.
Capturing mocap data, and applying it to 3D characters, was new to Hana Omori and the Keiken team. The project also required an element of previz, enabling Keiken to visualise and review their passes before re-creating them in Unreal Engine. The choreography added further difficulty, as it involved multiple interactions between performers and the occlusions they cause.
The choreography was captured at Target3D’s mocap studio, where the team also set up the environments and characters for previz and assisted with the post-production workflows. A 30-camera OptiTrack system was reconfigured, maximising the use of overhead and low cameras to overcome the natural occlusion difficulties. Two dancers, Sakeema Crook and Linda Rocco, performed in full-body suits, while a separate virtual camera was driven by Keiken’s Creative Director, George Jasper Stone.
From only a handful of captured passes and angles, the Target3D team helped the clients manipulate the sequences and re-create hundreds of further variations.
Data was captured for both duo and solo performances within a five-hour window, allowing it to be processed the same day. Target3D provided FBX versions of the skeletal data, which was imported into Cinema4D for retargeting before being sent to Unreal Engine. The virtual camera information was also sent to Unreal Engine and synced with the motion capture data for the final render.
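At its simplest, the retargeting step above means mapping joint names from the mocap skeleton onto the character rig's bones and carrying the motion across. The sketch below illustrates that idea only; the joint names and rotation values are hypothetical examples, not Keiken's actual rig, and a real retarget (e.g. inside Cinema4D) also compensates for differing bone lengths and orientations, which this omits.

```python
# Minimal sketch of name-based skeletal retargeting.
# Hypothetical name map from a mocap skeleton to a character rig's bones.
JOINT_MAP = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "RightUpLeg": "thigh_r",
}

def retarget_frame(mocap_frame):
    """Copy per-joint rotations onto the target rig's bone names.

    mocap_frame: dict of mocap joint name -> (x, y, z) Euler rotation.
    Joints with no entry in JOINT_MAP (props, extra markers) are skipped.
    """
    return {
        JOINT_MAP[joint]: rotation
        for joint, rotation in mocap_frame.items()
        if joint in JOINT_MAP
    }

# One frame of example data; "Prop" has no mapping and is dropped.
frame = {"Hips": (0.0, 45.0, 0.0), "LeftUpLeg": (10.0, 0.0, 5.0), "Prop": (0.0, 0.0, 0.0)}
print(retarget_frame(frame))
# {'pelvis': (0.0, 45.0, 0.0), 'thigh_l': (10.0, 0.0, 5.0)}
```

In practice this per-frame mapping would run over every frame of the FBX take before the result is handed to Unreal Engine.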
The result was a captivating 35-minute cinematic film, exhibited at Jerwood Arts in central London, with viewers interacting through their phones. Take a look at the preview here:
"Target3D is an incredible studio, with the latest motion capture technologies. They always go out of their way to make the experience beyond your expectation and they are super supportive technically. What is especially cool is that the team are always experimenting with these new technologies so when you go there you always learn something new!” Team Keiken
For motion capture shoots, and post-production support, speak to Target3D today.