
Researching Mocap Solution for Virtual Real-Time Dance Duets

Updated: Feb 16

A forward-thinking team from Goldsmiths, University of London collaborates with Target3D to investigate how the computational arts could use motion capture to bring the world closer together for new dance performances.

Dan Strutt, Lecturer in the Department of Media, Communications and Cultural Studies at Goldsmiths, University of London, and his team of postgraduate students and industry professionals have been researching the potential application of new forms of accessible wireless and markerless motion capture in the creation, rehearsal, teaching and performance of choreographic dance work.


Having only used pre-captured, pre-rendered mocap on their previous dance projects, the team had always aimed for live, real-time and generative use of motion capture...


The aim

To connect two dancers in two different locations - one in London and one in Singapore - through the internet, to dance together within a virtual environment.


For Dan, this wasn’t simply about hyping the tech up as the next big thing. Instead, by experimenting and exploring, he wanted to probe beyond superficial interest to see how genuinely useful this would be for dancers as a meaningful tool for choreographic communication.


The solution

Supported by Target3D and working within our Hoxton Studio, the Goldsmiths team created a virtual environment to connect Mavin Khoo, Creative Associate of Akram Khan Dance Company, with Melissa Kwek from LASALLE College of the Arts. Both Khoo, in our London space, and Kwek, in a Singapore studio, danced in Noitom's Perception Neuron 32 V2 mocap suits, which captured their data and streamed it 6,750 miles across the network in real time.

Each suit's IMU sensors were strapped to different parts of the dancers' bodies, allowing the dancers to work together to devise an aesthetics of movement specific to this mode of remote working. Data was streamed from the sensors, across the world, and into graphics software to be instantly rendered, so that Goldsmiths and LASALLE could see and respond to each other's movement. A series of generative visualisations of the dance data were projected onto the walls of both locations simultaneously, the dancers responded in their own spaces, and the suits effectively communicated with one another.
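
How might the receiving end of that pipeline look in code? Purely as a hedged illustration - this is not the team's actual implementation - the C++ sketch below listens for skeleton frames arriving as UDP datagrams and hands each frame straight to the visualisation layer. The joint count, packet layout and port number are assumptions made for the example; Noitom's Axis Neuron software can broadcast BVH-style data over the network, but the exact wire format depends on how the stream is configured.

```cpp
// Minimal sketch of a real-time mocap stream receiver (illustrative only).
// Assumptions: one UDP datagram per skeleton frame, 21 joints of 3 floats
// each, arriving on port 7001. None of these values come from the project.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <array>
#include <cstdio>

constexpr int kNumJoints = 21;    // assumed joints per dancer
constexpr int kPort      = 7001;  // assumed streaming port

struct Joint { float x, y, z; };  // per-joint rotation data

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr{};
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port        = htons(kPort);
    bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    std::array<Joint, kNumJoints> pose{};
    while (true) {
        // Each datagram carries one full skeleton frame from the remote studio.
        ssize_t n = recv(sock, pose.data(), sizeof(pose), 0);
        if (n == static_cast<ssize_t>(sizeof(pose))) {
            // Hand the frame to the renderer immediately, so the local
            // dancer sees the remote movement with minimal delay.
            std::printf("frame received: root joint y = %.1f\n", pose[0].y);
        }
    }
}
```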

Experimenting with both openFrameworks and Unreal Engine, the team were able to deploy and develop the two tools' differing capabilities alongside their own skills. The 2D graphic effects of openFrameworks were simple and minimal, working well for technical, learning and teaching applications: when the dancers were synchronising movements, for example, it enabled them to tangibly visualise a direct connection (see the sketch below). The game engine Unreal offered a more complex 3D spatial and cinematic effect, feeling very natural in terms of each dancer being able to see the dimensions of the other's body and physically communicate with them. Although less technical, this felt very aesthetically pleasing and well adapted for performance.
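
To make the openFrameworks side concrete, here is a minimal sketch of that kind of 2D "direct connection" effect: corresponding joints of the two dancers are joined by lines that brighten as the joints move into sync. This is an assumption-laden illustration rather than the project's code - getLocalPose and getRemotePose are hypothetical placeholders for the streamed skeleton data (here they just return animated dummy points), and the joint count and distance scale are invented.

```cpp
// Illustrative openFrameworks sketch of a 2D "connection" visualisation.
#include "ofMain.h"
#include <algorithm>
#include <cmath>

// Hypothetical stand-ins for the streamed skeleton data; in a real system
// these would be fed by the local suit and the network receiver.
static std::vector<glm::vec2> getLocalPose() {
    float t = ofGetElapsedTimef();
    return { {300, 200}, {300 + 40 * std::sin(t), 300}, {300, 400} };
}
static std::vector<glm::vec2> getRemotePose() {
    float t = ofGetElapsedTimef();
    return { {600, 200}, {600 + 40 * std::cos(t), 300}, {600, 400} };
}

class ofApp : public ofBaseApp {
public:
    std::vector<glm::vec2> local, remote;  // 2D joint positions, screen space

    void update() override {
        local  = getLocalPose();
        remote = getRemotePose();
    }

    void draw() override {
        ofBackground(0);
        for (size_t i = 0; i < std::min(local.size(), remote.size()); ++i) {
            // The closer two corresponding joints are, the brighter the line
            // between them, making synchronised movement tangibly visible.
            float brightness = ofMap(glm::distance(local[i], remote[i]),
                                     0, 400, 255, 40, true);
            ofSetColor(255, 255, 255, brightness);
            ofDrawLine(local[i].x, local[i].y, remote[i].x, remote[i].y);
            ofDrawCircle(local[i].x, local[i].y, 4);
            ofDrawCircle(remote[i].x, remote[i].y, 4);
        }
    }
};

int main() {
    ofSetupOpenGL(800, 600, OF_WINDOW);
    ofRunApp(new ofApp());
}
```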

The result

Dan Strutt, who was pleased with the result overall, explains, “The application does seem quite useful. This is not a substitute for live performance, for the feeling of really being there, but it does do something different. It permits certain effects that you can't do with a live performance, so we’re experimenting with what those are… the different kinds of interaction, the different visual aesthetics, and how you can use the data to change internal effects… all of this opens up a whole new medium which can be very pleasing for the audience. It can also change the dancer's perspective of themselves, of their own body.”

“If you are a dance maker”, Mavin Khoo observed, “this allows you to have a physical representation that enables you to work on something then objectively watch it. There’s a tangible sense of space and body which can be useful from a creative perspective. There is potential, with a consistent long term investment into research, to explore the sense of ownership for the dancer, to enable an emotional, poetic relationship.”



The future

Thanks to funding from the Arts and Humanities Research Council (AHRC), the collaborative team can continue to overcome current limitations and take the system into development, creating a well-honed tool that’s simpler for the end user. The team hopes further experimentation will lead to a series of events, including one with dancers from three or four locations performing together in one virtual space, an event with digital programmers ‘hacking’ the dance movement data to define what the performance looks like, and a performance in an immersive environment with a live audience.


Target3D Founder, Allan Rankin, concludes, “Dr Strutt and his team have been developing this solution, which has tremendous potential to have an impact on the way in which people utilise motion capture technologies over distance. Whilst the focus for this project is in the creative arts and dance industry, the principles and wider potentials for such a system can apply to numerous applications and markets.”


Help your story unfold with mocap solutions and consultation from Target3D.



 

