The Viral Dance
“It went viral.” The words every artist, advertiser or influencer wants to hear. But what does it take to ‘go viral’? It’s generally considered a mystery.
Through a curated exhibition, a viral experiment and a dance show, the dancing and filmmaking duo DAAE/NORDAHL explored the concepts of connectivity and sharing, and what it means to be human today.
Filmmaker Joanna Nordahl’s aim was to explore 'the digital self' and how we portray ourselves online, blurring the line between the real and the digital: the dancers performed alongside their digital selves.
The human brain is wired to detect very subtle cues in facial patterns. The problem filmmakers can run into when trying to make realistic human animations is that even minor errors can lead the viewer to feel uncomfortable with the animation, associating the problem with a physical defect, illness or death. That phenomenon is usually called the uncanny valley.
Nordahl and dancer Ludvig Daae fed 250 viral dance videos from the past twenty years into a computer program, a neural network, that studies movement patterns. The choreographies were separated into individual dance moves before the program analysed the time variations, spatial directions, types of movements and their relative frequency. By doing this the system was trained to simulate a brand new choreography: the world’s first AI-created viral dance, the ultimate viral dance.
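The pipeline described above, splitting choreographies into individual moves, analysing how often one move follows another, then generating a brand-new sequence from those learned frequencies, can be sketched as a toy Markov chain. This is only an illustration of the idea: the duo's actual system was a neural network, and the move names and training corpus below are invented.

```python
import random
from collections import defaultdict

def train_transitions(choreographies):
    """Count how often each dance move follows another across all inputs."""
    transitions = defaultdict(lambda: defaultdict(int))
    for moves in choreographies:
        for current, nxt in zip(moves, moves[1:]):
            transitions[current][nxt] += 1
    return transitions

def generate(transitions, start, length, seed=0):
    """Sample a new choreography, weighting each step by learned frequency."""
    rng = random.Random(seed)
    sequence = [start]
    for _ in range(length - 1):
        options = transitions.get(sequence[-1])
        if not options:  # no observed follow-up move: stop early
            break
        moves, counts = zip(*options.items())
        sequence.append(rng.choices(moves, weights=counts)[0])
    return sequence

# Hypothetical stand-in for the 250 analysed dance videos.
corpus = [
    ["step", "spin", "drop", "spin"],
    ["step", "drop", "spin", "step"],
    ["spin", "step", "spin", "drop"],
]
model = train_transitions(corpus)
new_dance = generate(model, start="step", length=6)
```

The generated sequence contains only moves seen in training, but in an order that never appeared in any single source choreography, which is the same generative principle, scaled down drastically.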
It seems clear that electronic technology has given us a new way to look … One can make things with it, one doesn’t have to put things in one already knows…. one can make discoveries…
Choreographer Merce Cunningham, 1968
To keep your mocap, AI or animation project on target, get in touch with the Target3D team today.
Target3D’s Ashley Keeler, the project’s Mocap Tech Director, and Petros De Doncker, Mocap Technician/Developer, collaborated with a team of experts, producers, designers and artists.
With one week to capture and deliver the data, the Target3D team used a combination of OptiTrack Prime 17Ws for body tracking and a 'quick and dirty' iClone facial tracking solution. Ashley and Petros worked with 3D artist Nicole Ruggiero to apply the animations to her models, collaborating between London and NYC to deliver the final project.
It's not the same to watch it on video... the whole point of the show is that it's a live experience - but you can take a sneak peek here