Dance Video with Special Effects

In this dance video my goal was to explore a realistic way to superimpose interactive special effects over a person, creating a sort of augmented-reality dance piece. The video starts by sweeping a camera around a point-cloud rendering of Sunny captured with a Kinect v2. The graphics were created in TouchDesigner, and the particles were generated from shapes attached to Sunny’s virtual skeleton. Below are more details about the interactivity and special effects.


Interactivity

Three things are being collected from Sunny: 1) the position of his limbs and 2) the volume of his body, both using a Kinect v2; and 3) the rotation of his hands, using two wristbands built in collaboration with Intel. The limb positions tell TouchDesigner where to place the cubes and particles, and the wristbands control the direction of the particles.
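The mapping described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the actual TouchDesigner network: the joint names, data structures, and the pitch/yaw convention for the wristbands are all assumptions.

```python
import math

# Hypothetical sketch: Kinect joint positions place the emitters,
# and wristband rotation (pitch/yaw in radians) becomes a unit
# direction vector for the particles.

def particle_direction(pitch, yaw):
    """Convert wristband pitch/yaw into a unit direction vector."""
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

def place_emitters(joints):
    """Attach a cube/particle emitter to each tracked limb joint."""
    return {name: {"position": pos} for name, pos in joints.items()}

# Example joint data (meters, camera space) -- illustrative values only.
joints = {"hand_l": (-0.3, 1.2, 0.5), "hand_r": (0.4, 1.1, 0.6)}
emitters = place_emitters(joints)
emitters["hand_r"]["direction"] = particle_direction(0.0, math.pi / 2)
```

In the real piece this per-frame data would drive TouchDesigner operators directly rather than a Python dictionary.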

Special Effects

Occluding the cubes and particles behind Sunny made the virtual elements look more realistic. To do this I had to project the depth information from his body, provided by the Kinect’s depth sensor, into the virtual world and transform it into a mesh. That mesh is not visible in the final rendering, but it effectively places Sunny in the middle of the virtual world. It looks particularly cool at 0:22 (someone told me it reminded them of the iPod ads).
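The core of that projection is standard pinhole back-projection: each depth pixel becomes a 3D point, and the resulting mesh is rendered depth-only so it hides whatever sits behind it. A minimal sketch, assuming approximate Kinect v2 intrinsics (the exact values vary per device):

```python
# Sketch of back-projecting Kinect depth pixels into camera-space
# 3D points for an invisible occluder mesh (pinhole camera model).
# FX/FY/CX/CY are rough Kinect v2 intrinsics, for illustration only.

FX, FY = 365.0, 365.0   # focal lengths in pixels (approximate)
CX, CY = 256.0, 212.0   # principal point for the 512x424 depth image

def depth_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with depth in meters to camera space."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)
```

Connecting neighboring points into triangles yields the mesh; writing it only to the depth buffer (no color) is what makes it an invisible occluder.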

Camera Position Tracking

I could have kept the camera stationary and shot everything from the same angle, but moving the camera around made the virtual objects feel more realistic. However, whenever the physical camera moved, the virtual camera had to move with it, or Sunny’s placement and that of the augmented elements would not have matched. I used NextStage Pro to track the position of the physical camera and then mapped that to the position of the virtual camera. NextStage Pro is pretty cool: you place retro-reflective markers on a surface and the application learns the camera’s X, Y, Z position in relation to those markers. You can see a dozen of them on my wall.
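Conceptually, the mapping from tracked pose to virtual camera is a per-frame copy plus a calibration offset that aligns the tracker’s origin with the virtual scene’s origin. The sketch below is an assumption about how that step could look; the actual NextStage Pro export format and the offset value are hypothetical.

```python
# Sketch: apply the tracked physical-camera pose to the virtual camera.
# CALIBRATION_OFFSET is a hypothetical translation aligning the
# tracker's origin with the virtual scene's origin.

CALIBRATION_OFFSET = (0.0, -1.5, 0.0)

def virtual_camera_pose(tracked_pos, tracked_rot):
    """Offset a tracked camera position into scene space; pass rotation through."""
    x, y, z = tracked_pos
    ox, oy, oz = CALIBRATION_OFFSET
    return {"position": (x + ox, y + oy, z + oz), "rotation": tracked_rot}
```

Running this every frame keeps the virtual camera locked to the physical one, which is what keeps Sunny and the augmented elements aligned as the shot moves.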

Future Directions

To make this look even better I would want to experiment more with lighting: the augmented scene looks much more realistic when the room lighting and the virtual lighting match. Beyond that, more time and a bigger space would make for an exciting and creative project.
Visuals: Yago de Quay
Performer: Sunny Jun Shen
Music: Zackery Wilson