[ISEA2016] Artists Talk: Ewelina Bakala, Yaying Zhang & Philippe Pasquier — Mavi: Movement Aesthetic Visualization Tool and its Use for Dance Video Making and Prototyping

Artists' Statement

Introduction
Dance and video are two fields in constant exchange. Over time they have been combined in many different modalities: video recordings of dance, dance as part of a film, dance documentaries, movies about dance, cinedance and, more recently, dance video clips [1]. Technological advances have made it possible to work with motion capture data: three-dimensional body movement data, along with a wide variety of features such as joint positions, velocities, and accelerations, which describe a dance practice numerically, can be easily interpreted by a computer and used for artistic video rendering [2]. We explore this kind of data for the purpose of generating video sequences and present MAVi, a new video creation tool that allows movement data visualization, real-time manipulation, and recording.

MAVi: System Description
The main objectives of the tool were to integrate motion capture or animation data files, or real-time Kinect 1 data; to visualize the movement data; to allow real-time control of the visualization; and to record the visualization frame by frame. We selected triangulation as the visualization modality in order to explore its aesthetic potential for visualizing space and movement.
Modules and Implementation
Version 1.0 of the tool implements the following modules:
– Input data: defines the type of data to work with. This can be BVH-formatted skeleton joint data, CSV files with marker positions provided by a Vicon motion capture system [3], or Kinect user-map or skeleton points provided by the SimpleOpenNI library [4].
– Triangles: manipulates properties such as transparency or texture (color scale, pictures, or video) of each individual triangle. It allows pictures or videos to be uploaded for texture generation and controls how many triangles from previous frames remain in the scene.
– Background: manipulates the background surrounding the dancer.
– Auto rotation: controls automatic camera movements.
– Lights setting: manages the lighting of the scene.
– Noise: adds noise to the data points.
– Frame recording: records the generated frames.
We decided to develop our tool in Processing, given its wide use in rapid prototyping and its variety of external libraries.
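To make the pipeline concrete, here is a minimal sketch, in plain Java rather than the actual MAVi/Processing source, of the idea behind the Triangles and Noise modules: a list of 2D joint positions is grouped into triangles (here by the simple assumption of fanning consecutive triples; MAVi's actual triangulation scheme is not specified in this text), and noise is added by jittering each point. All names (`TriangleSketch`, `triangulate`, `addNoise`) are illustrative, not part of the tool.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Illustrative sketch only: groups consecutive joint triples into triangles
// and jitters the points, mirroring the Triangles and Noise modules.
public class TriangleSketch {

    // Return one triangle (as three point indices) per run of three
    // consecutive joints: (0,1,2), (1,2,3), ...
    static int[][] triangulate(int nPoints) {
        List<int[]> tris = new ArrayList<>();
        for (int i = 0; i + 2 < nPoints; i++) {
            tris.add(new int[] { i, i + 1, i + 2 });
        }
        return tris.toArray(new int[0][]);
    }

    // Add uniform noise in [-amp, amp] to each 2D point (seeded for
    // reproducibility; a real-time sketch would reseed per frame).
    static double[][] addNoise(double[][] pts, double amp, long seed) {
        Random rng = new Random(seed);
        double[][] out = new double[pts.length][2];
        for (int i = 0; i < pts.length; i++) {
            out[i][0] = pts[i][0] + (rng.nextDouble() * 2 - 1) * amp;
            out[i][1] = pts[i][1] + (rng.nextDouble() * 2 - 1) * amp;
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] joints = { { 0, 0 }, { 1, 0 }, { 0.5, 1 }, { 1.5, 1 } };
        int[][] tris = triangulate(joints.length);
        System.out.println(tris.length); // 4 joints yield 2 triangles
        double[][] noisy = addNoise(joints, 0.05, 42L);
        System.out.println(noisy.length);
    }
}
```

In a Processing sketch the same indices would be drawn each frame with `triangle()` calls and saved with `saveFrame()`; the fan grouping above is only a stand-in for whatever triangulation the tool actually uses.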

Examples of use
Here are some video examples of the use of MAVi's modalities:
Kinect User Map
Color Texture
Background
Video Texture
Internal, an art piece that reflects on the complexity of human beings

References

  1. Noel Carroll, "Toward a Definition of Moving-Picture Dance," Dance Research Journal 33, no. 1 (2001): 46-61.
  2. Berto Gonzalez, et al., “Dance-inspired technology, technology-inspired dance,” Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design. ACM, (2012): 398-407.
  3. “Motion Capture Systems | VICON.” Accessed: December 01, 2015. vicon.com.
  4. “simple-openni – OpenNI library for Processing – Google Project Hosting.” Accessed: December 01, 2015. https://code.google.com/p/simple-openni/.
  • Ewelina Bakala, Yaying Zhang & Philippe Pasquier, School of Interactive Arts and Technology, Simon Fraser University, Vancouver, Canada