Emotion Control of Unstructured Dance Movements

Andreas Aristidou, Qiong Zeng, Efstathios Stavrakis, KangKang Yin, Daniel Cohen-Or, Yiorgos Chrysanthou, Baoquan Chen

ACM SIGGRAPH / Eurographics Symposium on Computer Animation (SCA '17), Eurographics Association, 2017.

We present a motion stylization technique suitable for highly expressive mocap data, such as contemporary dances. The method varies the emotion expressed in a motion by modifying its underlying geometric features. Even non-expert users can stylize dance motions by supplying an emotion modification as the single parameter of our algorithm.

[DOI] [paper] [bibtex] [Supplementary Materials]

The dance motion capture data used can be downloaded from the Dance Motion Capture Database website.


Abstract


Motion capture technology has enabled the acquisition of high quality human motions for animating digital characters with extremely high fidelity. However, despite all the advances in motion editing and synthesis, it remains an open problem to modify pre-captured motions that are highly expressive, such as contemporary dances, for stylization and emotionalization. In this work, we present a novel approach for stylizing such motions by using emotion coordinates defined by Russell's Circumplex Model (RCM). We extract and analyze a large set of body and motion features, based on Laban Movement Analysis (LMA), and choose the effective and consistent features for characterizing the emotions of motions. These features provide a mechanism not only for deriving the emotion coordinates of a newly input motion, but also for stylizing the motion to express a different emotion, without having to reference the training data. Such decoupling of the training data and new input motions eliminates the need for manual processing and motion registration. We implement the two-way mapping between motion features and emotion coordinates through Radial Basis Function (RBF) regression and interpolation, which can stylize freestyle, highly dynamic dance movements at interactive rates. Our results and user studies demonstrate the effectiveness of the stylization framework on a variety of dance movements exhibiting a diverse set of emotions.
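To make the two-way mapping concrete, the minimal sketch below fits a Gaussian-kernel RBF interpolant in both directions: LMA feature vectors to (valence, arousal) coordinates on the RCM diagram, and back. All names, dimensions, the kernel choice, and the toy data are placeholders for illustration; they do not reproduce the paper's actual feature set or regression setup.

    import numpy as np

    def rbf_fit(X, Y, epsilon=1.0):
        # Fit Gaussian-RBF interpolation weights mapping inputs X (n x d_in)
        # to targets Y (n x d_out). Exact interpolation, no smoothing term.
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        Phi = np.exp(-epsilon * d2)              # n x n kernel matrix
        return np.linalg.solve(Phi, Y), X, epsilon

    def rbf_eval(model, Xq):
        # Evaluate the interpolant at query points Xq (m x d_in).
        W, X, epsilon = model
        d2 = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-epsilon * d2) @ W

    # Toy training set: LMA feature vectors with annotated RCM coordinates.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(20, 6))      # 20 clips x 6 features (placeholder)
    emotions = rng.uniform(-1, 1, (20, 2))   # (valence, arousal) in [-1, 1]^2

    fwd = rbf_fit(features, emotions)    # features -> emotion coordinates
    inv = rbf_fit(emotions, features)    # emotion coordinates -> features

    # Derive the emotion of a new motion, then query the target features at
    # a user-picked point on the RCM diagram (e.g. high valence and arousal).
    coords = rbf_eval(fwd, features[:1])
    target_features = rbf_eval(inv, np.array([[0.8, 0.6]]))

In the actual pipeline the target feature vector then drives the modification of the input motion's underlying geometric features; that motion-editing step is specific to the method and omitted from this sketch.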



The main contributions of this work include:

  • We are the first to analyze and control emotions for unstructured motions with an intuitive interface based on the RCM diagram, without the need for manual processing and registration of the training and input data.
  • We present a simple statistical method to identify performer-independent, consistent LMA motion features that are effective for emotion expression (see the sketch after this list).
  • We utilize RBF (Radial Basis Function) regression and interpolation for two-way mapping between motion features and RCM emotion coordinates. This enables stylizing motions with emotions anywhere in the full RCM space, beyond the discrete set of captured emotions.
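One plausible form of such a consistency test, shown purely as an assumption-laden sketch: keep a feature only if the sign of its change, relative to each performer's neutral take, agrees across most performers. The threshold and the toy data below are invented for illustration and are not the paper's actual statistic.

    import numpy as np

    def consistent_features(deltas, threshold=0.8):
        # deltas: n_performers x n_features change of each LMA feature for
        # one emotion, measured against each performer's neutral performance.
        # A feature is kept when at least `threshold` of the performers agree
        # on the direction (sign) of its change.
        signs = np.sign(deltas)
        agreement = np.abs(signs.sum(axis=0)) / deltas.shape[0]
        return np.nonzero(agreement >= threshold)[0]

    # Toy example: 5 performers x 4 features for a single emotion.
    deltas = np.array([[ 0.9, -0.2,  0.4,  0.1],
                       [ 1.1,  0.3,  0.5, -0.2],
                       [ 0.8, -0.1,  0.6,  0.3],
                       [ 1.0, -0.4,  0.2, -0.1],
                       [ 0.7, -0.3,  0.3,  0.2]])
    print(consistent_features(deltas))   # -> [0 2]: only these change consistently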


Acknowledgments


This project was supported by the National Key Research & Development Plan of China (No. 2016YFB1001404) and National Natural Science Foundation of China (No. 61602273); the Israeli Science Foundation; and the European Regional Development Fund and the Republic of Cyprus through the Research Promotion Foundation under contract ΔΙΔΑΚΤΩΡ/0311/73.




© 2017 Andreas Aristidou