Adult2Child: Motion Style Transfer using CycleGANs

Yuzhu Dong, Andreas Aristidou, Ariel Shamir, Moshe Mahler, Eakta Jain

ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG'20, October 2020.

This paper presents an effective style translation method that transfers adult motion capture data to the style of child motion using CycleGANs. Our method allows training on unpaired data, using a relatively small number of child and adult motion sequences that are not required to be temporally aligned. We have also captured high-quality adult and child 3D motion capture data that are publicly available for future studies.

[DOI] [paper] [Adult2Child Motion Database] [bibtex]
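At its core, the method applies the CycleGAN objective to short motion clips: two generators translate between the adult and child domains, two discriminators judge each domain, and a cycle-consistency term ties the unpaired data together. The sketch below (PyTorch) illustrates this setup; all module definitions, names, and hyperparameters (e.g., the 69-channel joint-angle layout) are illustrative stand-ins, not the authors' released code.

    import torch
    import torch.nn as nn

    class MotionTranslator(nn.Module):
        """Placeholder generator: maps a motion word (channels x frames) to the
        other style domain. The paper redesigns this network to extract
        meaningful motion features; this is only a stand-in."""
        def __init__(self, channels=69, hidden=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(channels, hidden, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.Conv1d(hidden, channels, kernel_size=7, padding=3),
            )

        def forward(self, x):  # x: (batch, channels, frames)
            return self.net(x)

    class MotionCritic(nn.Module):
        """Placeholder discriminator: scores whether a motion word looks like
        a real sample from its domain."""
        def __init__(self, channels=69, hidden=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(channels, hidden, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
                nn.Flatten(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x):
            return self.net(x)

    def cyclegan_losses(adult, child, G_a2c, G_c2a, D_adult, D_child,
                        lam_cycle=10.0):
        """Unpaired generator losses: adversarial terms push translations into
        the target domain; cycle terms require adult -> child -> adult (and
        child -> adult -> child) to reconstruct the input."""
        bce = nn.BCEWithLogitsLoss()
        fake_child = G_a2c(adult)
        fake_adult = G_c2a(child)
        # Generators try to fool the critics of the target domain.
        score_c = D_child(fake_child)
        score_a = D_adult(fake_adult)
        adv = bce(score_c, torch.ones_like(score_c)) \
            + bce(score_a, torch.ones_like(score_a))
        # Cycle consistency: translating there and back recovers the input.
        cyc = (G_c2a(fake_child) - adult).abs().mean() \
            + (G_a2c(fake_adult) - child).abs().mean()
        return adv + lam_cycle * cyc

Because both directions are trained jointly with the cycle term, no adult clip ever needs a temporally aligned child counterpart, which is what makes the small, unpaired dataset workable.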


Other Related Publications


  • Adult2Child Age Regression Using CycleGANs
    Thomas Domas, Yuzhu Dong, Brendan John, Ariel Shamir, Andreas Aristidou, Eakta Jain
    In Proceedings of the ACM Symposium on Applied Perception (SAP'19), Barcelona, Spain, September 19-20, 2019.
    [paper] [bibtex]

Abstract


Child characters are commonly seen in leading roles in top-selling video games. Previous studies have shown that child motions are perceptually and stylistically different from those of adults. Creating motion for these characters by motion capturing children is uniquely challenging because of confusion, lack of patience, and regulations. Retargeting adult motion, which is much easier to record, onto child skeletons does not capture the stylistic differences. In this paper, we propose that style translation is an effective way to transform adult motion capture data to the style of child motion. Our method is based on CycleGAN, which allows training on a relatively small number of sequences of child and adult motions that do not even need to be temporally aligned. Our adult2child network converts short sequences of motions called motion words from one domain to the other. The network was trained using a motion capture database collected by our team containing 23 locomotion and exercise motions. We conducted a perception study to evaluate the success of style translation algorithms, including our algorithm and recently presented style translation neural networks. Results show that the translated adult motions are recognized as child motions significantly more often than the original adult motions.
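To make the motion-word representation concrete, the following minimal sketch slices a long capture into fixed-length, overlapping windows that can be fed to the network individually; the window length and stride shown are placeholder values, not the paper's settings.

    import numpy as np

    def to_motion_words(frames, word_len=32, stride=16):
        """Slice a full capture (num_frames x num_channels) into overlapping
        fixed-length windows ("motion words")."""
        words = [frames[s:s + word_len]
                 for s in range(0, len(frames) - word_len + 1, stride)]
        return np.stack(words)  # (num_words, word_len, num_channels)

    # Example: a 10-second clip at 120 fps with 69 joint-angle channels
    clip = np.random.randn(1200, 69)
    words = to_motion_words(clip)
    print(words.shape)  # (74, 32, 69)

Windowing this way lets the network learn local spatio-temporal style cues while keeping training examples plentiful even with a small number of captured sequences.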

The main contributions of this work include:

  • Architecture: We are the first to adapt a CycleGAN architecture for motion style transfer in such a way that the neural network is able to alter the timing of the motion. We redesigned the generators and the discriminators to extract meaningful features from motion inputs. Our demonstration of this adapted architecture opens the path forward for style transfer networks that do not need temporally aligned data. We further demonstrate the advantage of temporal coherence loss terms within the CycleGAN framework to create natural and smooth output motions; a sketch of such a term follows this list.
  • Representation: We espouse joint angles as an animation-centric representation scheme for this architecture and task. This representation sets us apart from previous machine learning-centric work that has used joint positions to make it easier to train the network. An animation-centric approach, in contrast, looks ahead to how the output of the neural network will be bound to a skeleton and skinned. We further add to the evidence in favor of motion words as a motion representation scheme that can encode both temporal and spatial changes.
  • Dataset: We release a high-quality dataset of children's movements on a publicly accessible repository. This dataset is the first of its kind: it captures, via an optical motion capture system, the natural behavior of preteen children responding to verbal prompts.
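The temporal coherence term referenced in the first contribution can be illustrated with a simple smoothness penalty on the generator output. This is a hedged sketch of one plausible formulation (a finite-difference acceleration penalty), not necessarily the exact loss used in the paper.

    import torch

    def temporal_coherence_loss(motion):
        """Penalize abrupt frame-to-frame changes so translated motions stay
        smooth. motion: (batch, frames, channels) joint-angle sequence."""
        vel = motion[:, 1:] - motion[:, :-1]   # finite-difference velocity
        acc = vel[:, 1:] - vel[:, :-1]         # finite-difference acceleration
        return acc.pow(2).mean()

A term like this would be added to the adversarial and cycle-consistency losses with its own weight, discouraging the high-frequency jitter that adversarial training alone can introduce.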


Acknowledgments


This work was supported in part by the Gartner Group Graduate Fellowship; the Israel Science Foundation (grant No. 1390/19); the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No. 739578; and the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development.
