Emotion Transfer for 3D Hand Motion using StarGAN

Jacky C.P. Chan, Ana Sabina Irimia, Edmond S.L. Ho*

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

2 Citations (Scopus)


In this paper, we propose a new data-driven framework for 3D hand motion emotion transfer. Specifically, we first capture high-quality hand motion using VR gloves. The hand motion data is then annotated with the emotion type and converted to images to facilitate the motion synthesis process, and the new dataset will be made publicly available. To the best of our knowledge, this is the first public dataset with annotated hand motions. We further formulate emotion transfer for 3D hand motion as an image-to-image translation problem, which we solve by adapting the StarGAN framework. Given a target emotion type and an unseen input motion, our framework synthesizes new motions. Experimental results show that our framework produces high-quality and consistent hand motions.
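The motion-to-image conversion described in the abstract can be sketched as follows. The array layout (one image row per frame, one column per joint, RGB channels holding x/y/z coordinates) and the per-clip normalisation are assumptions for illustration; the paper does not specify its exact encoding here:

```python
import numpy as np

def motion_to_image(motion, lo=None, hi=None):
    """Map a hand-motion clip (frames x joints x 3 coords) to an
    image-like uint8 array: one row per frame, one column per joint,
    the three channels holding x/y/z. Normalisation bounds are taken
    per clip unless supplied (an assumption, not the paper's method).
    """
    m = np.asarray(motion, dtype=np.float64)
    lo = m.min() if lo is None else lo
    hi = m.max() if hi is None else hi
    # Rescale coordinates into [0, 255] for an image representation.
    scaled = (m - lo) / max(hi - lo, 1e-8) * 255.0
    return scaled.astype(np.uint8)

# Example: a clip of 64 frames with 21 hand joints in 3D.
clip = np.random.rand(64, 21, 3)
img = motion_to_image(clip)
print(img.shape)  # (64, 21, 3)
```

Such an image could then be fed to a StarGAN-style generator conditioned on the target emotion label, with the inverse mapping recovering joint coordinates from the synthesized image.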

Original language: English
Title of host publication: Computer Graphics and Visual Computing, CGVC 2020 - Proceedings
Editors: Panagiotis D. Ritsos, Kai Xu
Publisher: The Eurographics Association
Number of pages: 8
ISBN (Electronic): 9783038681229
Publication status: Published - 10 Sept 2020
Event: 2020 Computer Graphics and Visual Computing, CGVC 2020 - Virtual, London, United Kingdom
Duration: 10 Sept 2020 - 11 Sept 2020

Publication series

Name: Computer Graphics and Visual Computing, CGVC - Proceedings


Conference: 2020 Computer Graphics and Visual Computing, CGVC 2020
Country/Territory: United Kingdom

Scopus Subject Areas

  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
  • Computer Graphics and Computer-Aided Design
  • Computer Science Applications

User-Defined Keywords

  • Emotion
  • Generative adversarial network
  • Hand animation
  • Motion capture
  • Style transfer
