Emotion Transfer for 3D Hand and Full-Body Motion Using StarGAN

Jacky Chun Pong Chan, Edmond Shu Lim Ho*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

5 Citations (Scopus)


In this paper, we propose a new data-driven framework for 3D hand and full-body motion emotion transfer. Specifically, we formulate the motion synthesis task as an image-to-image translation problem. By representing a motion sequence as an image, the emotion can be transferred by our framework using StarGAN. To evaluate our proposed method's effectiveness, we first conducted a user study to validate the perceived emotion of the captured and synthesized hand motions. We further evaluated the synthesized hand and full-body motions qualitatively and quantitatively. Experimental results show that our synthesized motions are comparable to the captured motions, and to those created by an existing method, in terms of naturalness and visual quality.
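The abstract's central idea — packing a skeletal motion sequence into an image-like tensor so that a 2D image-to-image network such as StarGAN can translate its emotional style — can be sketched as below. This is a minimal illustration only: the joint-per-row / frame-per-column layout, the Euler-angle input, and the normalization range are assumptions for the sketch, not the authors' actual encoding.

```python
import numpy as np

def motion_to_image(motion, angle_range=(-180.0, 180.0)):
    """Pack a motion clip into an image-like tensor.

    motion: (frames, joints, 3) array of per-joint rotations
    (assumed here to be Euler angles in degrees). Joints become
    image rows, frames become columns, and the 3 rotation
    components map to the 3 channels, giving a (joints, frames, 3)
    "image" normalized to [0, 1] that a 2D translation network
    can consume.
    """
    lo, hi = angle_range
    img = (motion - lo) / (hi - lo)                 # scale to [0, 1]
    return np.clip(img, 0.0, 1.0).transpose(1, 0, 2)

def image_to_motion(img, angle_range=(-180.0, 180.0)):
    """Invert motion_to_image: recover the rotation sequence."""
    lo, hi = angle_range
    return img.transpose(1, 0, 2) * (hi - lo) + lo  # back to angles
```

After translation, the output image is mapped back to joint rotations with the inverse transform, so the synthesized clip can drive the original skeleton.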

Original language: English
Article number: 38
Number of pages: 19
Issue number: 3
Publication status: Published - 22 Mar 2021

Scopus Subject Areas

  • Human-Computer Interaction
  • Computer Networks and Communications

User-Defined Keywords

  • Body motion
  • Emotion
  • Generative adversarial network
  • Hand animation
  • Motion capture
  • Skeletal motion
  • Style transfer
  • User study
