Abstract
In this paper, we propose a new data-driven framework for transferring emotion in 3D hand and full-body motions. Specifically, we formulate the motion synthesis task as an image-to-image translation problem: by representing a motion sequence as an image, our framework transfers emotion using StarGAN. To evaluate the effectiveness of our method, we first conducted a user study to validate the perceived emotion of the captured and synthesized hand motions. We further evaluate the synthesized hand and full-body motions qualitatively and quantitatively. Experimental results show that our synthesized motions are comparable, in terms of naturalness and visual quality, to the captured motions and to those created by an existing method.
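As a rough illustration of the motion-as-image formulation, the sketch below packs a skeletal clip into a 2D array whose rows are per-joint coordinate channels and whose columns are frames, giving an image-like input that an image-to-image network such as StarGAN can consume. The channel layout and the per-channel normalization to [-1, 1] are assumptions made for illustration; the abstract does not specify the exact encoding.

```python
import numpy as np

def motion_to_image(motion: np.ndarray) -> np.ndarray:
    """Pack a skeletal motion clip into an image-like 2D array.

    motion: array of shape (num_frames, num_joints, 3) holding
            per-frame 3D joint positions (or joint rotations).
    Returns an array of shape (num_joints * 3, num_frames),
    normalized per channel to [-1, 1], the input range commonly
    used by image-translation GANs such as StarGAN.
    """
    num_frames = motion.shape[0]
    # Rows are joint coordinate channels; columns are frames.
    img = motion.reshape(num_frames, -1).T
    # Per-channel min-max normalization to [-1, 1] (an assumed
    # choice; the paper's exact normalization is not given here).
    lo = img.min(axis=1, keepdims=True)
    hi = img.max(axis=1, keepdims=True)
    return 2.0 * (img - lo) / np.maximum(hi - lo, 1e-8) - 1.0

# Example: a 64-frame clip of a 21-joint hand skeleton becomes a
# 63 x 64 "image" suitable for an image-to-image translation model.
clip = np.random.randn(64, 21, 3)
print(motion_to_image(clip).shape)  # (63, 64)
```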
| Original language | English |
| --- | --- |
| Article number | 38 |
| Number of pages | 19 |
| Journal | Computers |
| Volume | 10 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 22 Mar 2021 |
Scopus Subject Areas
- Human-Computer Interaction
- Computer Networks and Communications
User-Defined Keywords
- Body motion
- Emotion
- Generative adversarial network
- Hand animation
- Motion capture
- Skeletal motion
- Style transfer
- User study