Interactive partner control in close interactions for real-time applications

Shu Lim HO*, Chun Pong CHAN, Taku Komura, Howard Leung

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

37 Citations (Scopus)

Abstract

This article presents a new framework for synthesizing the motion of a virtual character in response to the actions performed by a user-controlled character in real time. In particular, the proposed method can handle scenes in which the characters are closely interacting with each other, such as in partner dancing and fighting. In such interactions, automatically coordinating the virtual character with the human player is extremely difficult because the system has to predict the intention of the player character. In addition, style variations across users affect the accuracy of recognizing the movements of the player character when determining the responses of the virtual character. To solve these problems, our framework makes use of a spatial relationship-based representation of the body parts called the interaction mesh, which has proven effective for motion adaptation. The method is computationally efficient, enabling real-time character control for interactive applications. We demonstrate its effectiveness and versatility in synthesizing a wide variety of motions with close interactions.
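For readers unfamiliar with the interaction mesh mentioned in the abstract, the sketch below illustrates the general idea: the joint positions of both characters are pooled and tetrahedralized, and the resulting edges encode spatial relationships between body parts of the two bodies. This is an illustrative reconstruction in Python, not the authors' implementation; the joint arrays, the scipy-based Delaunay tetrahedralization, and the function name are assumptions made for the example.

```python
# Illustrative sketch of an interaction-mesh-style structure (not the
# authors' implementation). Two characters are given as arrays of 3D joint
# positions; a Delaunay tetrahedralization over the pooled joints supplies
# edges that link body parts within and across the two characters.
import numpy as np
from scipy.spatial import Delaunay

def build_interaction_mesh(joints_a: np.ndarray, joints_b: np.ndarray):
    """Pool the joints of both characters and tetrahedralize them.

    joints_a, joints_b: (N, 3) and (M, 3) arrays of joint positions.
    Returns the pooled points and the set of mesh edges (index pairs).
    """
    points = np.vstack([joints_a, joints_b])   # (N + M, 3) pooled joints
    tets = Delaunay(points).simplices          # (T, 4) tetrahedron indices
    edges = set()
    for tet in tets:
        for i in range(4):
            for j in range(i + 1, 4):
                edges.add((min(tet[i], tet[j]), max(tet[i], tet[j])))
    return points, sorted(edges)

# Example with made-up joint positions for two closely interacting characters.
rng = np.random.default_rng(0)
char_a = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.3, size=(15, 3))
char_b = rng.normal(loc=[0.8, 0.0, 0.0], scale=0.3, size=(15, 3))
points, edges = build_interaction_mesh(char_a, char_b)
print(f"{len(points)} vertices, {len(edges)} interaction-mesh edges")
```

In motion adaptation work based on this representation, preserving the shape of such a mesh (for example, via a Laplacian-style deformation energy over its edges) is what keeps the spatial relationships between the two characters intact as their poses change; the sketch above only shows how the connectivity itself can be obtained.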

Original language: English
Article number: 21
Journal: ACM Transactions on Multimedia Computing, Communications and Applications
Volume: 9
Issue number: 3
DOIs
Publication status: Published - Jun 2013

Scopus Subject Areas

  • Hardware and Architecture
  • Computer Networks and Communications

User-Defined Keywords

  • Character animation
  • Close interactions
  • Motion capture
  • Virtual partner
