Abstract
The creation of lifelike digital humans is a critical task in entertainment, education, and the Metaverse, motivated by the demand for immersive virtual experiences. This talk introduces our creative pipelines for addressing this challenge, spanning efficient 3D representations that capture human appearance, AI-driven motion capture for realistic dynamics, and physics-based modeling that ensures natural interaction with virtual environments. Our generative rendering techniques bring these characters to life with photorealistic visuals, enabling real-time, interactive digital humans for a range of applications.
| Original language | English |
|---|---|
| Publication status | Published - 11 Feb 2025 |
| Event | HKBU-NVIDIA Joint Symposium 2025: ART-TECH |
| Venue | Hong Kong Baptist University, Hong Kong, China |
| Duration | 11 Feb 2025 → 11 Feb 2025 |
| Internet address | https://www.comp.hkbu.edu.hk/hkbu-nvidia-sym2025/#schedule (Link to conference schedule) |
Symposium
| Symposium | HKBU-NVIDIA Joint Symposium 2025 |
|---|---|
| Country/Territory | Hong Kong, China |
| Period | 11/02/25 → 11/02/25 |
| Internet address | https://www.comp.hkbu.edu.hk/hkbu-nvidia-sym2025/#schedule |
Fingerprint

Dive into the research topics of 'Learning the Appearance, Dynamics, and Physics of Animatable Virtual Avatars'.