Scale-Consistent Fusion: From Heterogeneous Local Sampling to Global Immersive Rendering

Wenpeng Xing, Jie Chen*, Zaifeng Yang, Qiang Wang, Yike Guo

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

3 Citations (Scopus)

Abstract

Image-based geometric modeling and novel view synthesis from sparse, large-baseline samplings are challenging but important tasks for emerging multimedia applications such as virtual reality and immersive telepresence. Existing methods fail to produce satisfactory results due to limitations in inferring reliable depth information under such challenging reference conditions. With the popularization of commercial light field (LF) cameras, capturing LF images (LFIs) is as convenient as taking regular photos, and geometry information can be reliably inferred. This inspires us to use a sparse set of LF captures to render high-quality novel views globally. However, fusing LF captures from multiple angles is challenging due to the scale inconsistency caused by varying capture settings. To overcome this challenge, we propose a novel scale-consistent volume rescaling algorithm that robustly aligns the disparity probability volumes (DPV) among different captures for scale-consistent global geometry fusion. Based on the fused DPV projected to the target camera frustum, we propose novel learning-based modules (i.e., an attention-guided multi-scale residual fusion module and a disparity field-guided deep re-regularization module) that comprehensively regularize noisy observations from heterogeneous captures for high-quality rendering of novel LFIs. Both quantitative and qualitative experiments on the Stanford Lytro Multi-view LF dataset show that the proposed method significantly outperforms state-of-the-art methods under different experiment settings for disparity inference and LF synthesis.
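To make the scale-inconsistency problem concrete, the sketch below illustrates one plausible form of DPV rescaling: each pixel of a disparity probability volume holds a distribution over disparity hypotheses, and captures taken with different baselines express the same scene depth on different disparity scales. This is only a minimal illustration under assumed conventions (the function name `rescale_dpv`, the mapping "target disparity d corresponds to source disparity d/scale", and the brute-force per-pixel interpolation are all hypothetical, not the paper's actual algorithm).

```python
import numpy as np

def rescale_dpv(dpv, disp_levels, scale):
    """Resample a disparity probability volume (DPV) onto a common
    disparity scale. Hypothetical illustration: for each pixel, the
    distribution over `disp_levels` is re-interpolated at the source
    disparities `disp_levels / scale` and then renormalized.

    dpv:         (H, W, D) per-pixel probabilities over disparity levels
    disp_levels: (D,) monotonically increasing disparity hypotheses
    scale:       source-to-target disparity scale factor (e.g. baseline ratio)
    """
    H, W, D = dpv.shape
    # Source disparities that land on the target levels after scaling.
    src = np.asarray(disp_levels, dtype=float) / scale
    out = np.empty_like(dpv, dtype=float)
    for i in range(H):
        for j in range(W):
            # Linear resampling of one pixel's probability curve;
            # probabilities outside the sampled range are treated as 0.
            out[i, j] = np.interp(src, disp_levels, dpv[i, j],
                                  left=0.0, right=0.0)
    # Renormalize so each pixel's distribution sums to 1 again.
    out /= np.clip(out.sum(axis=-1, keepdims=True), 1e-8, None)
    return out
```

For example, a distribution peaked at disparity 2 in a capture with half the target baseline (scale = 2) ends up peaked near disparity 4 after rescaling, so the two captures' geometry estimates can be fused on a common scale.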
Original language: English
Pages (from-to): 6109-6123
Number of pages: 15
Journal: IEEE Transactions on Image Processing
Volume: 31
Early online date: 16 Sept 2022
DOIs
Publication status: Published - Dec 2022

User-Defined Keywords

  • Novel view synthesis
  • light field
  • disparity probability volumes rescaling
  • spatial-angular re-regularization
  • multi-scale residual fusion
