DrawingInStyles: Portrait Image Generation and Editing With Spatially Conditioned StyleGAN

  • Wanchao Su
  • Hui Ye
  • Shu-Yu Chen
  • Lin Gao
  • Hongbo Fu*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

24 Citations (Scopus)

Abstract

The research topic of sketch-to-portrait generation has seen rapid progress with deep learning techniques. The recently proposed StyleGAN architectures achieve state-of-the-art generation ability, but the original StyleGAN is not well suited to sketch-based creation because of its unconditional generation nature. To address this issue, we propose a direct conditioning strategy to better preserve spatial information under the StyleGAN framework. Specifically, we introduce Spatially Conditioned StyleGAN (SC-StyleGAN for short), which explicitly injects spatial constraints into the original StyleGAN generation process. We explore two input modalities, sketches and semantic maps, which together allow users to express desired generation results more precisely and easily. Based on SC-StyleGAN, we present DrawingInStyles, a novel drawing interface that enables non-professional users to easily produce high-quality, photo-realistic face images with precise control, either from scratch or by editing existing ones. Qualitative and quantitative evaluations show the superior generation ability of our method over existing and alternative solutions. The usability and expressiveness of our system are confirmed by a user study.
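The core idea of spatial conditioning can be illustrated with a minimal NumPy sketch. This is purely illustrative and not the authors' actual SC-StyleGAN implementation: the encoder, channel sizes, and fusion-by-concatenation scheme below are assumptions chosen to show how an encoded spatial condition (a sketch or semantic map) can be injected into an intermediate generator feature map so that later layers see the spatial layout explicitly.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_condition(cond_map, out_channels=8):
    # Toy stand-in for a condition encoder: a random 1x1 projection
    # followed by ReLU. cond_map: (H, W, C_in) -> (H, W, out_channels).
    h, w, c_in = cond_map.shape
    weight = rng.standard_normal((c_in, out_channels)) * 0.1
    projected = cond_map.reshape(-1, c_in) @ weight
    return np.maximum(projected, 0.0).reshape(h, w, out_channels)

def inject_spatial_condition(gen_features, cond_features):
    # "Direct conditioning" (hypothetical fusion scheme): concatenate the
    # encoded spatial features with intermediate generator features along
    # the channel axis, preserving per-pixel spatial constraints.
    assert gen_features.shape[:2] == cond_features.shape[:2]
    return np.concatenate([gen_features, cond_features], axis=-1)

# A 32x32 intermediate generator feature map with 16 channels.
gen_feat = rng.standard_normal((32, 32, 16))
# A 1-channel binary sketch map at the same spatial resolution.
sketch = (rng.random((32, 32, 1)) > 0.9).astype(np.float32)

cond_feat = encode_condition(sketch)
fused = inject_spatial_condition(gen_feat, cond_feat)
print(fused.shape)  # (32, 32, 24)
```

In a real StyleGAN-style generator the fused features would feed subsequent synthesis blocks, so the drawn strokes or semantic regions constrain where facial structures appear, rather than conditioning only through the global latent code.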

Original language: English
Pages (from-to): 4074-4088
Number of pages: 15
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 29
Issue number: 10
DOIs
Publication status: Published - 30 May 2022

User-Defined Keywords

  • conditional generation
  • data-driven approaches
  • sketch-based portrait generation
  • StyleGAN
  • suggestive interfaces
