StyLitGAN: Image-Based Relighting via Latent Control

Anand Bhattad, James Soole, D. A. Forsyth

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We describe a novel method, StyLitGAN, for relighting and resurfacing images in the absence of labeled data. StyLitGAN generates images with realistic lighting effects, including cast shadows, soft shadows, inter-reflections, and glossy effects, without the need for paired or CGI data. StyLitGAN uses an intrinsic image method to decompose an image, followed by a search of the latent space of a pretrained StyleGAN to identify a set of directions. By prompting the model to fix one component (e.g., albedo) and vary another (e.g., shading), we generate relighted images by adding the identified directions to the latent style codes. Quantitative metrics of change in albedo and lighting diversity allow us to choose effective directions using a forward selection process. Qualitative evaluation confirms the effectiveness of our method.
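The abstract's core mechanism — adding discovered latent directions to a style code, with directions chosen by greedy forward selection under albedo-change and lighting-diversity metrics — can be sketched in miniature. This is a toy illustration, not the authors' code: the latent code, candidate directions, and both scoring functions below are hypothetical stand-ins (random vectors and simple norms/similarities) for the paper's image-based metrics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins (not the paper's actual components):
# w ............. a StyleGAN latent style code (here, a random 512-d vector)
# candidate_dirs  candidate latent directions produced by some search
w = rng.normal(size=512)
candidate_dirs = rng.normal(size=(20, 512))

def albedo_change(d):
    # Placeholder metric: penalize large latent moves, standing in for
    # the measured change in albedo of the decoded image.
    return float(np.linalg.norm(d))

def lighting_diversity(selected):
    # Placeholder metric: reward mutually dissimilar directions, standing
    # in for diversity of the generated shading/lighting.
    if len(selected) < 2:
        return 0.0
    D = np.stack(selected)
    D = D / np.linalg.norm(D, axis=1, keepdims=True)
    sims = D @ D.T
    # Lower total pairwise cosine similarity -> higher diversity score.
    return float(-(sims.sum() - len(selected)))

def forward_select(dirs, k=5, alpha=0.1):
    """Greedy forward selection: at each step, add the candidate that most
    improves (lighting diversity - alpha * albedo change)."""
    selected, remaining = [], list(range(len(dirs)))
    for _ in range(k):
        best_i, best_score = None, -np.inf
        for i in remaining:
            trial = selected + [dirs[i]]
            score = lighting_diversity(trial) - alpha * albedo_change(dirs[i])
            if score > best_score:
                best_i, best_score = i, score
        selected.append(dirs[best_i])
        remaining.remove(best_i)
    return selected

chosen = forward_select(candidate_dirs, k=5)
# "Relighting" step: add each selected direction to the style code.
relit_codes = [w + d for d in chosen]
print(len(relit_codes))  # 5
```

In the actual method, the scores would be computed from decoded images (via the intrinsic decomposition), not from the latent vectors themselves; the greedy loop structure is the part this sketch is meant to convey.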

Original language: English (US)
Title of host publication: Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Publisher: IEEE Computer Society
Pages: 4231-4240
Number of pages: 10
ISBN (Electronic): 9798350353006
DOIs
State: Published - 2024
Event: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024 - Seattle, United States
Duration: Jun 16 2024 – Jun 22 2024

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
ISSN (Print): 1063-6919

Conference

Conference: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Country/Territory: United States
City: Seattle
Period: 6/16/24 – 6/22/24

Keywords

  • Generative Models
  • Illumination
  • Image Decomposition
  • Relighting
  • StyleGAN

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition

