Controllable Radiance Fields for Dynamic Face Synthesis

Peiye Zhuang, Liqian Ma, Sanmi Koyejo, Alexander Schwing

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recent work on 3D-aware image synthesis has achieved compelling results using advances in neural rendering. However, 3D-aware synthesis of face dynamics has not received much attention. Here, we study how to explicitly control generative model synthesis of face dynamics exhibiting non-rigid motion (e.g., facial expression change), while simultaneously ensuring 3D-awareness. For this, we propose a Controllable Radiance Field (CoRF): 1) Motion control is achieved by embedding motion features within the layered latent motion space of a style-based generator; 2) To ensure consistency of the background, motion features, and subject-specific attributes such as lighting, texture, shape, albedo, and identity, a face parsing network, a head regressor, and an identity encoder are incorporated. On head image and video data, we show that CoRFs are 3D-aware while enabling editing of identity, viewing directions, and motion.
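
The abstract outlines the CoRF design only at a high level. Below is a minimal, hypothetical sketch (PyTorch) of the general idea of conditioning a style-based radiance-field generator on a separate motion code; it is not the authors' implementation, and all names (MotionConditionedNeRF, mapping, field) and dimensions are illustrative assumptions only.

# Minimal sketch (assumption: not the paper's architecture) of conditioning a
# style-based radiance field on a motion code, separate from the subject latent.
import torch
import torch.nn as nn

class MotionConditionedNeRF(nn.Module):
    def __init__(self, z_dim=64, motion_dim=32, w_dim=128, hidden=128):
        super().__init__()
        # Mapping network: fuses an identity/appearance latent z with a motion
        # code m into a style vector w (assumption: simple concatenation).
        self.mapping = nn.Sequential(
            nn.Linear(z_dim + motion_dim, w_dim), nn.LeakyReLU(0.2),
            nn.Linear(w_dim, w_dim), nn.LeakyReLU(0.2),
        )
        # Radiance-field MLP: maps a 3D point plus the style to density and RGB.
        self.field = nn.Sequential(
            nn.Linear(3 + w_dim, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, 4),  # (sigma, r, g, b)
        )

    def forward(self, points, z, motion):
        # points: (B, N, 3) samples along camera rays; z: (B, z_dim); motion: (B, motion_dim)
        w = self.mapping(torch.cat([z, motion], dim=-1))        # (B, w_dim)
        w = w.unsqueeze(1).expand(-1, points.shape[1], -1)      # broadcast per point
        out = self.field(torch.cat([points, w], dim=-1))        # (B, N, 4)
        sigma, rgb = out[..., :1], torch.sigmoid(out[..., 1:])
        return sigma, rgb

# Usage: vary the motion code with z fixed to change motion without identity,
# and move the camera (i.e., resample ray points) to change the viewpoint.
model = MotionConditionedNeRF()
pts = torch.randn(2, 1024, 3)   # ray samples for two images
z = torch.randn(2, 64)          # subject latent
m = torch.randn(2, 32)          # motion latent
sigma, rgb = model(pts, z, m)
print(sigma.shape, rgb.shape)   # torch.Size([2, 1024, 1]) torch.Size([2, 1024, 3])

Because geometry lives in a 3D radiance field and motion is a separate input to the style mapping, edits to identity, viewing direction, and motion remain independent, which is the kind of disentangled, 3D-aware control the abstract describes.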

Original language: English (US)
Title of host publication: Proceedings - 2022 International Conference on 3D Vision, 3DV 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 646-656
Number of pages: 11
ISBN (Electronic): 9781665456708
DOIs
State: Published - 2022
Event: 10th International Conference on 3D Vision, 3DV 2022 - Prague, Czech Republic
Duration: Sep 12, 2022 - Sep 15, 2022

Publication series

Name: Proceedings - 2022 International Conference on 3D Vision, 3DV 2022

Conference

Conference: 10th International Conference on 3D Vision, 3DV 2022
Country/Territory: Czech Republic
City: Prague
Period: 9/12/22 - 9/15/22

Keywords

  • 3D aware
  • Faces
  • GANs
  • NeRFs

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing
