Improving style transfer with calibrated metrics

Mao Chuang Yeh, Shuai Tang, Anand Bhattad, Chuhang Zou, David Forsyth

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Style transfer produces a transferred image that is a rendering of a content image in the manner of a style image. We seek to understand how to improve style transfer. Doing so requires quantitative evaluation procedures, but current evaluation is qualitative, mostly involving user studies. We describe a novel quantitative evaluation procedure. Our procedure relies on two statistics: the Effectiveness (E) statistic measures the extent to which a given style has been transferred to the target, and the Coherence (C) statistic measures the extent to which the original image's content is preserved. Our statistics are calibrated to human preference: targets with larger values of E and C will reliably be preferred by human subjects in comparisons of style and content, respectively. We use these statistics to investigate the relative performance of a number of Neural Style Transfer (NST) methods, revealing several intriguing properties. Admissible methods lie on a Pareto frontier (i.e., improving E reduces C, or vice versa). Three methods are admissible: Universal style transfer produces very good C but weak E; modifying the optimization used for Gatys' loss produces a method with strong E and strong C; and a modified cross-layer method has slightly better E at strong cost in C. While the histogram loss improves the E statistic of Gatys' method, it does not make the method admissible. Surprisingly, style weights have relatively little effect on EC scores, and most variability in transfer is explained by the style itself (meaning experimenters can be misled by their choice of styles). Our code is available on GitHub.
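The admissibility criterion in the abstract is the standard Pareto-dominance test over the two scores: a method is admissible if no other method is at least as good in both E and C and strictly better in one. A minimal sketch of that test, using hypothetical (E, C) scores invented for illustration (the paper's actual calibrated values are not reproduced here):

```python
# Illustrative sketch, not the authors' code: Pareto-admissibility over
# (E, C) scores, where E measures style effectiveness and C measures
# content coherence, both higher-is-better as described in the abstract.

def admissible(scores):
    """Return names of methods that are not Pareto-dominated in (E, C)."""
    frontier = []
    for name, (e, c) in scores.items():
        dominated = any(
            e2 >= e and c2 >= c and (e2 > e or c2 > c)
            for other, (e2, c2) in scores.items()
            if other != name
        )
        if not dominated:
            frontier.append(name)
    return frontier

# Hypothetical scores for illustration only:
methods = {
    "universal": (0.30, 0.90),    # weak E, very good C
    "gatys-mod": (0.80, 0.80),    # strong E, strong C
    "cross-layer": (0.85, 0.40),  # slightly better E, weak C
    "baseline": (0.50, 0.50),     # dominated by gatys-mod
}
print(admissible(methods))  # -> ['universal', 'gatys-mod', 'cross-layer']
```

Under these made-up numbers, the first three methods trade E against C along a frontier, mirroring the paper's qualitative finding, while the dominated baseline is excluded.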

Original language: English (US)
Title of host publication: Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3149-3157
Number of pages: 9
ISBN (Electronic): 9781728165530
DOI: https://doi.org/10.1109/WACV45572.2020.9093351
State: Published - Mar 2020
Event: 2020 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2020 - Snowmass Village, United States
Duration: Mar 1, 2020 - Mar 5, 2020

Publication series

Name: Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020

Conference

Conference: 2020 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2020
Country: United States
City: Snowmass Village
Period: 3/1/20 - 3/5/20

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition


Cite this

Yeh, M. C., Tang, S., Bhattad, A., Zou, C., & Forsyth, D. (2020). Improving style transfer with calibrated metrics. In Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020 (pp. 3149-3157). [9093351] (Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/WACV45572.2020.9093351