Real-time user-guided image colorization with learned deep priors

Richard Zhang, Jun-Yan Zhu, Phillip Isola, Xinyang Geng, Angela S. Lin, Tianhe Yu, Alexei A. Efros

Research output: Contribution to journal › Conference article › peer-review

Abstract

We propose a deep learning approach for user-guided image colorization. The system directly maps a grayscale image, along with sparse, local user "hints", to an output colorization with a Convolutional Neural Network (CNN). Rather than using hand-defined rules, the network propagates user edits by fusing low-level cues with high-level semantic information, learned from large-scale data. We train on a million images, with simulated user inputs. To guide the user towards efficient input selection, the system recommends likely colors based on the input image and current user inputs. The colorization is performed in a single feed-forward pass, enabling real-time use. Even with randomly simulated user inputs, we show that the proposed system helps novice users quickly create realistic colorizations, and offers large improvements in colorization quality with just a minute of use. In addition, we demonstrate that the framework can incorporate other user "hints" to the desired colorization, showing an application to color histogram transfer.
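The abstract's input representation can be sketched concretely: the network receives the grayscale channel stacked with sparse user color hints and a mask marking where hints were placed, and training simulates those hints by revealing the true color at a few random pixels. The sketch below is a simplified illustration under assumed conventions (Lab color space, channel-last layout, uniform random hint sampling); the paper's actual architecture, channel ordering, and hint-sampling scheme may differ.

```python
import numpy as np

def build_network_input(L, hint_ab, hint_mask):
    """Stack grayscale + sparse color hints into a single CNN input.

    L         : (H, W)    grayscale lightness channel
    hint_ab   : (H, W, 2) user-provided ab color values, zero elsewhere
    hint_mask : (H, W)    1 where the user placed a hint, 0 elsewhere
    Returns an (H, W, 4) array: [L, a_hint, b_hint, mask].
    """
    return np.dstack([L[..., None], hint_ab, hint_mask[..., None]])

def simulate_user_hints(ab_true, n_points, rng):
    """Training-time hint simulation (simplified): reveal the ground-truth
    ab color at a few randomly chosen pixels."""
    H, W, _ = ab_true.shape
    hint_ab = np.zeros_like(ab_true)
    hint_mask = np.zeros((H, W))
    ys = rng.integers(0, H, n_points)
    xs = rng.integers(0, W, n_points)
    hint_ab[ys, xs] = ab_true[ys, xs]
    hint_mask[ys, xs] = 1.0
    return hint_ab, hint_mask
```

At interaction time the same `build_network_input` call would be fed real user clicks instead of simulated ones, so one trained network serves both training and the real-time editing loop.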

Original language: English (US)
Article number: 119
Journal: ACM Transactions on Graphics
Volume: 36
Issue number: 4
DOIs
State: Published - Jan 1 2017
Externally published: Yes
Event: ACM SIGGRAPH 2017 - Los Angeles, United States
Duration: Jul 30 2017 - Aug 3 2017

Keywords

  • Colorization
  • Deep learning
  • Edit propagation
  • Interactive colorization
  • Vision for graphics

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
