Temporally coherent completion of dynamic video

Jia Bin Huang, Sing Bing Kang, Narendra Ahuja, Johannes Kopf

Research output: Contribution to journal › Article

Abstract

We present an automatic video completion algorithm that synthesizes missing regions in videos in a temporally coherent fashion. Our algorithm can handle dynamic scenes captured using a moving camera. State-of-the-art approaches have difficulties handling such videos because viewpoint changes cause image-space motion vectors in the missing and known regions to be inconsistent. We address this problem by jointly estimating optical flow and color in the missing regions. Using pixel-wise forward/backward flow fields enables us to synthesize temporally coherent colors. We formulate the problem as a non-parametric patch-based optimization. We demonstrate our technique on numerous challenging videos.
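
The sketch below is a heavily simplified, hypothetical illustration of the flow-guided idea summarized in the abstract, not the paper's actual algorithm: it completes the forward flow inside a hole by diffusing flow in from the hole boundary, then pulls colors into the hole from the next frame along the completed flow. The paper instead alternates flow and color estimation inside a multi-scale, non-parametric patch-based optimization and uses both forward and backward flow fields; every function name and parameter here is an assumption made for illustration only.

# Hypothetical, simplified sketch of flow-guided hole filling (not the
# authors' implementation): diffuse known flow into the hole, then fetch
# colors from the next frame at the flow-displaced positions.
import numpy as np


def diffuse_flow_into_hole(flow, hole, n_iters=500):
    """Fill flow vectors inside `hole` by repeated 4-neighbor averaging.

    flow : (H, W, 2) float array of forward flow, valid outside the hole.
    hole : (H, W) bool array, True where flow and color are missing.
    """
    f = flow.copy()
    f[hole] = 0.0
    for _ in range(n_iters):
        # Edge-replicated shifts of the current estimate.
        up = np.roll(f, 1, axis=0)
        up[0] = f[0]
        down = np.roll(f, -1, axis=0)
        down[-1] = f[-1]
        left = np.roll(f, 1, axis=1)
        left[:, 0] = f[:, 0]
        right = np.roll(f, -1, axis=1)
        right[:, -1] = f[:, -1]
        avg = 0.25 * (up + down + left + right)
        # Only unknown pixels are updated; known flow stays fixed.
        f[hole] = avg[hole]
    return f


def pull_colors_along_flow(frame_cur, frame_next, flow, hole):
    """Fill hole pixels of `frame_cur` with colors fetched from `frame_next`
    at the positions the completed forward flow points to."""
    H, W = hole.shape
    filled = frame_cur.copy()
    ys, xs = np.nonzero(hole)
    tx = np.clip(np.round(xs + flow[ys, xs, 0]).astype(int), 0, W - 1)
    ty = np.clip(np.round(ys + flow[ys, xs, 1]).astype(int), 0, H - 1)
    filled[ys, xs] = frame_next[ty, tx]
    return filled


if __name__ == "__main__":
    # Toy example: a horizontal gradient translating 3 px to the right,
    # with a square hole in the current frame.
    H, W = 64, 64
    xx = np.tile(np.linspace(0.0, 1.0, W), (H, 1))
    frame_cur = np.stack([xx] * 3, axis=-1)
    frame_next = np.roll(frame_cur, 3, axis=1)
    flow = np.zeros((H, W, 2))
    flow[..., 0] = 3.0                      # true forward flow is (+3, 0)
    hole = np.zeros((H, W), dtype=bool)
    hole[24:40, 24:40] = True
    flow_hat = diffuse_flow_into_hole(flow, hole)
    result = pull_colors_along_flow(frame_cur, frame_next, flow_hat, hole)
    print("max fill error:", np.abs(result - frame_cur)[hole].max())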

Original language: English (US)
Article number: 196
Journal: ACM Transactions on Graphics
Volume: 35
Issue number: 6
DOIs: 10.1145/2980179.2982398
State: Published - Nov 2016

Fingerprint

  • Color
  • Optical flows
  • Flow fields
  • Pixels
  • Cameras

Keywords

  • Patch-based synthesis
  • Video completion

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design

Cite this

Temporally coherent completion of dynamic video. / Huang, Jia Bin; Kang, Sing Bing; Ahuja, Narendra; Kopf, Johannes.

In: ACM Transactions on Graphics, Vol. 35, No. 6, 196, 11.2016.

Research output: Contribution to journal › Article

Huang, Jia Bin; Kang, Sing Bing; Ahuja, Narendra; Kopf, Johannes. / Temporally coherent completion of dynamic video. In: ACM Transactions on Graphics. 2016; Vol. 35, No. 6.
@article{c9885295bf14442da7330c70febb0bc7,
title = "Temporally coherent completion of dynamic video",
abstract = "We present an automatic video completion algorithm that synthesizes missing regions in videos in a temporally coherent fashion. Our algorithm can handle dynamic scenes captured using a moving camera. State-of-the-art approaches have difficulties handling such videos because viewpoint changes cause image-space motion vectors in the missing and known regions to be inconsistent. We address this problem by jointly estimating optical flow and color in the missing regions. Using pixel-wise forward/backward flow fields enables us to synthesize temporally coherent colors. We formulate the problem as a non-parametric patch-based optimization. We demonstrate our technique on numerous challenging videos.",
keywords = "Patch-based synthesis, Video completion",
author = "Huang, {Jia Bin} and Kang, {Sing Bing} and Narendra Ahuja and Johannes Kopf",
year = "2016",
month = "11",
doi = "10.1145/2980179.2982398",
language = "English (US)",
volume = "35",
journal = "ACM Transactions on Computer Systems",
issn = "0730-0301",
publisher = "Association for Computing Machinery (ACM)",
number = "6",

}

TY - JOUR

T1 - Temporally coherent completion of dynamic video

AU - Huang, Jia Bin

AU - Kang, Sing Bing

AU - Ahuja, Narendra

AU - Kopf, Johannes

PY - 2016/11

Y1 - 2016/11

N2 - We present an automatic video completion algorithm that synthesizes missing regions in videos in a temporally coherent fashion. Our algorithm can handle dynamic scenes captured using a moving camera. State-of-the-art approaches have difficulties handling such videos because viewpoint changes cause image-space motion vectors in the missing and known regions to be inconsistent. We address this problem by jointly estimating optical flow and color in the missing regions. Using pixel-wise forward/backward flow fields enables us to synthesize temporally coherent colors. We formulate the problem as a non-parametric patch-based optimization. We demonstrate our technique on numerous challenging videos.

AB - We present an automatic video completion algorithm that synthesizes missing regions in videos in a temporally coherent fashion. Our algorithm can handle dynamic scenes captured using a moving camera. State-of-the-art approaches have difficulties handling such videos because viewpoint changes cause image-space motion vectors in the missing and known regions to be inconsistent. We address this problem by jointly estimating optical flow and color in the missing regions. Using pixel-wise forward/backward flow fields enables us to synthesize temporally coherent colors. We formulate the problem as a non-parametric patch-based optimization. We demonstrate our technique on numerous challenging videos.

KW - Patch-based synthesis

KW - Video completion

UR - http://www.scopus.com/inward/record.url?scp=85031674258&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85031674258&partnerID=8YFLogxK

U2 - 10.1145/2980179.2982398

DO - 10.1145/2980179.2982398

M3 - Article

AN - SCOPUS:85031674258

VL - 35

JO - ACM Transactions on Graphics

JF - ACM Transactions on Graphics

SN - 0730-0301

IS - 6

M1 - 196

ER -