Rendering synthetic objects into legacy photographs

Kevin Karsch, Varsha Hedau, David Forsyth, Derek Hoiem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a method to realistically insert synthetic objects into existing photographs without requiring access to the scene or any additional scene measurements. With a single image and a small amount of annotation, our method creates a physical model of the scene that is suitable for realistically rendering synthetic objects with diffuse, specular, and even glowing materials while accounting for lighting interactions between the objects and the scene. We demonstrate in a user study that synthetic images produced by our method are confusable with real scenes, even for people who believe they are good at telling the difference. Further, our study shows that our method is competitive with other insertion methods while requiring less scene information. We also collected new illumination and reflectance datasets; renderings produced by our system compare well to ground truth. Our system has applications in the movie and gaming industry, as well as home decorating and user content creation, among others.
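The abstract's claim of "accounting for lighting interactions between the objects and the scene" is typically realized with differential rendering: render the estimated scene model twice, with and without the inserted object, and add the difference (shadows, interreflections) back onto the photograph. The sketch below is illustrative of that standard compositing step only, not the paper's exact pipeline; all function and variable names are hypothetical.

```python
import numpy as np

def composite_differential(background, render_with_obj, render_empty, obj_mask):
    """Differential-rendering composite (illustrative sketch).

    background:      original photograph, H x W x 3 floats in [0, 1]
    render_with_obj: rendering of the scene model WITH the synthetic object
    render_empty:    rendering of the same scene model WITHOUT the object
    obj_mask:        H x W, 1 where the synthetic object is directly visible
    """
    if obj_mask.ndim == 2:
        obj_mask = obj_mask[..., None]  # broadcast over color channels
    # Where the object is visible, take its rendered pixels directly.
    # Elsewhere, add the render difference (cast shadows, bounced light)
    # to the original photograph, so scene pixels untouched by the object
    # pass through unchanged.
    out = (obj_mask * render_with_obj
           + (1 - obj_mask) * (background + render_with_obj - render_empty))
    return np.clip(out, 0.0, 1.0)
```

A pixel outside the object where both renders agree reproduces the photo exactly; a pixel the object darkens in the render comes out darkened by the same amount.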

Original language: English (US)
Title of host publication: Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11
State: Published - Dec 1 2011
Event: 2011 SIGGRAPH Asia Conference, SA'11 - Hong Kong, China
Duration: Dec 12 2011 - Dec 15 2011

Publication series

Name: Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11

Other

Other: 2011 SIGGRAPH Asia Conference, SA'11
Country: China
City: Hong Kong
Period: 12/12/11 - 12/15/11

Keywords

  • Computational photography
  • Image-based rendering
  • Light estimation
  • Photo editing

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Karsch, K., Hedau, V., Forsyth, D., & Hoiem, D. (2011). Rendering synthetic objects into legacy photographs. In Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11 [157] (Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11).

@inproceedings{914151748f1d4d109fabd251a818b4f6,
title = "Rendering synthetic objects into legacy photographs",
abstract = "We propose a method to realistically insert synthetic objects into existing photographs without requiring access to the scene or any additional scene measurements. With a single image and a small amount of annotation, our method creates a physical model of the scene that is suitable for realistically rendering synthetic objects with diffuse, specular, and even glowing materials while accounting for lighting interactions between the objects and the scene. We demonstrate in a user study that synthetic images produced by our method are confusable with real scenes, even for people who believe they are good at telling the difference. Further, our study shows that our method is competitive with other insertion methods while requiring less scene information. We also collected new illumination and reflectance datasets; renderings produced by our system compare well to ground truth. Our system has applications in the movie and gaming industry, as well as home decorating and user content creation, among others.",
keywords = "Computational photography, Image-based rendering, Light estimation, Photo editing",
author = "Kevin Karsch and Varsha Hedau and David Forsyth and Derek Hoiem",
year = "2011",
month = "12",
day = "1",
language = "English (US)",
isbn = "9781450308076",
series = "Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11",
booktitle = "Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11",
}

TY - GEN

T1 - Rendering synthetic objects into legacy photographs

AU - Karsch, Kevin

AU - Hedau, Varsha

AU - Forsyth, David

AU - Hoiem, Derek

PY - 2011/12/1

Y1 - 2011/12/1

N2 - We propose a method to realistically insert synthetic objects into existing photographs without requiring access to the scene or any additional scene measurements. With a single image and a small amount of annotation, our method creates a physical model of the scene that is suitable for realistically rendering synthetic objects with diffuse, specular, and even glowing materials while accounting for lighting interactions between the objects and the scene. We demonstrate in a user study that synthetic images produced by our method are confusable with real scenes, even for people who believe they are good at telling the difference. Further, our study shows that our method is competitive with other insertion methods while requiring less scene information. We also collected new illumination and reflectance datasets; renderings produced by our system compare well to ground truth. Our system has applications in the movie and gaming industry, as well as home decorating and user content creation, among others.

KW - Computational photography

KW - Image-based rendering

KW - Light estimation

KW - Photo editing

UR - http://www.scopus.com/inward/record.url?scp=84855435661&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84855435661&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84855435661

SN - 9781450308076

T3 - Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11

BT - Proceedings of the 2011 SIGGRAPH Asia Conference, SA'11

ER -