FEATS: Synthetic feature tracks for structure from motion evaluation

Joseph Degol, Jae Yong Lee, Rajbir Kataria, Daniel Yuan, Timothy Bretl, Derek Hoiem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present FEATS (Feature Extraction and Tracking Simulator), which synthesizes feature tracks from a camera trajectory and scene geometry (e.g., CAD models, multi-view stereo). We introduce 2D feature and matching noise models that can be controlled with a few parameters. We also provide a new dataset of images and ground-truth camera poses. We process this data (and a synthetic version) with several current SfM algorithms and show that the synthetic tracks are representative of the real tracks. We then show two practical uses of FEATS: (1) we generate hundreds of trajectories with varying noise and show that COLMAP is more robust to noise than OpenSfM and VisualSfM; and (2) we calculate 3D point error and show that accurate camera pose estimates do not guarantee accurate 3D maps.
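To make the idea concrete, the sketch below illustrates the kind of synthesis the abstract describes: projecting known 3D scene points along a camera trajectory and corrupting the resulting 2D feature tracks with simple, parameter-controlled noise. This is not the FEATS implementation; the function name and parameters (sigma_px, drop_rate) are hypothetical stand-ins for the noise-model knobs the paper mentions.

```python
# Minimal illustrative sketch (not the FEATS code): synthesize noisy 2D feature
# tracks from ground-truth geometry and a camera trajectory.
import numpy as np

def synthesize_tracks(points_3d, poses, K, sigma_px=0.5, drop_rate=0.1, seed=0):
    """points_3d: (N, 3) world points; poses: list of (R, t) world-to-camera;
    K: (3, 3) intrinsics. Returns tracks[point_id][frame_id] = noisy (u, v)."""
    rng = np.random.default_rng(seed)
    tracks = {i: {} for i in range(len(points_3d))}
    for f, (R, t) in enumerate(poses):
        # Transform points into the camera frame; keep those in front of the camera.
        cam_pts = points_3d @ R.T + t
        for i, Xc in enumerate(cam_pts):
            if Xc[2] <= 0:
                continue
            # Pinhole projection to pixel coordinates.
            uv = (K @ (Xc / Xc[2]))[:2]
            # Feature-localization noise: isotropic Gaussian jitter in pixels.
            uv_noisy = uv + rng.normal(0.0, sigma_px, size=2)
            # Matching noise: randomly drop an observation, breaking the track.
            if rng.random() < drop_rate:
                continue
            tracks[i][f] = uv_noisy
    return tracks
```

Sweeping parameters such as sigma_px and drop_rate over many trajectories is the sort of controlled experiment the abstract describes when comparing the noise robustness of COLMAP, OpenSfM, and VisualSfM.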

Original language: English (US)
Title of host publication: Proceedings - 2018 International Conference on 3D Vision, 3DV 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 352-361
Number of pages: 10
ISBN (Electronic): 9781538684252
DOIs
State: Published - Oct 12 2018
Event: 6th International Conference on 3D Vision, 3DV 2018 - Verona, Italy
Duration: Sep 5 2018 - Sep 8 2018

Publication series

Name: Proceedings - 2018 International Conference on 3D Vision, 3DV 2018

Other

Other: 6th International Conference on 3D Vision, 3DV 2018
Country/Territory: Italy
City: Verona
Period: 9/5/18 - 9/8/18

Keywords

  • 3D Reconstruction
  • Feature Modeling
  • Noise Modeling
  • SfM
  • Simulation
  • Structure from Motion

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Computer Vision and Pattern Recognition
