Crowdsourcing BIM-guided collection of construction material library from site photologs

Research output: Contribution to journal › Article › peer-review

Abstract

Background: With advances in technologies that enable massive visual data collection and BIM, the AEC industry now has an unprecedented amount of visual data (e.g., images and videos) and BIMs. Past efforts to leverage these data include the Construction Material Library (CML), which was created to infer construction progress by automatically detecting construction materials. The CML covers a limited number of construction material classes because it is nearly impossible for an individual or a small group of researchers to collect all possible variations of construction materials. Methods: This paper proposes a web-based platform that streamlines the data collection process for creating annotated material patches guided by BIM overlays. Results: Construction site images with BIM overlays are automatically generated after image-based 3D reconstruction, and these images are deployed on a web-based platform for annotation. Conclusion: The proposed crowdsourcing method has the potential to scale up data collection for expanding the existing CML. A case study was conducted to validate the feasibility of the proposed method and to improve the web interface before deployment to a public cloud environment.

Original language: English (US)
Article number: 14
Journal: Visualization in Engineering
Volume: 5
Issue number: 1
DOIs
State: Published - Dec 1 2017
Externally published: Yes

Keywords

  • BIM
  • Construction material library
  • Crowdsource
  • Machine learning
  • Photogrammetry

ASJC Scopus subject areas

  • Modeling and Simulation
  • Engineering (miscellaneous)
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design
