TY - GEN
T1 - New Metrics to Benchmark and Improve BIM Visibility Within a Synthetic Image Generation Process for Computer Vision Progress Tracking
AU - Nunez-Morales, Juan D.
AU - Hsu, Shun Hsiang
AU - Ibrahim, Amir
AU - Golparvar-Fard, Mani
N1 - Publisher Copyright:
© Canadian Society for Civil Engineering 2025.
PY - 2025
Y1 - 2025
N2 - Data collection, particularly ground-truth generation, is crucial for developing computer vision models used in construction progress monitoring applications. The performance of such models relies heavily on data quality, which drives the effectiveness of the underlying machine learning algorithms. If data is not collected and subsequently managed correctly, these algorithms will fail, and the applicability of the models to construction monitoring will be degraded. In the absence of quality data, synthetic image generation using BIM has been widely studied to resolve data insufficiency issues. Because of the domain gap between synthetic and real-world images, most recent works have focused on rendering techniques that enhance the realism of lighting and texture. However, the impact of extrinsic camera parameters, which directly influence how BIM elements are rendered in camera views, remains largely underexplored. This leads to over-reliance on poorly constructed synthetic ground-truth images. As a result, these images, with their often random camera positions and viewpoints, fail to reflect the real-world visual perspectives needed for enterprise-grade solutions for monitoring construction progress. To improve the quality of synthetic construction-environment datasets, this paper explores the integration of per-element visibility metrics to understand how different positional camera parameters affect the synthetic data collection pipeline and the resulting improvement in segmentation model performance. The approach is validated by comparing real-image segmentation accuracy across experiments that use visibility metrics computed from different camera positions and directions. Finally, a discussion of how positional camera parameters can be selected to produce a more efficient and less biased synthetic dataset is presented.
AB - Data collection, particularly ground-truth generation, is crucial for developing computer vision models used in construction progress monitoring applications. The performance of such models relies heavily on data quality, which drives the effectiveness of the underlying machine learning algorithms. If data is not collected and subsequently managed correctly, these algorithms will fail, and the applicability of the models to construction monitoring will be degraded. In the absence of quality data, synthetic image generation using BIM has been widely studied to resolve data insufficiency issues. Because of the domain gap between synthetic and real-world images, most recent works have focused on rendering techniques that enhance the realism of lighting and texture. However, the impact of extrinsic camera parameters, which directly influence how BIM elements are rendered in camera views, remains largely underexplored. This leads to over-reliance on poorly constructed synthetic ground-truth images. As a result, these images, with their often random camera positions and viewpoints, fail to reflect the real-world visual perspectives needed for enterprise-grade solutions for monitoring construction progress. To improve the quality of synthetic construction-environment datasets, this paper explores the integration of per-element visibility metrics to understand how different positional camera parameters affect the synthetic data collection pipeline and the resulting improvement in segmentation model performance. The approach is validated by comparing real-image segmentation accuracy across experiments that use visibility metrics computed from different camera positions and directions. Finally, a discussion of how positional camera parameters can be selected to produce a more efficient and less biased synthetic dataset is presented.
KW - Automated progress monitoring
KW - BIM
KW - Computer vision
KW - Deep learning
KW - Synthetic data
UR - http://www.scopus.com/inward/record.url?scp=85205114046&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85205114046&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-61499-6_16
DO - 10.1007/978-3-031-61499-6_16
M3 - Conference contribution
AN - SCOPUS:85205114046
SN - 9783031614989
T3 - Lecture Notes in Civil Engineering
SP - 209
EP - 221
BT - Proceedings of the Canadian Society for Civil Engineering Annual Conference 2023 - Construction Track
A2 - Desjardins, Serge
A2 - Poitras, Gérard J.
A2 - Nik-Bakht, Mazdak
PB - Springer
T2 - Canadian Society of Civil Engineering Annual Conference, CSCE 2023
Y2 - 24 May 2023 through 27 May 2023
ER -