TY - GEN
T1 - High-precision and infrastructure-independent mobile augmented reality system for context-aware construction and facility management applications
AU - Bae, Hyojoon
AU - Golparvar-Fard, Mani
AU - White, Jules
PY - 2013
Y1 - 2013
N2 - This paper presents a new context-aware mobile augmented reality system that provides rapid and robust on-site access to construction project information such as drawings, specifications, schedules, and budgets. The mobile augmented reality system does not need any RF-based location tracking (e.g., GPS or WLAN) or optical fiducial markers to track a user's position. Rather, the user's location and orientation are derived automatically and solely by comparing photographs from the user's phone to a 3D point cloud model created from a set of site photographs. After a 3D point cloud model of the construction site is generated, field personnel can use mobile devices to take pictures of building elements and be presented on-site with a detailed list of project information related to the visible construction elements in an augmented reality format. The experimental results show that (1) the underlying 3D reconstruction module of the system generates more complete 3D point cloud models, and does so faster, than other state-of-the-art Structure-from-Motion (SfM) algorithms; and (2) the localization method is an order of magnitude more accurate than state-of-the-art solutions and can satisfy the tolerances of most on-site engineering applications. Using an actual construction case study, the perceived benefits and limitations of the proposed method for on-site context-aware applications are discussed in detail.
AB - This paper presents a new context-aware mobile augmented reality system that provides rapid and robust on-site access to construction project information such as drawings, specifications, schedules, and budgets. The mobile augmented reality system does not need any RF-based location tracking (e.g., GPS or WLAN) or optical fiducial markers to track a user's position. Rather, the user's location and orientation are derived automatically and solely by comparing photographs from the user's phone to a 3D point cloud model created from a set of site photographs. After a 3D point cloud model of the construction site is generated, field personnel can use mobile devices to take pictures of building elements and be presented on-site with a detailed list of project information related to the visible construction elements in an augmented reality format. The experimental results show that (1) the underlying 3D reconstruction module of the system generates more complete 3D point cloud models, and does so faster, than other state-of-the-art Structure-from-Motion (SfM) algorithms; and (2) the localization method is an order of magnitude more accurate than state-of-the-art solutions and can satisfy the tolerances of most on-site engineering applications. Using an actual construction case study, the perceived benefits and limitations of the proposed method for on-site context-aware applications are discussed in detail.
UR - http://www.scopus.com/inward/record.url?scp=84887389029&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84887389029&partnerID=8YFLogxK
U2 - 10.1061/9780784413029.080
DO - 10.1061/9780784413029.080
M3 - Conference contribution
AN - SCOPUS:84887389029
SN - 9780784477908
T3 - Computing in Civil Engineering - Proceedings of the 2013 ASCE International Workshop on Computing in Civil Engineering
SP - 637
EP - 644
BT - Computing in Civil Engineering - Proceedings of the 2013 ASCE International Workshop on Computing in Civil Engineering
PB - American Society of Civil Engineers
T2 - 2013 ASCE International Workshop on Computing in Civil Engineering, IWCCE 2013
Y2 - 23 June 2013 through 25 June 2013
ER -