Automatic recognition and localization of underground pipelines in GPR B-scans using a deep learning model

Hai Liu, Yunpeng Yue, Chao Liu, B. F. Spencer, Jie Cui

Research output: Contribution to journal › Article › peer-review

Abstract

Ground penetrating radar (GPR) is a popular non-destructive method for detecting and locating underground pipelines. However, manual interpretation of large numbers of GPR B-scan images is time-consuming, and the results depend heavily on the practitioner's experience and the a priori information at hand. In this paper, an automatic GPR method for recognition and localization of underground pipelines is proposed based on a deep learning model. Firstly, a dataset containing 3,824 real GPR B-scans of pipelines is established. Secondly, a You Only Look Once version 3 (YOLOv3) model is trained to recognize the regions of underground pipelines in a GPR image. Thirdly, the hyperbolic response of a pipeline is focused by migration and transformed into a binary image by an iterative thresholding method. Finally, the apex of the hyperbola is employed to estimate both the horizontal position and the buried depth of the pipeline. Field experiments validated that the absolute errors of the estimated depths are less than 0.04 m and that the average relative error is lower than 4%. It is demonstrated that the proposed method is automatic, fast, and reliable for recognition and localization of underground pipelines in urban areas.
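The last two steps of the pipeline can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes an isodata-style iterative threshold (one common variant of iterative thresholding) and a homogeneous soil with an assumed relative permittivity `eps_r`, so that the apex's two-way travel time converts to depth via the wave velocity in the ground.

```python
import numpy as np

C = 3e8  # speed of light in vacuum (m/s)

def iterative_threshold(img, tol=0.5):
    """Isodata-style iterative thresholding (assumed variant of the
    paper's method): start from the global mean, then repeatedly set
    the threshold to the midpoint of the two class means until it
    converges. Returns a binary (0/1) image."""
    t = img.mean()
    while True:
        low, high = img[img <= t], img[img > t]
        t_new = 0.5 * (low.mean() + high.mean())
        if abs(t_new - t) < tol:
            break
        t = t_new
    return (img > t).astype(np.uint8)

def apex_to_position(trace_index, sample_index, dx, dt, eps_r):
    """Map a hyperbola apex (trace index, time-sample index) to a
    horizontal position and burial depth.

    dx    : trace spacing along the survey line (m)  -- assumed known
    dt    : time-sample interval (s)                 -- assumed known
    eps_r : relative permittivity of the soil        -- assumed known
    """
    v = C / np.sqrt(eps_r)               # wave velocity in the ground
    x = trace_index * dx                 # horizontal position (m)
    depth = v * sample_index * dt / 2.0  # halve the two-way travel time
    return x, depth
```

For example, with a trace spacing of 0.02 m, a 1 ns sample interval, and `eps_r = 9` (v = 1e8 m/s), an apex at trace 50 and sample 20 maps to x = 1.0 m and a depth of 1.0 m.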

Original language: English (US)
Article number: 104861
Journal: Tunnelling and Underground Space Technology
Volume: 134
State: Published - Apr 2023

Keywords

  • Deep learning
  • Ground penetrating radar (GPR)
  • Localization
  • Non-destructive testing (NDT)
  • Underground pipeline

ASJC Scopus subject areas

  • Building and Construction
  • Geotechnical Engineering and Engineering Geology
