Visual–inertial displacement sensing using data fusion of vision-based displacement with acceleration

Jong Woong Park, Do Soo Moon, Hyungchul Yoon, Fernando Gomez, Billie F. Spencer, Jong R. Kim

Research output: Contribution to journal › Article › peer-review


In recognition of the importance of displacement for assessing structural condition, many displacement measurement methods have been proposed to date. With advances in optics and electronics, displacement measurement relying on computer-vision techniques to convert pixel movement into structural displacement has recently drawn much attention, thanks to its simple installation and relatively low cost. Despite these advantages, two major obstacles limit the use of vision-based methods: (a) resolution, which degrades with the distance between the camera and the structure, and (b) limited frame rate, both of which reduce the ability to capture dynamic displacement. In this paper, to enhance the quality of vision-based displacement measurement, data fusion with acceleration measurements is proposed to extend the dynamic range of the measured displacement while lowering signal noise. To fuse the vision-based displacement with the acceleration, complementary filters and a method for time synchronization between the two sensing sources are proposed. The proposed methods were verified through numerical analysis and an experimental test, the results of which demonstrate the validity of the proposed data fusion.
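The core idea described above, combining vision-based displacement (reliable at low frequency) with twice-integrated acceleration (reliable at high frequency) through a complementary filter pair, can be sketched as follows. This is an illustrative sketch only: the function name `fuse_displacement`, the first-order filter design, and the 1 Hz crossover frequency are assumptions for demonstration, not the paper's actual filter design, which the abstract does not detail.

```python
import numpy as np

def _cumtrapz(y, dt):
    """Cumulative trapezoidal integration starting from zero."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum((y[1:] + y[:-1]) * 0.5) * dt
    return out

def fuse_displacement(disp, accel, dt, fc=1.0):
    """Complementary-filter fusion (illustrative, not the paper's design).

    disp  : vision-based displacement samples (accurate at low frequency)
    accel : acceleration samples at the same rate (accurate at high frequency)
    dt    : sampling interval in seconds
    fc    : crossover frequency in Hz (assumed value)
    """
    # Double-integrate acceleration; the result tracks high-frequency
    # displacement well but drifts at low frequency.
    disp_acc = _cumtrapz(_cumtrapz(accel, dt), dt)

    # First-order complementary pair: LP(z) + HP(z) = 1 exactly, so the
    # two branches together reconstruct the full frequency band.
    alpha = 2.0 * np.pi * fc * dt / (1.0 + 2.0 * np.pi * fc * dt)

    fused = np.empty_like(disp)
    lp = disp[0]          # low-pass state, seeded with the first sample
    hp = 0.0              # high-pass state
    prev = disp_acc[0]
    for i in range(len(disp)):
        lp = lp + alpha * (disp[i] - lp)                 # trust vision at low freq.
        hp = (1.0 - alpha) * (hp + disp_acc[i] - prev)   # trust accel at high freq.
        prev = disp_acc[i]
        fused[i] = lp + hp
    return fused
```

Because the low-pass and high-pass branches sum to unity, the high-pass branch rejects the drift and constant offset introduced by double integration, while the low-pass branch suppresses high-frequency noise in the vision measurement; this sketch assumes both signals are already time-synchronized and sampled at a common rate, which is part of what the paper's synchronization method addresses.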

Original language: English (US)
Article number: e2122
Journal: Structural Control and Health Monitoring
Issue number: 3
State: Published - Mar 2018


Keywords

  • complementary filter
  • computer vision
  • displacement measurement
  • sensor fusion
  • structural health monitoring

ASJC Scopus subject areas

  • Civil and Structural Engineering
  • Building and Construction
  • Mechanics of Materials


