Abstract
In this paper, we present a novel method for classifying relevant points in a sequence of images of a distant target in order to autonomously guide an underwater vehicle towards it. Feature points are classified using a measure called motion perceptibility, which relates the magnitudes of the rate of change between matched feature points in different image frames, thus inherently accounting for the change in each feature's position. This measure helps detect which feature points are most likely to leave the camera's field of view, indicating that they do not belong to the target region. Using a visual attention model adapted to underwater images, relevant points are detected and then tracked with a visual servoing approach. Preliminary results from sea trials demonstrate the feasibility of our methodology.
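As a rough illustration of the idea described in the abstract (not the authors' actual formulation), the minimal Python sketch below scores matched feature points by their pixel-space rate of change between two frames and flags fast-moving points near the image border as likely to exit the field of view. All function names, the border margin, and the speed threshold are assumptions made for illustration only.

```python
import numpy as np

def perceptibility_scores(pts_prev, pts_curr, dt=1.0):
    """Rate-of-change magnitude for each matched feature pair (hypothetical helper).

    pts_prev, pts_curr: (N, 2) arrays of matched feature coordinates in pixels.
    dt: time elapsed between the two frames, in seconds.
    """
    disp = pts_curr - pts_prev                 # per-feature displacement in the image plane
    return np.linalg.norm(disp, axis=1) / dt   # pixel speed as a simple perceptibility proxy

def likely_to_leave_fov(pts_curr, scores, img_shape, margin=20, speed_thresh=50.0):
    """Flag features that move fast and lie near the image border, i.e. candidates
    that probably do not belong to the distant, slowly moving target region."""
    h, w = img_shape[:2]
    near_border = (
        (pts_curr[:, 0] < margin) | (pts_curr[:, 0] > w - margin) |
        (pts_curr[:, 1] < margin) | (pts_curr[:, 1] > h - margin)
    )
    return near_border & (scores > speed_thresh)

# Hypothetical usage with matched keypoints from any feature tracker:
# scores = perceptibility_scores(prev_pts, curr_pts, dt=1.0 / fps)
# drop = likely_to_leave_fov(curr_pts, scores, frame.shape)
# target_pts = curr_pts[~drop]   # keep points presumed to belong to the target region
```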
Original language | English (US) |
---|---|
Title of host publication | OCEANS 2015 - MTS/IEEE Washington |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Electronic) | 9780933957435 |
State | Published - Feb 8 2016 |
Event | MTS/IEEE Washington, OCEANS 2015 - Washington, United States; Duration: Oct 19 2015 → Oct 22 2015 |
ASJC Scopus subject areas
- Signal Processing
- Oceanography
- Ocean Engineering
- Instrumentation
- Acoustics and Ultrasonics