TY - GEN
T1 - Joint GPS and vision direct position estimation
AU - Ng, Yuting
AU - Gao, Grace Xingxin
N1 - Publisher Copyright: © 2016 IEEE.
PY - 2016/5/26
Y1 - 2016/5/26
AB - GPS and vision position sensing are complementary. In open-sky environments, GPS sensing is superior because strong signals are received unimpeded, while vision sensing suffers from a lack of unique features. In urban settings, vision sensing becomes superior thanks to an abundance of unique, characterizing visual features, while GPS performance is degraded by signal obstruction and multipath. To better leverage the complementary nature of the two sensing modes, we propose Joint GPS and Vision Direct Positioning (GPS+V DP). GPS+V DP integrates the two sensing modes by directly estimating the receiver position from the entire raw GPS signal and from vision features extracted from the raw camera image. GPS+V DP consists of two synchronized lines of processing: GPS Direct Positioning (GPS DP) and Vision Direct Positioning (Vision DP). GPS DP searches for the composite signal replica that yields the highest correlation with the observed GPS signal; this best-matched replica is most likely generated by the optimal receiver parameters of 3D position, clock bias, 3D velocity, and clock drift. Vision DP searches for the geo-tagged reference image that yields the lowest composite feature distance to the observed image; this best-matched reference image is most likely generated by the optimal camera latitude, longitude, heading, and tilt. The measurements from GPS DP and Vision DP are concatenated and used to directly estimate and track the sensors' position parameters. We implemented GPS+V DP using our research platform (PyGNSS) and an open-source computer vision library (OpenCV), and tested the receiver architecture on experimental data collected on campus. The experimental results demonstrate the functionality of the algorithm.
UR - http://www.scopus.com/inward/record.url?scp=84978501323&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84978501323&partnerID=8YFLogxK
DO - 10.1109/PLANS.2016.7479724
M3 - Conference contribution
AN - SCOPUS:84978501323
T3 - Proceedings of the IEEE/ION Position, Location and Navigation Symposium, PLANS 2016
SP - 380
EP - 385
BT - Proceedings of the IEEE/ION Position, Location and Navigation Symposium, PLANS 2016
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - IEEE/ION Position, Location and Navigation Symposium, PLANS 2016
Y2 - 11 April 2016 through 14 April 2016
ER -