TY - GEN
T1 - Sensor Localization by Few Distance Measurements via the Intersection of Implicit Manifolds
AU - Bilevich, Michael M.
AU - LaValle, Steven M.
AU - Halperin, Dan
N1 - Funding Information:
Center for Ubiquitous Computing, Faculty of Information Technology and Electrical Engineering, University of Oulu, Finland. Work by S.L. has been supported by a European Research Council Advanced Grant (ERC AdG, ILLUSIVE, 101020977) and the Academy of Finland (PERCEPT 322637).
Funding Information:
Blavatnik School of Computer Science, Tel-Aviv University, Israel. Work by M.B. and D.H. has been supported in part by the Israel Science Foundation (grant no. 1736/19), by NSF/US-Israel-BSF (grant no. 2019754), by the Israel Ministry of Science and Technology (grant no. 103129), by the Blavatnik Computer Science Research Fund, and by the Yandex Machine Learning Initiative for Machine Learning at Tel Aviv University.
Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - We present a general approach for determining the unknown (or uncertain) position and orientation of a sensor mounted on a robot in a known environment, using only a few distance measurements (typically between two and six), which is advantageous in, among other respects, sensor cost, storage, and communication resources. In between measurements, the robot can perform predetermined local motions in its workspace, which help narrow down the candidate poses of the sensor. We demonstrate our approach for planar workspaces and show that, under mild transversality assumptions, two measurements already suffice to reduce the set of possible poses to a collection of curves (one-dimensional objects) in the three-dimensional configuration space of the sensor, R^2 × S^1, and three or more measurements reduce the set of possible poses to a finite collection of points. However, analytically computing these potential poses for non-trivial intermediate motions between measurements raises substantial hardships, and we thus resort to numerical approximation. We reduce the localization problem to a carefully tailored procedure of intersecting two or more implicitly defined two-manifolds, which we carry out to any desired accuracy, proving guarantees on the quality of the approximation. We demonstrate the real-time effectiveness of our method, even at high accuracy, on various scenarios and with different allowable intermediate motions. We also present experiments with a physical robot. Our open-source software and supplementary materials are available at https://bitbucket.org/taucgl/vb-fdml-public.
UR - http://www.scopus.com/inward/record.url?scp=85168709121&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85168709121&partnerID=8YFLogxK
DO - 10.1109/ICRA48891.2023.10160553
M3 - Conference contribution
AN - SCOPUS:85168709121
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 1912
EP - 1918
BT - Proceedings - ICRA 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE International Conference on Robotics and Automation, ICRA 2023
Y2 - 29 May 2023 through 2 June 2023
ER -