TY - GEN
T1 - WayFASTER
T2 - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
AU - Gasparino, Mateus V.
AU - Sivakumar, Arun N.
AU - Chowdhary, Girish
N1 - This work was supported in part by NIFA #2021-67021-33449, NIFA #2021-67021-34418, and AFRI grant #2020-67021-32799/project accession no. 1024178, NSF/USDA National AI Institute AIFARMS.
PY - 2024
Y1 - 2024
N2 - Accurate and robust navigation in unstructured environments requires fusing data from multiple sensors. Such fusion ensures that the robot is better aware of its surroundings, including areas of the environment that are not immediately visible but were visible at a different time. To solve this problem, we propose a method for traversability prediction in challenging outdoor environments using a sequence of RGB and depth images fused with pose estimations. Our method, termed WayFASTER (Waypoints-Free Autonomous System for Traversability with Enhanced Robustness), uses experience data recorded from a receding horizon estimator to train a self-supervised neural network for traversability prediction, eliminating the need for heuristics. Our experiments demonstrate that our method excels at avoiding obstacles and correctly identifies terrains such as tall grass as navigable. By using a sequence of images, WayFASTER significantly enhances the robot's awareness of its surroundings, enabling it to predict the traversability of terrains that are not immediately visible. This enhanced awareness contributes to better navigation performance in environments where such predictive capabilities are essential.
UR - http://www.scopus.com/inward/record.url?scp=85202441976&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85202441976&partnerID=8YFLogxK
U2 - 10.1109/ICRA57147.2024.10610436
DO - 10.1109/ICRA57147.2024.10610436
M3 - Conference contribution
AN - SCOPUS:85202441976
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 8486
EP - 8492
BT - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 13 May 2024 through 17 May 2024
ER -