TY - JOUR
T1 - CROW
T2 - A Self-Supervised Crop Row Navigation Algorithm for Agricultural Fields
AU - Affonso, Francisco
AU - Tommaselli, Felipe Andrade G.
AU - Capezzuto, Gianluca
AU - Gasparino, Mateus V.
AU - Chowdhary, Girish
AU - Becker, Marcelo
N1 - This work was supported by São Paulo Research Foundation (FAPESP) grants no. 2022/08330-9, 2022/03339-8, and 2022/08433-2, and by the Brazilian National Research Council (CNPq) grant no. 308092/2020-1.
PY - 2025/3
Y1 - 2025/3
N2 - Compact robots operating beneath the crop canopy hold great potential for a range of autonomous and remote tasks, including phenotyping, soil analysis, and cover cropping. Under-canopy navigation poses unique challenges: the navigation system must traverse diverse crop types, operate despite sensory obstructions, and manage sensory noise effectively. To solve this problem in a scalable manner, we present a novel navigation method for mobile robots that uses a self-supervised neural network tailored for row following in under-canopy plantations. Our method, termed CROW (Crop-ROW navigation), integrates perception, waypoint generation, and control components, and handles variations in luminosity, topology, plantation types, and plant growth stages. Using a Deep Learning-based approach to interpret LiDAR scans, we convert the detected crop rows into lines and establish waypoints for the controller based on fundamental geometric principles. To address the computational complexity inherent in standard Model Predictive Control solvers, we employ a Constrained Iterative Linear Quadratic Regulator approach. Our system has been validated in both simulated and real-world environments, demonstrating successful navigation through 115-meter corn rows with minimal intervention, requiring only 3±3 interventions per row experiment.
AB - Compact robots operating beneath the crop canopy hold great potential for a range of autonomous and remote tasks, including phenotyping, soil analysis, and cover cropping. Under-canopy navigation poses unique challenges: the navigation system must traverse diverse crop types, operate despite sensory obstructions, and manage sensory noise effectively. To solve this problem in a scalable manner, we present a novel navigation method for mobile robots that uses a self-supervised neural network tailored for row following in under-canopy plantations. Our method, termed CROW (Crop-ROW navigation), integrates perception, waypoint generation, and control components, and handles variations in luminosity, topology, plantation types, and plant growth stages. Using a Deep Learning-based approach to interpret LiDAR scans, we convert the detected crop rows into lines and establish waypoints for the controller based on fundamental geometric principles. To address the computational complexity inherent in standard Model Predictive Control solvers, we employ a Constrained Iterative Linear Quadratic Regulator approach. Our system has been validated in both simulated and real-world environments, demonstrating successful navigation through 115-meter corn rows with minimal intervention, requiring only 3±3 interventions per row experiment.
KW - Deep Learning
KW - Mobile robotics
KW - Navigation
KW - Optimal Control
UR - http://www.scopus.com/inward/record.url?scp=85219750739&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85219750739&partnerID=8YFLogxK
U2 - 10.1007/s10846-025-02219-2
DO - 10.1007/s10846-025-02219-2
M3 - Article
AN - SCOPUS:85219750739
SN - 0921-0296
VL - 111
JO - Journal of Intelligent and Robotic Systems: Theory and Applications
JF - Journal of Intelligent and Robotic Systems: Theory and Applications
IS - 1
M1 - 28
ER -