TY - GEN
T1 - Learned Visual Navigation for Under-Canopy Agricultural Robots
AU - Sivakumar, Arun Narenthiran
AU - Modi, Sahil
AU - Gasparino, Mateus Valverde
AU - Ellis, Che
AU - Velasquez, Andres Eduardo Baquero
AU - Chowdhary, Girish
AU - Gupta, Saurabh
N1 - Funding Information:
This paper was supported in part by NSF STTR #1820332, USDA/NSF CPS project #2018-67007-28379, USDA/NSF AIFARMS National AI Institute USDA #2020-67021-32799/project accession no. 1024178, NSF IIS #2007035, and DARPA Machine Common Sense. We thank Earthsense Inc. for the robots used in this work and we thank the Department of Agricultural and Biological Engineering and Center for Digital Agriculture (CDA) at UIUC for the Illinois Autonomous Farm (IAF) facility used for data collection and field validation of CropFollow. We thank Vitor Akihiro H. Higuti and Sri Theja Vuppala for their help in integration of CropFollow on the robot and field validation.
Publisher Copyright:
© 2021, MIT Press Journals, All rights reserved.
PY - 2021
Y1 - 2021
N2 - This paper describes a system for visually guided autonomous navigation of under-canopy farm robots. Low-cost under-canopy robots can drive between crop rows under the plant canopy and accomplish tasks that are infeasible for over-the-canopy drones or larger agricultural equipment. However, autonomously navigating them under the canopy presents a number of challenges: unreliable GPS and LiDAR, high cost of sensing, challenging farm terrain, clutter due to leaves and weeds, and large variability in appearance over the season and across crop types. We address these challenges by building a modular system that leverages machine learning for robust and generalizable perception from monocular RGB images from low-cost cameras, and model predictive control for accurate control in challenging terrain. Our system, CropFollow, is able to autonomously drive 485 meters per intervention on average, outperforming a state-of-the-art LiDAR-based system (286 meters per intervention) in extensive field testing spanning over 25 km.
AB - This paper describes a system for visually guided autonomous navigation of under-canopy farm robots. Low-cost under-canopy robots can drive between crop rows under the plant canopy and accomplish tasks that are infeasible for over-the-canopy drones or larger agricultural equipment. However, autonomously navigating them under the canopy presents a number of challenges: unreliable GPS and LiDAR, high cost of sensing, challenging farm terrain, clutter due to leaves and weeds, and large variability in appearance over the season and across crop types. We address these challenges by building a modular system that leverages machine learning for robust and generalizable perception from monocular RGB images from low-cost cameras, and model predictive control for accurate control in challenging terrain. Our system, CropFollow, is able to autonomously drive 485 meters per intervention on average, outperforming a state-of-the-art LiDAR-based system (286 meters per intervention) in extensive field testing spanning over 25 km.
UR - http://www.scopus.com/inward/record.url?scp=85119565753&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85119565753&partnerID=8YFLogxK
U2 - 10.15607/RSS.2021.XVII.019
DO - 10.15607/RSS.2021.XVII.019
M3 - Conference contribution
AN - SCOPUS:85119565753
SN - 9780992374778
T3 - Robotics: Science and Systems
BT - Robotics: Science and Systems
A2 - Shell, Dylan A.
A2 - Toussaint, Marc
A2 - Hsieh, M. Ani
PB - MIT Press Journals
T2 - 17th Robotics: Science and Systems, RSS 2021
Y2 - 12 July 2021 through 16 July 2021
ER -