Learned Visual Navigation for Under-Canopy Agricultural Robots

Arun Narenthiran Sivakumar, Sahil Modi, Mateus Valverde Gasparino, Che Ellis, Andres Eduardo Baquero Velasquez, Girish Chowdhary, Saurabh Gupta

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper describes a system for visually guided autonomous navigation of under-canopy farm robots. Low-cost under-canopy robots can drive between crop rows under the plant canopy and accomplish tasks that are infeasible for over-the-canopy drones or larger agricultural equipment. However, autonomously navigating them under the canopy presents a number of challenges: unreliable GPS and LiDAR, high cost of sensing, challenging farm terrain, clutter due to leaves and weeds, and large variability in appearance over the season and across crop types. We address these challenges by building a modular system that leverages machine learning for robust and generalizable perception from monocular RGB images from low-cost cameras, and model predictive control for accurate control in challenging terrain. Our system, CropFollow, is able to autonomously drive 485 meters per intervention on average, outperforming a state-of-the-art LiDAR-based system (286 meters per intervention) in extensive field testing spanning over 25 km.

Original language: English (US)
Title of host publication: Robotics
Subtitle of host publication: Science and Systems XVII
Editors: Dylan A. Shell, Marc Toussaint, M. Ani Hsieh
Publisher: MIT Press Journals
ISBN (Print): 9780992374778
State: Published - 2021
Event: 17th Robotics: Science and Systems, RSS 2021 - Virtual, Online
Duration: Jul 12, 2021 - Jul 16, 2021

Publication series

Name: Robotics: Science and Systems
ISSN (Electronic): 2330-765X

Conference

Conference: 17th Robotics: Science and Systems, RSS 2021
City: Virtual, Online
Period: 7/12/21 - 7/16/21

ASJC Scopus subject areas

  • Artificial Intelligence
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
