TY - JOUR
T1 - High precision control and deep learning-based corn stand counting algorithms for agricultural robot
AU - Zhang, Zhongzhong
AU - Kayacan, Erkan
AU - Thompson, Benjamin
AU - Chowdhary, Girish
N1 - Funding Information:
The information, data, or work presented herein was funded in part by the Advanced Research Projects Agency-Energy (ARPA-E), U.S. Department of Energy, under Award Number DE-AR0000598. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof. Erkan Kayacan and Zhongzhong Zhang made equal contributions to this work. Dr. Kayacan was a postdoctoral researcher in Chowdhary's group when the bulk of this work was undertaken. He has been a Lecturer at the University of Queensland since April 2019. Any questions or comments should be directed to Girish Chowdhary. We thank the UIUC-IGB TERRA-MEPP team and EarthSense Inc. for valuable suggestions.
Publisher Copyright:
© 2020, Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2020/9/1
Y1 - 2020/9/1
N2 - This paper presents high precision control and deep learning-based corn stand counting algorithms for a low-cost, ultra-compact, 3D-printed, autonomous field robot for agricultural operations. Currently, plant traits such as emergence rate, biomass, vigor, and stand count are measured manually, which is highly labor-intensive and prone to errors. The robot, termed TerraSentia, is designed to automate the measurement of plant traits for efficient phenotyping as an alternative to manual measurements. In this paper, we formulate a Nonlinear Moving Horizon Estimator that identifies key terrain parameters using onboard robot sensors and a learning-based Nonlinear Model Predictive Control that ensures high precision path tracking in the presence of unknown wheel-terrain interaction. Moreover, we develop a machine vision algorithm that enables an ultra-compact ground robot to count corn stands by driving through the fields autonomously. The algorithm leverages a deep network to detect corn plants in images and a visual tracking model to re-identify detected objects at different time steps. We collected data from 53 corn plots in various fields for corn plants around 14 days after emergence (stage V3-V4). The robot predictions agreed well with the ground truth, with C_robot = 1.02 × C_human - 0.86 and a correlation coefficient R = 0.96. The mean relative error given by the algorithm is -3.78%, and the standard deviation is 6.76%. These results indicate a first and significant step towards autonomous robot-based real-time phenotyping using low-cost, ultra-compact ground robots for corn and potentially other crops.
AB - This paper presents high precision control and deep learning-based corn stand counting algorithms for a low-cost, ultra-compact, 3D-printed, autonomous field robot for agricultural operations. Currently, plant traits such as emergence rate, biomass, vigor, and stand count are measured manually, which is highly labor-intensive and prone to errors. The robot, termed TerraSentia, is designed to automate the measurement of plant traits for efficient phenotyping as an alternative to manual measurements. In this paper, we formulate a Nonlinear Moving Horizon Estimator that identifies key terrain parameters using onboard robot sensors and a learning-based Nonlinear Model Predictive Control that ensures high precision path tracking in the presence of unknown wheel-terrain interaction. Moreover, we develop a machine vision algorithm that enables an ultra-compact ground robot to count corn stands by driving through the fields autonomously. The algorithm leverages a deep network to detect corn plants in images and a visual tracking model to re-identify detected objects at different time steps. We collected data from 53 corn plots in various fields for corn plants around 14 days after emergence (stage V3-V4). The robot predictions agreed well with the ground truth, with C_robot = 1.02 × C_human - 0.86 and a correlation coefficient R = 0.96. The mean relative error given by the algorithm is -3.78%, and the standard deviation is 6.76%. These results indicate a first and significant step towards autonomous robot-based real-time phenotyping using low-cost, ultra-compact ground robots for corn and potentially other crops.
KW - Agricultural robotics
KW - Corn stand counting
KW - Deep learning
KW - Field robot
KW - High precision control
KW - Machine learning
UR - http://www.scopus.com/inward/record.url?scp=85088980162&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85088980162&partnerID=8YFLogxK
U2 - 10.1007/s10514-020-09915-y
DO - 10.1007/s10514-020-09915-y
M3 - Article
AN - SCOPUS:85088980162
SN - 0929-5593
VL - 44
SP - 1289
EP - 1302
JO - Autonomous Robots
JF - Autonomous Robots
IS - 7
ER -