Degree Type

Thesis

Date of Award

2020

Degree Name

Doctor of Philosophy

Department

Agricultural and Biosystems Engineering

Major

Agricultural and Biosystems Engineering

First Advisor

Lie Tang

Abstract

Field-based phenotyping relies heavily on in-field manual measurements, which are labor-intensive, repetitive, and time-consuming. With the rapid advancement of robotic technology, automated in-field phenotyping can significantly increase data throughput and reduce labor demand. A robotic mobile platform, PhenoBot 3.0, was designed by our research group to traverse between crop rows and acquire phenotypic data automatically. However, field-based navigation control is a critical and challenging task due to the complex, unstructured or semi-structured environment in the field. This dissertation documents our investigation of a field-based navigation control system for an agricultural field robotic vehicle. Several functional modules were developed and implemented for the system: a motion control module based on the robot kinematic model, a robot localization module using a single RTK-GPS receiver, a path tracking module running different tracking algorithms, and a computer vision-based row mapping and in-field localization module using different sensor setups. Path tracking based on GPS localization is the most common navigation strategy for agricultural robotic vehicles. Three path tracking algorithms were implemented: Linear-Quadratic Regulator (LQR), Pure Pursuit Control (PPC), and Timed Elastic Band (TEB). The performance of the proposed navigation control system was assessed on our PhenoBot 3.0 platform under both simulated and real field conditions. Satisfactory accuracies in terms of mean absolute tracking error (MATE) were achieved with the LQR controller in both simulation and field tests. The results showed that the proposed navigation control system is capable of guiding the PhenoBot 3.0 robot along predefined paths to traverse between crop rows on uneven terrain.
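The dissertation itself does not include code on this page; as an illustrative aside, the Pure Pursuit control law mentioned above can be sketched in a few lines. The function below computes the curvature command toward a lookahead point on the reference path; the function name, interface, and parameters are assumptions for illustration, not the dissertation's implementation.

```python
import math

def pure_pursuit_curvature(pose, goal, lookahead):
    """Pure Pursuit steering: curvature kappa = 2*sin(alpha)/L_d,
    where alpha is the angle from the robot's heading to the
    lookahead point and L_d is the lookahead distance.
    pose = (x, y, heading_rad); goal = (gx, gy) on the path."""
    x, y, theta = pose
    gx, gy = goal
    # Angle to the lookahead point, expressed in the robot frame
    alpha = math.atan2(gy - y, gx - x) - theta
    # Positive curvature steers left, negative steers right
    return 2.0 * math.sin(alpha) / lookahead
```

A goal directly ahead yields zero curvature (drive straight), while a goal off to one side yields a proportionally sharper turn; the lookahead distance trades off tracking accuracy against smoothness.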
For situations where global localization is denied or a predefined path is not available, computer vision was applied to detect the crop rows in order to locate the robot, create field maps, and navigate the robot through row guidance. A vision-based system using a Time-of-Flight (ToF) camera was developed for under-canopy navigation, specifically for mapping crop rows and localizing the robot under the crop canopy. The potential and limitations of using ToF cameras for under-canopy navigation were investigated through field tests. Because agronomically spaced crop rows are parallel and periodic, they produce distinctive features in the frequency domain, so the Discrete Fourier Transform (DFT) can potentially be used to solve the crop row detection problem for robot navigation in agriculture. A novel image processing pipeline was developed to detect crop rows from top-view color images using frequency-domain analysis. A Linear-Quadratic Gaussian (LQG) controller was used with the proposed algorithm for robot navigation between crop rows. The field tests showed that the proposed crop row detection algorithm was capable of detecting crop rows with plants at different growth stages and under variable illumination conditions, and that the algorithm is feasible for navigation control using a row-tracking strategy.
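The core frequency-domain idea can be illustrated with a minimal sketch: because parallel crop rows are periodic across the image, the row spacing appears as a dominant peak in the Fourier spectrum of a vegetation projection profile. The code below is a simplified one-dimensional illustration under assumed inputs (a binary top-view vegetation mask with vertical rows), not the dissertation's actual pipeline, which operates on color images and is considerably more elaborate.

```python
import numpy as np

def dominant_row_spacing(mask):
    """Estimate crop row spacing (in pixels) from a top-view binary
    vegetation mask with roughly vertical rows, via the dominant
    spatial frequency of the column-wise vegetation profile."""
    # Project vegetation pixels onto the axis perpendicular to the rows
    profile = mask.sum(axis=0).astype(float)
    profile -= profile.mean()             # remove the DC component
    spectrum = np.abs(np.fft.rfft(profile))
    k = int(np.argmax(spectrum[1:])) + 1  # strongest nonzero frequency bin
    return len(profile) / k               # period in pixels = N / k
```

On a synthetic mask with rows every 40 pixels, the function recovers a 40-pixel spacing; a periodicity-based estimate like this is inherently robust to missing plants and uneven illumination, which is one motivation for the frequency-domain approach.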

DOI

https://doi.org/10.31274/etd-20210114-50

Copyright Owner

Jingyao Gai

Language

en

File Format

application/pdf

File Size

126 pages

Available for download on Wednesday, July 07, 2021
