Development of a mobile robotic phenotyping system for growth chamber-based studies of genotype x environment interactions

Date
2016-01-01
Authors
Shah, Dylan
Major Professor
Lie Tang
Department
Agricultural and Biosystems Engineering
Abstract

In order to fully understand how genotype x environment interactions shape phenotype, and thereby improve crop performance, a large amount of phenotypic data is needed. Studying plants of a given strain under multiple environments can greatly help reveal these interactions. This thesis presents two key portions of the development of the Enviratron rover, a robotic system that aims to autonomously collect the labor-intensive data required to perform experiments in this area. The rover is part of a larger project that will track plant growth in multiple environments.

The first aspect of the robot discussed in this thesis is the system hardware and the main, or whole-chamber, imaging system. Semi-autonomous behavior has been achieved, and the system’s performance in probing leaves is quantified and discussed. In contrast to existing systems, the rover can follow magnetic tape in all four directions (front, left, back, right), and it uses a Microsoft Kinect V2 mounted on the end-effector of a robotic arm to position a threaded rod, which simulates future sensors such as a fluorimeter and a Raman spectrometer, at a desired position and orientation. Advantages of tape following include reliable movement both between chambers and within a chamber regardless of dust and lighting conditions. The robot arm and Kinect system is unique in combining speed in reconstructing a (filtered) environment with accuracy in positioning sensors. A comparison between using raw camera-coordinate data and using KinectFusion data is presented. The results suggest that the KinectFusion pose estimation is fairly accurate, reducing accuracy by only a few millimeters at distances of roughly 0.8 m. The system can consistently position sensors to within 4 cm of the goal, and often within 3 cm. The system is shown to be accurate enough to position sensors to within ±9 degrees of a desired orientation, although this accuracy currently requires human input to fully utilize the Kinect’s feedback.
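
As a rough illustration of the sensor-positioning step described above, the following Python sketch transforms a target point and surface normal, measured in the frame of the end-effector-mounted Kinect, into the arm's base frame and places the sensor tip at a fixed standoff along the normal. The frame names, the 5 cm standoff, and all function names are illustrative assumptions rather than details taken from the thesis.

    import numpy as np

    def to_base_frame(point_cam, T_base_ee, T_ee_cam):
        """Express a point from the camera frame in the robot base frame:
        p_base = T_base_ee @ T_ee_cam @ p_cam (4x4 homogeneous transforms)."""
        p = np.append(np.asarray(point_cam, dtype=float), 1.0)  # homogeneous coordinates
        return (T_base_ee @ T_ee_cam @ p)[:3]

    def sensor_goal(leaf_point_cam, leaf_normal_cam, T_base_ee, T_ee_cam, standoff=0.05):
        """Return a goal position and approach axis in the base frame, placing the
        sensor a fixed standoff from the leaf and aiming it along the surface normal."""
        R = (T_base_ee @ T_ee_cam)[:3, :3]                       # camera-to-base rotation
        n_base = R @ np.asarray(leaf_normal_cam, dtype=float)
        n_base /= np.linalg.norm(n_base)
        p_base = to_base_frame(leaf_point_cam, T_base_ee, T_ee_cam)
        return p_base + standoff * n_base, -n_base               # position, pointing direction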

The second aspect of the robot presented in this thesis is a framework for generating collision-free robot arm motion within the chamber. This framework uses feedback from the Kinect sensor and is based on the Probabilistic Roadmap (PRM) technique, which involves building a graph of collision-free nodes and edges and then searching it for an acceptable path. The variant presented uses a dilated, down-sampled KinectFusion reconstruction as input for rapid collision checking, effectively representing the environment as a discretized grid and the robot arm as a collection of spheres. The approach combines many desirable characteristics of previous PRM methods and other collision-avoidance schemes, and is aimed at providing a reliable, rapidly constructed, highly connected roadmap that can be queried multiple times in a static environment, such as a growth chamber or a greenhouse. In a sample plant configuration with several of the most challenging practical goal poses, the framework is shown to create a roadmap in an average time of 32.5 seconds. One key feature is that nodes are added near the goal during each query, increasing accuracy at the expense of longer query times. The completed graph is searched for an optimal path connecting nodes near the starting pose and the desired end pose; the fastest graph search studied was an implementation of the A* algorithm. Queries using this framework took an average of 0.46 seconds. The average distance between the attained pose and the desired location was 2.7 cm, and the average C-space distance between the attained and desired poses was 3.65 degrees.
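
As a minimal sketch of the machinery this framework describes, the Python code below shows a voxel-grid collision check for an arm modeled as spheres, a basic PRM builder, and an A* query over the resulting roadmap. All names and parameters (arm_spheres, occupancy, sample, k_nearest, edge_free, and so on) are illustrative assumptions, not the thesis implementation, and the grid is assumed to be pre-dilated so that one voxel lookup per sphere center suffices.

    import heapq
    import itertools
    import numpy as np

    def in_collision(q, occupancy, voxel_size, origin, arm_spheres):
        """True if any sphere of the arm at configuration q falls in an occupied voxel.
        The occupancy grid is assumed to be pre-dilated by the sphere radii."""
        for center, _radius in arm_spheres(q):            # forward kinematics -> spheres
            idx = np.floor((np.asarray(center) - origin) / voxel_size).astype(int)
            inside = np.all(idx >= 0) and np.all(idx < occupancy.shape)
            if inside and occupancy[tuple(idx)]:
                return True
        return False

    def build_roadmap(sample, is_free, k_nearest, edge_free, n_nodes=500):
        """PRM construction: keep collision-free samples (tuples of joint angles) and
        connect each to its k nearest neighbors via edges that are themselves free."""
        nodes = []
        while len(nodes) < n_nodes:
            q = sample()
            if is_free(q):
                nodes.append(q)
        return {n: [m for m in k_nearest(n, nodes) if m != n and edge_free(n, m)]
                for n in nodes}

    def a_star(graph, cost, heuristic, start, goal):
        """A* search over the roadmap; returns a list of nodes or None if no path exists."""
        tie = itertools.count()                           # tie-breaker avoids comparing nodes
        frontier = [(heuristic(start, goal), next(tie), 0.0, start, None)]
        parent, best_g = {}, {start: 0.0}
        while frontier:
            _f, _t, g, node, prev = heapq.heappop(frontier)
            if node in parent:
                continue                                  # already expanded via a cheaper path
            parent[node] = prev
            if node == goal:
                path = [node]
                while parent[path[-1]] is not None:
                    path.append(parent[path[-1]])
                return path[::-1]
            for nb in graph[node]:
                g_nb = g + cost(node, nb)
                if g_nb < best_g.get(nb, float("inf")):
                    best_g[nb] = g_nb
                    heapq.heappush(frontier,
                                   (g_nb + heuristic(nb, goal), next(tie), g_nb, nb, node))
        return None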

The research suggests that the robotic framework presented has the potential to fulfill the main hardware and motion requirements of an autonomous indoor phenotyping robot and can generate the desired collision-free robot arm motion.

Copyright
2016