Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141401 (2020) https://doi.org/10.1117/12.2572635
This PDF file contains the front matter associated with SPIE Proceedings Volume 11414, including the Title Page, Copyright Information, and Table of Contents.
Access to the requested content is limited to institutions that have purchased or subscribe to SPIE eBooks. You are receiving this notice because your organization may not have SPIE eBooks access.*
*Shibboleth/Open Athens users: please sign in to access your institution's subscriptions. To obtain this item, you may purchase the complete book in print or electronic format on SPIE.org.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141402 (2020) https://doi.org/10.1117/12.2558729
High throughput phenotyping, including remote sensing, is enabling new approaches to both breeding and precision farming techniques that improve agricultural efficiency. Unmanned aerial vehicles (UAVs) are widely employed to collect remote sensing data for high throughput phenotyping. This approach provides high image resolution and rapid data acquisition. However, collecting remote sensing data with UAVs is a labor-intensive process, as a pilot is needed for each flight. As a result, UAV-based approaches face challenges in scaling data collection to large field experiments conducted across multiple geographically remote field sites. Remote sensing data collected from satellites has continued to improve, with current datasets providing sub-meter spatial resolution and revisit times as frequent as once per day. Here, we evaluate the feasibility of employing high resolution satellite imagery for phenotyping small-plot plant breeding and agronomic trials. Vegetation indices derived from satellite imagery were compared to those extracted from a UAV-based multispectral camera and to the yield in a small-plot (approx. 8 sq. m) maize trial. Preliminary results indicate a strong and significant correlation between data derived from satellite and UAV imagery. Satellite-based phenotyping of yield trial plots would enable evaluation of new crop varieties across larger numbers of geographically distinct locations, assisting in the development of more resilient and broadly adapted crops.
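The satellite-versus-UAV comparison described above can be sketched as follows; the per-plot reflectance values are hypothetical stand-ins, and plot-level NDVI from each platform is correlated with a simple Pearson coefficient:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Hypothetical per-plot mean reflectances from satellite and UAV imagery
sat_ndvi = ndvi([0.42, 0.48, 0.39], [0.08, 0.06, 0.10])
uav_ndvi = ndvi([0.45, 0.50, 0.37], [0.07, 0.05, 0.11])

# Pearson correlation between the two platforms' plot-level NDVI
r = np.corrcoef(sat_ndvi, uav_ndvi)[0, 1]
```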
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141403 (2020) https://doi.org/10.1117/12.2560702
Crop assessment through the use of unmanned aerial systems (UAS) has increased over recent years. More farmers and their service providers have access to their own UAS, although advanced sensors and UAS platforms may not have been widely adopted. Severe tropical weather events, such as hurricanes, can have widespread negative impacts on late-season crops. The timely response necessary to detect and quantify crop damage has created the need for farmers and other parties to have a quick, quantifiable method to assess lodging. Most, if not all, UAS users have access to visual-band color imagery. Extracting data from this imagery, as different indices and as a digital elevation model, creates the opportunity to identify metrics that can detect and quantify crop lodging damage. The goal of this study was to compare multiple vegetative indices that can be calculated from RGB imagery for their ability to detect simulated crop damage. Six indices as well as a digital elevation model were extracted from UAS flights conducted over four weeks over a maize field. Lodging was simulated at the root and ear level, with new plots being damaged at each week of treatment. Results indicated that none of the indices or extracted data examined in this study detected significant differences among treatments, so using these metrics on their own for detecting or classifying late-season maize lodging is not advised.
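The abstract does not name the six RGB-derived indices, but one widely used example is the Excess Green index (ExG), computed from normalized RGB chromaticity; a minimal sketch with hypothetical plot-mean digital numbers:

```python
import numpy as np

def excess_green(r, g, b):
    """Excess Green index (ExG = 2g - r - b) from normalized RGB chromaticity."""
    r, g, b = (np.asarray(c, float) for c in (r, g, b))
    total = r + g + b
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

# Hypothetical plot-mean digital numbers for two maize plots
exg = excess_green([90, 60], [140, 80], [70, 70])
```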
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141404 (2020) https://doi.org/10.1117/12.2557686
This paper presents the assessment of lettuce plant health using an unmanned aerial vehicle (UAV)-based hyperspectral sensor, proximal sensors, and measurements of agronomic and physiological parameters. Hyperspectral data of lettuce plants at Cal Poly Pomona’s Spadra Farm were collected from a DJI Matrice 600 multicopter UAV. An experimental lettuce plot was designed for the study. The plot was divided into several subplots that were subject to different water and nitrogen applications with three replications. Proximal sensors included a handheld spectroradiometer, a water potential meter, and a chlorophyll meter. The hyperspectral data from the UAV and spectroradiometer were used to determine several vegetation indices, including the normalized difference vegetation index (NDVI), water band index (WBI), and modified chlorophyll absorption ratio index (MCARI). These indices were compared with chlorophyll meter data, water potential, plant height, leaf numbers, leaf water content, and leaf nitrogen content. With the hyperspectral data collected so far, MCARI has shown good correlation with chlorophyll meter data, and WBI has shown good correlation with leaf water content. The paper will show and discuss all the vegetation indices and their relationships with proximal sensor data, agronomic measurements, and leaf water and nitrogen contents.
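The three named indices have standard narrow-band definitions; a sketch with hypothetical reflectance values for one lettuce subplot (the band assignments follow the common formulations, not necessarily the exact bands used in this study):

```python
def ndvi_nb(r800, r670):
    # Narrow-band NDVI from reflectance at 800 nm (NIR) and 670 nm (red)
    return (r800 - r670) / (r800 + r670)

def wbi(r900, r970):
    # Water Band Index: reflectance at 900 nm over the 970 nm water absorption band
    return r900 / r970

def mcari(r550, r670, r700):
    # Modified Chlorophyll Absorption Ratio Index
    return ((r700 - r670) - 0.2 * (r700 - r550)) * (r700 / r670)

# Hypothetical narrow-band reflectances for one subplot
indices = {
    "NDVI": ndvi_nb(0.48, 0.05),
    "WBI": wbi(0.46, 0.44),
    "MCARI": mcari(0.12, 0.05, 0.15),
}
```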
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141405 (2020) https://doi.org/10.1117/12.2559083
The discrimination between cotton and the invasive Palmer amaranth is economically important, as these weeds take resources away from cotton, resulting in diminished crop yield. There has been research into the discrimination between species of plants, including cotton and Palmer amaranth, that focused on the use of aerial imagery and the derived red, green, and near-infrared (RGN) spectral data fed into a machine-learning algorithm to classify these plants based on the measurable differences in their spectral characteristics. We believe that this research can be expanded upon by using geometric data derived from the aerial imagery to classify cotton and non-cotton plants based on their physical characteristics. This would also allow for accurate geolocation of the classified weeds for later removal. An autonomous drone with a GPS and an RGN camera attached will follow a predetermined path to scan a crop field, and the resulting videos will be divided into individual frames. From these frames, both the RGN spectral data and a 3D point cloud can be derived. The RGN spectral data and the geometric data will be fed into a machine learning algorithm for classification between the cotton and non-cotton plants, and then additional processing will be done to geolocate the weeds. With this additional information for classification, it is hoped that the discrimination between cotton and weeds can be more accurate, and the location of the weeds can be more exact.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141406 (2020) https://doi.org/10.1117/12.2560596
Automating the detection of corn tassels during flowering time is important in corn breeding. To control pollination, after a tassel is visible, the plant should be checked daily for emerging ears. The conventional methods are labor-intensive and time-consuming. In this study, we developed a technique for automatically detecting and locating corn tassels in unmanned aerial vehicle (UAV) imagery with the state-of-the-art Faster Region-based Convolutional Neural Network (Faster R-CNN). Each raw image was divided into 1000 x 1000 pixel sub-images, and 2000 sub-images were manually annotated for tassel locations with bounding boxes as ground-truth data. 80% of the annotated sub-images were used as training data, and the remaining 20% were used for testing. The performance of the trained Faster R-CNN model was evaluated by customized evaluation criteria. The model achieved good performance on tassel detection, with a mean average precision of 91.78% and an F1 score up to 97.98%.
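The tiling step described above can be sketched as follows; the raw frame dimensions are hypothetical, and edge tiles are simply clipped to the image boundary:

```python
def tile_image(width, height, tile=1000):
    """Return (x0, y0, x1, y1) crop boxes covering a raw image in tile x tile blocks."""
    boxes = []
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            # Clip the last row/column of tiles to the image edge
            boxes.append((x, y, min(x + tile, width), min(y + tile, height)))
    return boxes

# A hypothetical 4000 x 3000 raw UAV frame yields 4 x 3 = 12 sub-images
boxes = tile_image(4000, 3000)
```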
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141407 (2020) https://doi.org/10.1117/12.2558249
Ground control points (GCPs) are critical for agricultural remote sensing applications that require georeferencing and calibration of images collected from an unmanned aerial vehicle (UAV) at different times. However, conventional stationary GCPs are time-consuming and labor-intensive to measure, distribute, and collect information from in a large field setup. An autonomous mobile GCP and a cooperation strategy for communicating with the UAV were developed to improve the efficiency and accuracy of the UAV-based data collection process. Prior to actual field testing, preliminary tests were conducted using the system to show the capability of automatic path tracking, reducing the root mean square error (RMSE) of lateral deviation from 34.3 cm to 15.6 cm with the proposed look-ahead tracking method. The tests also indicated the feasibility of moving reflectance reference panels for every two successive flight paths without detrimental effects on pixel values in the mosaicked images, with percentage errors in digital number values ranging from -1.1% to 0.1%. In the actual field testing, the autonomous mobile GCP was able to successfully cooperate with the UAV in real time without any interruption, showing superior performance for georeferencing, radiometric calibration, height calibration, and temperature calibration compared to the conventional calibration method with stationary GCPs.
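The lateral-deviation RMSE quoted above is the standard root-mean-square calculation over tracking-error samples; a minimal sketch with hypothetical deviation values:

```python
import math

def rmse(deviations):
    """Root mean square error of lateral deviations (same units as the input)."""
    return math.sqrt(sum(d * d for d in deviations) / len(deviations))

# Hypothetical lateral-deviation samples (cm) from a path-tracking run
errors_cm = [12.0, -18.5, 9.3, -14.1, 20.2]
tracking_rmse = rmse(errors_cm)
```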
Kyle Cheung, Alireza Pourreza, Ali Moghimi, German Zuniga-Ramirez
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141408 (2020) https://doi.org/10.1117/12.2557895
Almond canopy geometry has been shown to correlate with harvest yield, but the existing specialized and expensive equipment used to measure geometric features provides data limited in resolution and must be operated in a narrow time window, challenging its role in precise orchard management. To increase adoption, this study examines novel aerial data collection methods using small unmanned aerial systems (sUAS) and intuitive data processing methods, with the goal of improving accuracy and reducing the cost, time, and training required for canopy measurements and potential yield estimation.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 1141409 (2020) https://doi.org/10.1117/12.2557899
Though unmanned aircraft systems (UAS) are widely used in agriculture, their current positioning accuracy, with a radius of 0.5 to 2 meters, is still too low to pinpoint a crop row or to precisely overlay temporal multi-source field maps without a valid geometric calibration. The positioning accuracy of UAS equipped with a real-time kinematic (RTK) global navigation satellite system (GNSS) can be increased to the centimeter level, as claimed by the manufacturers. This paper presents the preliminary test results of the positioning accuracy of a commercial RTK UAS over a set of fixed-position panels in our customized scenarios. Images were collected in three GNSS modes (regular GNSS without RTK; RTK mode 1, not corrected by the positioning error of the base station; and RTK mode 2, corrected by the positioning error of the base station) in static and in-flight settings. In the static setting, horizontal accuracies were 2.17 cm for RTK mode 2, 12.11 cm for RTK mode 1, and 11.46 cm for the regular GNSS mode. The notable result for horizontal accuracy in the in-flight setting was that RTK mode 2 without GCPs (2.82 cm) showed accuracy comparable to the commonly used regular GNSS mode with GCPs (1.34 cm). The vertical positioning accuracies in the static setting were 6.01 cm for RTK mode 2, 5.65 cm for RTK mode 1, and 10.48 cm for the regular GNSS mode. The accuracies of height measurement from digital surface models (DSMs) without and with GCPs in RTK mode 2 were 4.81 cm and 3.72 cm, respectively, the best performance among the three modes. In summary, the RTK UAS tested in this study showed great potential for eliminating the requirement of GCPs and for high-positioning-accuracy applications. The next phase is to test the system in the field for accurate crop height measurement at different growth stages in agricultural applications.
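Horizontal accuracy figures like those above are commonly computed as the RMSE of repeated GNSS fixes against a surveyed truth point; a sketch with hypothetical fixes (local east/north coordinates in meters):

```python
import math

def horizontal_rmse(fixes, truth):
    """Horizontal RMSE (m) of GNSS fixes (east, north) against a surveyed truth point."""
    te, tn = truth
    sq = [(e - te) ** 2 + (n - tn) ** 2 for e, n in fixes]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical static-test fixes scattered around a panel surveyed at (0, 0)
fixes = [(0.02, -0.01), (-0.015, 0.02), (0.01, 0.005)]
acc_m = horizontal_rmse(fixes, (0.0, 0.0))
```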
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140A (2020) https://doi.org/10.1117/12.2560008
Unoccupied aerial vehicles (UAVs, or drones) are increasingly used in field research. Drones capable of routinely and consistently capturing high quality imagery of experimental fields have become relatively inexpensive. However, converting these images into scientifically usable data has become a bottleneck. A number of tools exist to support this workflow, but there is no framework for making these tools interoperable, sharable, and scalable. Here we present an initial draft of the Drone Processing Pipeline (DPP), a framework for processing agricultural research imagery that supports best practices and interoperability. DPP emphasizes open software and data that can be shared among, and used in whole or in part by, the research community. We are building the DPP as a distributed, scalable, and flexible pipeline for converting drone imagery into orthomosaics, point clouds, and plot-level statistics. Our intent is not to replace, but to integrate, components from the emerging ecosystem of utilities, with a focus on end-to-end automation and scalability. The initial focus of DPP is the measurement of experimental plots in field research. In the future we expect that standardization will enable new scientific discovery by facilitating collaboration and sharing of software and data. Our vision is to create a processing pipeline that is open, flexible, extensible, portable, and automated. With modern tools, deploying a pipeline on a laptop or HPC should take only a single command. Running a pipeline and publishing data should require only input data and a defined workflow.
Evapotranspiration and Moisture Measurement with UAS Imagery I
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140B (2020) https://doi.org/10.1117/12.2558824
Surface temperature is necessary for the estimation of energy fluxes and evapotranspiration from satellite and airborne data sources. For example, the Two-Source Energy Balance (TSEB) model uses thermal information to quantify canopy and soil temperatures as well as their respective energy balance components. While surface (also called kinematic) temperature is desirable for energy balance analysis, obtaining this temperature is not straightforward due to a lack of spatially estimated narrowband (sensor-specific) and broadband emissivities of vegetation and soil, further complicated by the spectral characteristics of the UAV thermal camera. This study presents an effort to spatially model narrowband and broadband emissivities for a microbolometer thermal camera at UAV information resolution (~0.15 m) based on Landsat and NASA HyTES information using a deep learning (DL) model. The DL model is calibrated using equivalent optical Landsat/UAV spectral information to spatially estimate narrowband emissivity values of vegetation and soil in the 7–14 μm range at UAV resolution. The resulting DL narrowband emissivity values were then used to estimate broadband emissivity based on a narrowband-broadband emissivity relationship developed using the MODIS UCSB Emissivity Library database. The narrowband and broadband emissivities were incorporated into the TSEB model to determine their impact on the estimation of instantaneous energy balance components against ground measurements. The proposed effort was applied to information collected by the Utah State University AggieAir small Unmanned Aerial Systems (sUAS) Program as part of the ARS-USDA GRAPEX Project (Grape Remote sensing Atmospheric Profile and Evapotranspiration eXperiment) over a vineyard located in Lodi, California.
A comparison of the resulting energy balance component estimates, with and without the inclusion of high-resolution narrowband and broadband emissivities, against eddy covariance (EC) measurements under different scenarios is presented and discussed.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140C (2020) https://doi.org/10.1117/12.2558221
Evapotranspiration (ET) estimation is an important topic of agricultural research in many regions because of water scarcity, a growing population, and climate change. ET can be analyzed as the sum of evaporation from the soil and transpiration from the crops to the atmosphere. Accurate estimation and mapping of ET are necessary for crop water management. One traditional method is to use the crop coefficient (Kc) and reference ET (ETo) to estimate actual ET. With the advent of satellite technology, remote sensing images can provide spatially distributed measurements. Satellite images are used to calculate the Normalized Difference Vegetation Index (NDVI), and the relation between NDVI and Kc is used to generate a new Kc. The spatial resolution of multispectral satellite images, however, is in the range of meters, which is often not enough for crops with clumped canopy structures, such as trees and vines. Moreover, the frequency of satellite overpasses is not high enough to meet research or water management needs. Unmanned aerial vehicles (UAVs) can help mitigate these spatial and temporal challenges. Compared with satellite imagery, the spatial resolution of UAV images can be as high as centimeter level. In this study, a regression model was developed using Deep Stochastic Configuration Networks (DeepSCNs). Actual evapotranspiration was estimated and compared with lysimeter data in an experimental pomegranate orchard. The UAV imagery provided a spatial, tree-by-tree view of the ET distribution.
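The traditional Kc-based approach described above can be sketched as follows; the linear Kc-NDVI coefficients here are purely illustrative and would in practice be fitted per crop and site:

```python
def kc_from_ndvi(ndvi, a=1.25, b=-0.1):
    """Linear Kc-NDVI relation. The coefficients a and b are illustrative
    stand-ins, not fitted values from this study."""
    return a * ndvi + b

def actual_et(ndvi, eto_mm_day):
    """Actual ET (mm/day) as the crop coefficient Kc times reference ET (ETo)."""
    return kc_from_ndvi(ndvi) * eto_mm_day

# Hypothetical canopy pixel: NDVI = 0.8, ETo = 6.0 mm/day
et = actual_et(0.8, 6.0)
```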
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140D (2020) https://doi.org/10.1117/12.2557921
In recent years, applied irrigation has been reduced to comply with California’s mandated water use restrictions. To increase water conservation, employing new technologies such as soil moisture sensors (SMS) in agricultural systems is imperative. The overall goal of this project is to estimate bermudagrass quality, managed under SMS-based irrigation scheduling, at Cal Poly Pomona’s Center for Turf, Irrigation and Landscape Technology (CTILT). A UAV-mounted hyperspectral sensor and a handheld spectroradiometer are being used to determine vegetation indices such as the water band index (WBI) and normalized difference vegetation index (NDVI), which are compared with the water and chlorophyll content of bermudagrass. The UAV platform used is a multicopter equipped with GPS and an autopilot for autonomous flight and data capture over the turfgrass plots. The visual turf quality ratings and remote and proximal sensor data are collected once every two weeks during the growing season. The handheld spectroradiometer is a hyperspectral device and is used to validate the UAV-mounted hyperspectral sensor data. A general linear model analysis of variance for a randomized complete block design will be conducted for each date to test the effect of SMS-based irrigation on the visual ratings and clipping yield. Comparisons among visual quality ratings, percentage green cover, NDVI, and WBI are analyzed with the general linear model of correlation (Pearson’s). Differences between means are separated by Fisher’s protected least significant difference (p = 0.05).
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140E (2020) https://doi.org/10.1117/12.2558630
Evapotranspiration (ET) derived from remote sensing-based models represents a half-hourly to hourly value that is upscaled to a daily scale for practical applications in agricultural and water management. Several upscaling methods, such as the Gaussian curve fitting, sine, and evaporative fraction approaches, have been developed to extrapolate remote sensing-based ET values to a daily scale by assuming constant daytime ratios (e.g., self-preservation of available energy partitioning). This simple assumption can result in uncertainties in the performance of those methods and can be violated under unstable conditions such as a cloudy day. Studies have shown, for example, that the diurnal variation of incoming shortwave radiation changes from a Gaussian distribution to a multimodal distribution on a cloudy day. In addition, when remote-sensing ET outputs are directly upscaled to a daily scale and compared with eddy covariance measurements, a fixed eddy covariance footprint is assumed, while the actual footprint is dynamic and changes with wind speed, direction, and atmospheric stability. In this study, a new method is proposed to spatially and temporally simulate canopy and soil temperature for each time step (e.g., 1 hour) based on the temperature pattern recorded by IRT temperature sensors and UAV initial temperatures at the specific time of day. Next, the Two-Source Energy Balance (TSEB) model is executed for each time step of the daytime period (usually when net radiation > 100 W/m^2) to calculate energy balance components. The integration of TSEB outputs over the daytime period leads to estimates of daily energy balance components. Since cloudy conditions are captured in the temperatures recorded by the IRT sensors, the proposed model is not sensitive to weather conditions.
In addition, the proposed model physically simulates ET at each time step instead of directly extrapolating ET from a single remote sensing observation and model output. This feature addresses the limitations of comparing direct extrapolation methods of instantaneous ET against eddy covariance measurements. The proposed approach is applied to information collected by the Utah State University AggieAir small Unmanned Aerial Systems (sUAS) Program as part of the ARS-USDA GRAPEX Project (Grape Remote sensing Atmospheric Profile and Evapotranspiration eXperiment) conducted since 2014 over multiple vineyards located in California. The estimated ET values from the TSEB model at hourly time steps, integrated over the daytime period, are compared to eddy covariance measurements of ET. Additionally, comparisons of the hourly model output integrated over the daytime period against the different upscaling methods are presented and discussed.
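The daytime integration step can be sketched as follows; the hourly flux values are hypothetical, and a nominal latent heat of vaporization converts integrated energy to water depth (1 kg/m^2 of evaporated water corresponds to 1 mm of depth):

```python
# Hypothetical hourly model output: (net radiation Rn, latent heat flux LE), W/m^2
hourly = [(50, 30), (150, 90), (420, 260), (510, 310), (380, 240), (120, 70), (60, 20)]

LV = 2.45e6  # approximate latent heat of vaporization, J/kg

# Keep only "daytime" steps where Rn > 100 W/m^2, as in the model description
daytime_le = [le for rn, le in hourly if rn > 100]

# Integrate LE over the 1-hour time steps (W/m^2 * s = J/m^2), then convert to mm
energy_j_m2 = sum(le * 3600 for le in daytime_le)
et_mm = energy_j_m2 / LV
```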
Evapotranspiration and Moisture Measurement with UAS Imagery II
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140F (2020) https://doi.org/10.1117/12.2558715
Estimation of surface energy fluxes using thermal remote sensing-based energy balance models (e.g., TSEB2T) involves the use of local micrometeorological input data of air temperature, wind speed, and incoming solar radiation, as well as vegetation cover and accurate land surface temperature (LST). The physically based Two-Source Energy Balance with Dual Temperature (TSEB2T) model separates soil and canopy temperatures (Ts and Tc) to estimate surface energy fluxes including Rn, H, LE, and G. The estimation of the Ts and Tc components for the TSEB2T model relies on the linear relationship between the composite land surface temperature and a vegetation index, namely NDVI. While canopy and soil temperatures are controlling variables in the TSEB2T model, they are influenced by the NDVI threshold values, and uncertainties in their estimation can degrade the accuracy of surface energy flux estimation. Therefore, in this research effort, the effect of uncertainty in Ts and Tc estimation on surface energy fluxes is examined by applying a Monte Carlo simulation to the NDVI thresholds used to define canopy and soil temperatures. The spatial information used is available from multispectral imagery acquired by the AggieAir sUAS Program at Utah State University over vineyards near Lodi, California, as part of the USDA Agricultural Research Service (ARS) Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX) project. The results indicate that LE is slightly sensitive to the uncertainty of NDVIs and NDVIc. The observed relative error of LE corresponding to NDVIs uncertainty was between -1% and 2%, while for NDVIc uncertainty, the relative error was between -2.2% and 1.2%. However, when the combined NDVIs and NDVIc uncertainties were used simultaneously, the observed relative error corresponding to the absolute values |ΔLE| was between 0% and 4%.
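The Ts/Tc separation and the Monte Carlo perturbation of the NDVI thresholds can be sketched as follows; the LST-NDVI relation, threshold means, and spreads are hypothetical stand-ins, not the GRAPEX values:

```python
import random

def soil_canopy_temps(ndvi_soil, ndvi_canopy, ndvi_px, lst_px):
    """Linearly extrapolate composite LST to soil (Ts) and canopy (Tc) temperatures
    using the LST-NDVI line defined by two (NDVI, LST) reference pixels."""
    slope = (lst_px[1] - lst_px[0]) / (ndvi_px[1] - ndvi_px[0])
    ts = lst_px[0] + slope * (ndvi_soil - ndvi_px[0])
    tc = lst_px[0] + slope * (ndvi_canopy - ndvi_px[0])
    return ts, tc

random.seed(0)
# Monte Carlo: perturb the soil and canopy NDVI thresholds and collect Ts/Tc draws
draws = [
    soil_canopy_temps(random.gauss(0.2, 0.02), random.gauss(0.8, 0.02),
                      (0.3, 0.7), (318.0, 302.0))
    for _ in range(500)
]
```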
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140G (2020) https://doi.org/10.1117/12.2558777
Validation of surface energy fluxes from remote sensing sources is performed using instantaneous field measurements obtained from eddy covariance (EC) instrumentation. An eddy covariance measurement is characterized by a footprint function / weighted area function that describes the mathematical relationship between the spatial distribution of surface flux sources and their corresponding magnitude. The orientation and size of each flux footprint / source area depends on the micro-meteorological conditions at the site as measured by the EC towers, including turbulence fluxes, friction velocity (ustar), and wind speed, all of which influence the dimensions and orientation of the footprint. The total statistical weight of the footprint is equal to unity. However, due to the large size of the source area / footprint, a statistical weight cutoff of less than one is considered, ranging between 0.85 and 0.95, to ensure that the footprint model is located inside the study area. This results in a degree of uncertainty when comparing the modeled fluxes from remote sensing energy models (i.e., TSEB2T) against the EC field measurements. In this research effort, the sensitivity of instantaneous and daily surface energy flux estimates to footprint weight cutoffs are evaluated using energy balance fluxes estimated with multispectral imagery acquired by AggieAir sUAS (small Unmanned Aerial Vehicle) over commercial vineyards near Lodi, California, as part of the ARS-USDA Agricultural Research Service’s Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX) project. The instantaneous fluxes from the eddy covariance tower will be compared against instantaneous fluxes obtained from different TSEB2T aggregated footprint weights (cutoffs). The results indicate that the size, shape, and weight of pixels inside the footprint source area are strongly influenced by the cutoff values. 
Small cutoff values, such as 0.3 and 0.35, yield high weights for the pixels located within the footprint domain, while large cutoffs, such as 0.9 and 0.95, result in low weights. The results also indicate that the distribution of modeled latent heat flux (LE) values within the footprint source area is influenced by the cutoff values. A wide variation in LE was observed at high cutoffs, such as 0.90 and 0.95, while low variation was observed at small cutoff values, such as 0.3. This occurs because high cutoff values include a large number of pixels within the footprint domain, whereas lower cutoff values retain only a limited number.
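The cutoff aggregation described above can be sketched as follows. This is a minimal illustration, not the TSEB2T implementation, and the footprint and LE fields are synthetic stand-ins: pixels are ranked by footprint weight and accumulated until the cumulative weight reaches the chosen cutoff.

```python
import numpy as np

def apply_footprint_cutoff(weights, flux, cutoff=0.9):
    """Aggregate a modeled flux field over an EC footprint at a given cutoff.

    weights : 2-D footprint weights summing to ~1 over the full domain
    flux    : 2-D modeled flux field (e.g. LE) on the same grid
    cutoff  : cumulative-weight threshold (e.g. 0.85-0.95)

    Pixels are ranked by footprint weight and accumulated until the
    cumulative weight reaches the cutoff; the aggregated flux is the
    weight-normalized average over that subset of pixels.
    """
    w, f = weights.ravel(), flux.ravel()
    order = np.argsort(w)[::-1]                 # highest-weight pixels first
    csum = np.cumsum(w[order])
    keep = order[: np.searchsorted(csum, cutoff) + 1]
    return np.sum(w[keep] * f[keep]) / np.sum(w[keep])

# Synthetic example: a peaked (Gaussian-shaped) footprint over a noisy LE field
rng = np.random.default_rng(0)
gx, gy = np.meshgrid(np.linspace(-3, 3, 64), np.linspace(-3, 3, 64))
w = np.exp(-(gx**2 + gy**2))
w /= w.sum()
le = 300.0 + 40.0 * rng.standard_normal(w.shape)   # W/m^2
for c in (0.3, 0.9):
    print(c, round(apply_footprint_cutoff(w, le, c), 1))
```

Note that a higher cutoff pulls in many more low-weight pixels, which is consistent with the wider spread of LE values reported above at cutoffs of 0.90 and 0.95.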
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140H (2020) https://doi.org/10.1117/12.2560531
By 2050, the world population is projected to reach 9.7 billion. Food production must increase by at least 70% to feed this population. One way to increase food production is to create crop cultivars that produce high-quality, high-yielding crops without requiring additional resources. Plant breeders can create new crop cultivars using high-throughput genotyping techniques; however, the current bottleneck in plant breeding is in-field phenotyping. This study focuses on designing a high-throughput in-field proximal phenotyping system capable of collecting non-contact, high-resolution, multi-sensor, multi-view phenomic data of vegetable plants.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140I (2020) https://doi.org/10.1117/12.2558324
We present techniques to autonomously measure crop heights in farmland using two 2D LiDAR sensors mounted on an Unmanned Ground Vehicle (UGV). Knowing the height of crops is crucial for monitoring overall plant health and growth cycles; plant height is therefore a major trait in high-throughput phenotyping and is commonly used in plant breeding. Conventional high-throughput height estimation relies on sensors mounted on Unmanned Aerial Vehicles, whose accuracy can be affected by propeller downwash or by lower-resolution measurements taken at a distance. To achieve automated height estimation with a UGV, we developed an autonomous robotic platform for high-throughput phenotyping for genome-wide association analysis, carrying a versatile sensing payload that collects large-scale field data autonomously. The key to our approach is an autonomous row-navigation capability that enables the robot to scan row-based farmland without manual input. We adapt methodologies for navigable gap identification and plant height extraction from 2D LiDAR point clouds. The key steps in our algorithm are random sample consensus (RANSAC), robot motion control, and crop height estimation. We performed a series of experiments in a controlled indoor environment and a natural farmland environment. Our algorithm enabled the robot to navigate the farmland autonomously and estimated plant heights to within ±6.57% on a dataset collected by the platform.
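The RANSAC ground-extraction step can be illustrated with a minimal sketch. The scan data, tolerance, and iteration count here are hypothetical stand-ins, not the authors' implementation: the ground is fit as a line in the 2D scan plane, and crop height is taken as the tallest return above that line.

```python
import numpy as np

def ransac_ground_line(points, n_iter=200, tol=0.03, seed=None):
    """Fit the ground as a line y = a*x + b in a 2-D LiDAR scan via RANSAC.

    points : (N, 2) array of (forward distance x, height y) returns, in meters
    Returns the (a, b) supported by the largest number of inliers.
    """
    rng = np.random.default_rng(seed)
    best, best_count = (0.0, 0.0), -1
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if np.isclose(x1, x2):
            continue                      # degenerate (vertical) sample pair
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        count = np.sum(np.abs(points[:, 1] - (a * points[:, 0] + b)) < tol)
        if count > best_count:
            best, best_count = (a, b), count
    return best

def crop_height(points, line):
    """Height of the tallest return above the fitted ground line."""
    a, b = line
    return float(np.max(points[:, 1] - (a * points[:, 0] + b)))

# Synthetic scan: near-flat noisy ground plus a canopy roughly 1.4 m tall
rng = np.random.default_rng(1)
ground = np.column_stack([np.linspace(0.0, 4.0, 200),
                          0.01 * rng.standard_normal(200)])
canopy = np.column_stack([rng.uniform(1.8, 2.2, 50),
                          rng.uniform(1.2, 1.5, 50)])
scan = np.vstack([ground, canopy])
print(round(crop_height(scan, ransac_ground_line(scan, seed=2)), 2))
```

Because the ground points dominate the scan, RANSAC locks onto the ground line even though the canopy returns are gross outliers to it, which is the property that makes this step robust in cluttered field scans.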
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140J (2020) https://doi.org/10.1117/12.2557625
This paper presents the use of an unmanned ground vehicle (UGV) and machine learning techniques for the identification and removal of weeds in a lettuce crop. In recent years, breakthroughs in deep learning, computer vision, and the miniaturization of electronic devices have paved the way for the use of unmanned systems and machine learning in applications that are dull, dirty, and dangerous for humans, including agriculture. These technologies have the potential to transform and modernize how crops are grown and cared for. One problem every farmer encounters is invasive weeds, which can kill or hinder crops by stealing water, nutrients, and sunlight. Herbicides are used to kill weeds and stop their growth; however, herbicide use increases the cost of production, is labor intensive, and exposes humans to dangerous chemicals. Manually removing weeds is also very labor intensive. Using machine learning techniques and UGVs for weed identification and removal can reduce production costs, human exposure to dangerous chemicals, and dependence on human labor. Models were trained using the YOLO, Faster R-CNN, and SSD Mobile object detection techniques. For training, images of weeds in an experimental lettuce plot were collected throughout the growing season. The developed models were validated on data sets distinct from the training sets, both in the same plot and in a different plot. The identified weeds were then removed using the UGV through teleoperation.
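Detection models of this kind are typically validated by matching predicted boxes to ground truth via intersection-over-union (IoU). As a minimal illustration (not the authors' evaluation code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Two partially overlapping 10x10 boxes: intersection 25, union 175
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # → 0.14285714285714285
```

A predicted weed box is usually counted as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5, which is the convention used to score detectors like YOLO, Faster R-CNN, and SSD.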
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140K (2020) https://doi.org/10.1117/12.2558214
Soil-borne plant-parasitic nematodes exist in many soils, and some can cause annual yield losses of 15 to 20 percent. Walnut has high economic value, and most edible walnuts in the US are produced in the fertile soils of the California Central Valley. Soil-dwelling nematode parasites are a significant threat, causing severe root damage and reducing walnut yields. Early detection of plant-parasitic nematodes is critical for designing management strategies. In this study, we proposed the use of a new low-cost proximal radio-frequency three-dimensional sensor, the "Walabot," together with machine learning classification algorithms. This pocket-sized device, unlike remote sensing tools such as unmanned aerial vehicles (UAVs), is not limited by flight time or payload capacity. It can work flexibly in the field and provide data more promptly and accurately than UAVs or satellites. Walnut leaves from trees with different nematode infestation levels were placed on the sensor to test whether the Walabot can detect small changes in infestation level. Hypothetically, the waveforms of the returned signals may be useful for estimating the damage caused by nematodes. Scikit-learn classification algorithms, such as neural networks, random forests, the Adam optimizer, and Gaussian processes, were applied for data processing. Results showed that the Walabot predicted nematode infestation levels with an accuracy of 72% so far.
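A minimal sketch of the classification step, using one of the scikit-learn model families named above (a random forest). The waveform data here are synthetic stand-ins, since the actual Walabot features are not described in the abstract; the assumed signal is a per-level amplitude shift.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: one waveform (64 samples) per leaf, labeled
# with one of three hypothetical nematode infestation levels.
rng = np.random.default_rng(0)
n_per_class, n_samples = 40, 64
t = np.linspace(0.0, 1.0, n_samples)
X, y = [], []
for level in range(3):
    # assume each infestation level scales the waveform amplitude slightly
    base = (1.0 + 0.2 * level) * np.sin(2 * np.pi * 5 * t)
    X.append(base + 0.3 * rng.standard_normal((n_per_class, n_samples)))
    y.append(np.full(n_per_class, level))
X, y = np.vstack(X), np.concatenate(y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # stratified 5-fold accuracy
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy, as sketched here, is the natural counterpart of the 72% figure reported above.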
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140L (2020) https://doi.org/10.1117/12.2557949
Wheat yield reductions of forty percent due to crown rot (Fusarium pseudograminearum) have been reported globally. An emerging approach to sensor-based disease discrimination is the use of spectral reflectance with combinations of wavebands of varying bandwidths, which has the potential to reduce the impact of environmental factors on detection accuracy. Transferring such technology from the laboratory to the field presents challenges, particularly in producing adequately robust models. An experiment was conducted in which near-infrared spectral reflectance data were captured in a glasshouse environment for cultivars of spring bread wheat with varying resistance to F. pseudograminearum. A contact sensor sensitive to near-infrared (900–1700 nm) wavebands was used. Raw sensor data were calibrated and transformed, allowing for variable waveband size. Optimised machine learning disease identification models were compared for their ability to accurately detect crown rot across the nine weeks following inoculation with F. pseudograminearum. The results show crown rot detection accuracies ranging from 49–74%, as well as a temporal patterning effect as the season progresses. Of the six machine learning algorithms trialled, an artificial neural network (ANN) classifier performed best, with a top accuracy of 74.14%. Waveform differences between plus (inoculated) and minus (uninoculated) treatments indicate that the sensing approach has the potential to be scaled to a camera-based system for use on remote sensing platforms. Further work is being conducted to understand the viability of such an approach, an important step towards both robotic and RPA-based disease discrimination.
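The variable-waveband transformation can be sketched as binning the raw spectrum into user-chosen bands. This is an illustration with a synthetic spectrum and hypothetical band edges, not the authors' calibration pipeline.

```python
import numpy as np

def bin_spectrum(wavelengths, reflectance, band_edges):
    """Aggregate a reflectance spectrum into variable-width wavebands.

    wavelengths : 1-D array of sample wavelengths (nm), ascending
    reflectance : 1-D array of reflectance at those wavelengths
    band_edges  : edges of the desired bands, e.g. [900, 1100, 1400, 1700]
    Returns one mean reflectance per band.
    """
    idx = np.digitize(wavelengths, band_edges) - 1   # band index per sample
    n_bands = len(band_edges) - 1
    return np.array([reflectance[idx == k].mean() for k in range(n_bands)])

# Synthetic spectrum over the sensor's 900-1700 nm range
wl = np.linspace(900.0, 1700.0, 400)
refl = 0.4 + 0.1 * np.sin(wl / 100.0)
bands = bin_spectrum(wl, refl, [900, 1100, 1400, 1700])
print(bands.round(3))
```

Varying the band edges and widths, as the study describes, changes the feature vector fed to the downstream classifiers, so band choice becomes part of the model optimisation.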
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V, 114140M (2020) https://doi.org/10.1117/12.2564696
The future of phenotyping for crop breeding and agricultural production will likely involve cooperation between autonomous aerial and ground-based vehicles. Consideration of this advance in autonomous systems is relatively new in the academic literature, particularly in relation to agriculture. Areas of study to date include the broadly applied concepts of environment perception and modeling, autonomous cooperation, collaborative position control, and path planning. Multiple opportunities are emerging for the technology to be advantageous in agriculture. One example is using an unmanned aerial vehicle (UAV) for remote sensing in cooperation with an unmanned ground vehicle (UGV) that performs ground-based activities guided by analysis of the remote-sensing data; this could be applied to mapping and mitigating insects and weeds, as well as harvesting according to variations in crop yield and maturity. Another example is using a UGV as an autonomous ground-control point to maximize the accuracy of UAV remote-sensing data. Ongoing research in this area has shown major improvements in the accuracy of measurements of plant reflectance, height, and temperature, as well as improvements in georectification.