Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305301 (2024) https://doi.org/10.1117/12.3036723
This PDF file contains the front matter associated with SPIE Proceedings Volume 13053, including the Title Page, Copyright information, Table of Contents, and Conference Committee information.
UAV-based Sensing Systems for Phenotyping and Precision Agriculture
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305302 (2024) https://doi.org/10.1117/12.3013542
The plant breeding community has increasingly adopted remote sensing platforms such as unmanned aerial vehicles (UAVs) to collect crop phenotype data. These platforms capture high-resolution multi-spectral (MS) image data during extensive field trials, enabling concurrent evaluation of thousands of plots with diverse seed varieties and management practices. However, breeders still rely on manual, intricate extraction, processing, and analysis of high-resolution imagery to draw conclusions. A significant challenge is identifying plot locations within MS imagery, delineating plot boundaries, and assigning spatial references to each plot. Therefore, this study was conducted to develop an automated methodology for creating multiple polygon shapefiles with unique identifiers that overlay drone imagery with centimeter-level accuracy, enabling zonal statistics for each phenotype plot. The goal was to develop a pipeline that makes no assumptions about field uniformity or plot spacing, size, or quantity, minimizing the need for manual adjustments. The proposed method used a high-accuracy planter-logged Real-Time Kinematic Global Positioning System (RTK-GPS) and georeferenced MS-UAV imagery. This process automatically generated plot boundaries by converting RTK-GPS points to polygons representing each planted plot. The resulting pipeline automatically produced maps of Vegetative Indices (VI), multi-polygon shapefiles, and CSV files of plot boundaries for external software and downstream analysis. Notably, the polygon shapefile consistently aligned with plot boundaries within the image, and even across temporal data sets, with minimal manual adjustment. This approach provides an efficient, adaptable, and replicable automated solution, minimizing time, labor, and user involvement while facilitating extraction of zonal statistics for each phenotype plot from MS imagery.
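The core geometric step, turning planter-logged RTK-GPS points into plot polygons, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes projected (metric) coordinates and a known plot width, and builds one rectangular polygon per logged start/end pair; a real pipeline would then write such rings to a shapefile (e.g. with shapely/fiona).

```python
import math

def plot_polygon(start, end, width):
    """Build a rectangular plot polygon from a planter-logged RTK-GPS
    start/end point pair (projected coordinates, metres) and a plot width.

    Returns the four corner coordinates, closed back to the first corner,
    as required for a polygon ring in a shapefile.
    """
    (x0, y0), (x1, y1) = start, end
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    # Unit vector perpendicular to the planting direction.
    px, py = -dy / length, dx / length
    h = width / 2.0
    return [
        (x0 + px * h, y0 + py * h),
        (x1 + px * h, y1 + py * h),
        (x1 - px * h, y1 - py * h),
        (x0 - px * h, y0 - py * h),
        (x0 + px * h, y0 + py * h),  # close the ring
    ]
```

Each ring would carry the plot's unique identifier as an attribute so zonal statistics can be joined back to the trial design.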
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305303 (2024) https://doi.org/10.1117/12.3021199
Recent advancements in sensor technologies make it possible to collect remote sensing data at fine spatial and high temporal resolution and to automatically extract informative features in a high-throughput mode. As researchers increasingly gain access to tools for collecting big data, such as Unmanned Aerial Vehicles (UAVs) and Controlled Environment Phenotyping Facilities (CEPFs), there is a need to generate quantitative phenotypes from the collected geospatial data. While precision agriculture technology aims to protect our environment and produce enough food to feed a growing population, the massive volume of geospatial data generated by research scientists, and the lack of software packages customized for processing these data, make it challenging to develop transdisciplinary research collaboration around this data. We will share our efforts to develop an open-source online platform for UAS high-throughput phenotyping (HTP) data management to address these big data challenges.
Advanced UAV and UGV Sensors for Specialty Crop Applications
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305304 (2024) https://doi.org/10.1117/12.3021219
This study addresses the escalating threat of branched broomrape (Phelipanche ramosa) to California's tomato industry, which provides over 90% of the United States' processing tomatoes. The parasite's life cycle, largely underground and therefore invisible until advanced infestation, presents a significant challenge to both detection and management. Conventional chemical control strategies, while widespread, are costly, environmentally detrimental, and often ineffective due to the parasite's subterranean nature and the indiscriminate nature of the treatments. Innovative strategies employing advanced remote sensing technologies were explored, integrating drone-based multispectral imagery with Long Short-Term Memory (LSTM) deep learning networks and utilizing the Synthetic Minority Over-sampling Technique (SMOTE) to address the imbalance between healthy and diseased plant samples in the data. The research was conducted on a known broomrape-infested tomato farm in Woodland, Yolo County, California. Data were gathered across five key growth stages determined by growing degree days (GDD), with multispectral images processed to isolate tomato canopy reflectance. Our findings revealed that the earliest growth stage at which broomrape could be detected with acceptable accuracy was at 897 GDD, achieving an overall accuracy of 79.09% and a recall rate for broomrape of 70.36%, without the integration of subsequent growth stages. However, when considering sequential growth stages, the LSTM models applied across four distinct scenarios, with and without SMOTE augmentation, showed significant improvements in the identification of broomrape-infested plants. The best-performing scenario, which integrated all growth stages, achieved an overall accuracy of 88.37% and a recall rate of 95.37%.
These results demonstrate the LSTM network's robust potential for early broomrape detection and highlight the need for further data collection to enhance the model's practical application. Looking ahead, the study's approach promises to evolve into a valuable tool for precision agriculture, potentially revolutionizing the management of crop diseases and supporting sustainable farming practices.
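The SMOTE step used to rebalance healthy and infested samples rests on a simple idea: synthesize new minority-class points by interpolating between a real sample and one of its nearest neighbours. A minimal sketch of that core idea, illustrative only; the study would have applied an established implementation to multispectral feature vectors:

```python
import random

def smote_sample(minority, k=2, rng=None):
    """Generate one synthetic minority-class sample, SMOTE-style:
    pick a real sample, choose a random one of its k nearest
    neighbours, and interpolate a new point between the two."""
    rng = rng or random.Random()
    base = rng.choice(minority)
    # k nearest neighbours by squared Euclidean distance (excluding base).
    neighbours = sorted(
        (s for s in minority if s is not base),
        key=lambda s: sum((a - b) ** 2 for a, b in zip(base, s)),
    )[:k]
    neigh = rng.choice(neighbours)
    t = rng.random()  # interpolation factor in [0, 1)
    return tuple(a + t * (b - a) for a, b in zip(base, neigh))
```

Repeating this until the minority class matches the majority count gives the balanced training set the LSTM scenarios compare against.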
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305305 (2024) https://doi.org/10.1117/12.3014667
Traditional methods of powdery mildew (PM) detection rely on visual inspection. However, by the time PM symptoms become visible, the damage is already done and the disease is already spreading. Laboratory tests are more accurate than visual inspection but are time consuming and cannot provide information for immediate decision making. Near-infrared (NIR) and shortwave infrared (SWIR) sensors detect reflected light in the 700 nm to 2500 nm spectral range, thereby aiding early detection of PM and other diseases. When subjected to PM stress, grapes undergo changes in spectral reflectance due to physiological and biochemical alterations in their leaves, such as decreased chlorophyll content, damaged cell structure, or water stress. This paper presents an investigation of the potential of hyperspectral data acquired from vineyards using unmanned aerial vehicles (UAVs) for detecting powdery mildew in grapes. A UAV equipped with a hyperspectral sensor was flown over a Cal Poly Pomona vineyard. The hyperspectral data are used to determine various vegetation indices, including the normalized difference texture index (NDTI), powdery mildew index (PMI), and normalized difference water index (NDWI), that can provide information on the presence of the disease and on plant stresses due to the disease. These indices are compared with ground-truth data that include visual inspection data and proximal sensor data such as chlorophyll meter and NDVI meter readings.
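Several of the indices named above share the normalized-difference form (A − B)/(A + B), differing only in which spectral bands are used. A generic pixel-wise sketch; the specific band choices (e.g. NIR and SWIR reflectance for NDWI) vary across the literature and are not specified in the abstract:

```python
def normalized_difference(band_a, band_b):
    """Generic normalized-difference index, (A - B) / (A + B),
    applied pixel-wise to two reflectance bands.  NDWI, NDVI and
    similar indices all follow this form; only the band choice
    differs.  Pixels where both bands are zero map to 0.0."""
    return [
        (a - b) / (a + b) if (a + b) else 0.0
        for a, b in zip(band_a, band_b)
    ]
```

Stressed canopy typically shifts these ratios, which is what makes per-plot index maps useful as early disease indicators.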
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305306 (2024) https://doi.org/10.1117/12.3021150
Tracking plants is an important part of growing and distributing ornamental plants. One method of inventorying these plants is to use Radio Frequency Identification (RFID) tags. The large areas of plant nurseries require an effective and efficient method of reading the RFID tags. This paper investigates the utilization of RFID with Unmanned Aerial Vehicles (UAVs), specifically how altitude and antenna power affect the accuracy of the tag counts. Three power levels (15 dBm, 20 dBm and 27 dBm) for the RFID antenna and three flight levels (3 m, 5 m and 7 m) were used in the experiment setup. Two plant types, namely ‘Green Giant’ arborvitae and ‘Sky Pencil’ holly, were also used, set on two separate plots. Four RFID tag types were utilized (L5, L6, L8 and L9) with two antenna types (dog bone and square wave) and two attachment mechanisms (loop-lock and stake). The UAV was flown at the three flight levels for each power level, with two passes performed per flight level. Plants were tagged randomly, with 40 plants and tags per plot. Experiments were conducted on September 9, 21-22, October 19, November 1-2, 27 and December 14, 2023 at Dudley Nurseries in Thomson, GA. The 15 dBm power level yielded a tag count accuracy of 53%, 34% and 16% at flight levels of 3 m, 5 m and 7 m, respectively. Increasing the power level to 20 dBm at the 3 m flight level yielded a count accuracy of 90% across all tag types and plant types. At the higher flight levels of 5 m and 7 m, accuracy drops to 75% and 33%, respectively, at a power level of 20 dBm, and even lower at 15 dBm. The highest count accuracy, 98%, was achieved at a power level of 27 dBm and a flight level of 3 m. Of the four tag types, L6 and L9 have the highest accuracy at any flight level and power setting.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305307 (2024) https://doi.org/10.1117/12.3014678
Weeds significantly impact agriculture, leading to substantial global crop losses and increased production costs. Conventional weed management methods, such as manual labor and herbicide use, are often labor-intensive and environmentally concerning. This paper explores the integration of unmanned aerial vehicles (UAVs) and advanced machine learning techniques, with a specific focus on deep learning and computer vision, for the precise detection and localization of weeds. The research involves the ongoing collection of a comprehensive dataset of weed images from a strawberry field over the entire growing season. Machine learning models, including YOLO, Faster R-CNN, and SSD object detectors, are currently being trained on this dataset to accurately detect and localize weeds from a UAV altitude of 10 meters. Validation of these models will be conducted using the same strawberry fields with new batches of transplants, ensuring the robustness and generalizability of the detection techniques. The effectiveness of each model is compared against the others to highlight the inherent strengths and weaknesses of each in an agricultural setting. The identified weeds are localized in real time using a UAV, and both monocular and stereo vision are explored as imaging techniques. This integrated approach holds the potential for several advantages, including reducing the cost of production, minimizing human exposure to harmful chemicals, and decreasing the reliance on manual labor for weed management. The significance of this ongoing research lies in its potential to revolutionize weed management by providing a reliable and efficient method for weed detection and localization. The study aims to contribute empirical evidence and data, bridging the gap between theoretical frameworks and practical implementation in precision agriculture.
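Detectors such as YOLO, Faster R-CNN, and SSD are conventionally scored against ground-truth boxes using intersection-over-union (IoU); a minimal helper for that standard matching criterion:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as
    (x1, y1, x2, y2).  A detection is usually counted as a true
    positive when its IoU with a ground-truth box exceeds a
    threshold such as 0.5."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; clamp to zero if the boxes are disjoint.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0
```

Per-model precision/recall curves built on this criterion are what make the planned model comparison quantitative.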
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305308 (2024) https://doi.org/10.1117/12.3013889
With an increasing focus on precision agriculture to maximize crop yields and minimize ecological impacts, remote sensing for agriculture has required the deployment of more advanced sensors and processing algorithms. Traditionally, unmanned aerial systems (UASs) have been the primary choice for phenotyping crops, but these systems are limited in endurance, power, payload, and legality. Medium to large unmanned ground vehicles (UGVs), however, are not hampered by these limitations. Previous research on phenotyping UGVs for tall crops has focused on either small systems or very large gantry systems. Described here is a medium-sized, low-cost, adjustable UGV that provides a solution by demonstrating the capability to image tall crops into late growth stages. It incorporates a sliding mechanism to allow for a greater range in height than previous phenotyping UGVs with the same payload capacity. The UGV's capabilities are analyzed theoretically and practically, including its structural rigidity, handling, and endurance. An overview of parts and assembly is presented to facilitate replication and proliferation of the vehicle. Additionally, the vehicle is primarily fabricated from off-the-shelf components. The few custom components are based on common materials and simple geometries and can be replicated with standard metalworking equipment. To further reduce costs, a dual RTK-GNSS system is utilized to control the UGV in a semi-autonomous fashion. A future goal is to use the gathered datasets to produce an algorithm for fully autonomous capabilities.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 1305309 (2024) https://doi.org/10.1117/12.3013894
On the epidermal surface of plants, kidney-shaped organs called stomata play an important role in plant health under drought conditions. These stomata, which resemble pores, open and close during transpiration. Lower stomatal transpiration during drought allows plants to escape drought stress, provided the rate of photosynthesis remains balanced; a greater number of open stomata indicates that plants are experiencing drought. To assess the stomatal response, one must derive the pore aperture ratio. The lower the pore aperture ratio, the more stomata are closed in response to drought, consequently decreasing transpiration. Here we show the development and implementation of StomaDetectv1, a novel deep learning model for non-destructive, high-throughput phenotyping of corn stomata, built on a custom Faster R-CNN architecture. StomaDetectv1 achieves an average precision of 84.988% for closed stomata areas, underpinning its efficacy in identifying variations in stomatal traits. The model was adept at assessing stomatal density and aperture ratios, essential for quantifying drought resilience. This work underscores the significance of integrating imaging techniques and deep learning for precision phenotyping, offering a scalable solution for monitoring plant circadian rhythm and aiding in the breeding of drought-resistant crops. By furnishing breeders and geneticists with detailed insights into stomatal behavior, our approach catalyzes the development of corn varieties optimized for water use efficiency and yield under drought conditions, thereby advancing agricultural practices to combat climate challenges.
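Once per-stoma pore dimensions are available (e.g. from detected bounding boxes), the pore aperture ratio and a derived open-stomata fraction reduce to simple arithmetic. A sketch; the openness threshold below is a hypothetical value, not taken from the paper:

```python
def aperture_ratio(pore_width, pore_length):
    """Pore aperture ratio: width of the stomatal opening relative
    to its length.  Lower values mean more closed stomata and
    reduced transpiration under drought."""
    if pore_length <= 0:
        raise ValueError("pore length must be positive")
    return pore_width / pore_length

def fraction_open(ratios, threshold=0.1):
    """Share of stomata counted as open, given per-stoma aperture
    ratios and an illustrative openness threshold."""
    if not ratios:
        return 0.0
    return sum(r > threshold for r in ratios) / len(ratios)
```

Tracking this fraction over a day is one way such detections could feed the circadian-rhythm monitoring the abstract mentions.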
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 130530A (2024) https://doi.org/10.1117/12.3021157
Small unmanned aircraft systems (UAS) are increasingly used for remote sensing applications in precision agriculture due to their ability to collect high-resolution imagery. However, spatial calibration of UAS imagery is often a manual process that requires extensive planning and post-processing, presenting bottlenecks for automating image analysis workflows. This study seeks to address bottlenecks in the photogrammetry workflow that arise from manually tagging ground control points (GCPs) by automating the process. The main objectives included investigating (1) the application of data compression techniques for global navigation satellite system (GNSS) coordinates in generating matrix barcode representations and (2) the recovery of GNSS coordinates from matrix barcodes using a small UAS. GNSS coordinates were compressed using a base-36 encoding schema and encoded into QR code GCPs to reduce the number of alphanumeric characters required. Preliminary in-field testing demonstrated the reliability of recovering QR code GCPs from aerial imagery across various altitudes and exposure settings, with adjustments in exposure compensation mitigating altitude-related recoverability issues. Moreover, results indicated that the processing of aerial imagery into orthomosaic images did not compromise QR code recoverability. Further in-field testing identified QR code GCP background color as a key factor influencing recoverability, with darker colors generally improving recoverability. Statistical analysis validated altitude and background color as significant predictors of QR code GCP recoverability. Future research avenues include incorporating environmental factors such as solar radiation to improve statistical model fit. Overall, QR code GCPs offer a potential approach for automating photogrammetry workflows, reducing both time and labor associated with manual tagging.
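The base-36 idea, fewer characters per coordinate than decimal notation, can be sketched as a fixed-point encode/decode pair. The exact schema used in the study is not given, so the offsets, precision, and separator below are assumptions for illustration:

```python
def encode_coord(lat, lon, precision=6):
    """Pack a lat/lon pair into a compact base-36 string.
    Coordinates are scaled to fixed-point integers, offset to be
    non-negative, then written in base 36 (digits 0-9, A-Z)."""
    digits = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def to_base36(n):
        if n == 0:
            return "0"
        out = []
        while n:
            n, r = divmod(n, 36)
            out.append(digits[r])
        return "".join(reversed(out))

    scale = 10 ** precision
    # Offsets make both values non-negative before encoding.
    lat_i = round((lat + 90.0) * scale)
    lon_i = round((lon + 180.0) * scale)
    return to_base36(lat_i) + "-" + to_base36(lon_i)

def decode_coord(text, precision=6):
    """Invert encode_coord: base-36 digits back to lat/lon."""
    lat_s, lon_s = text.split("-")
    scale = 10 ** precision
    return int(lat_s, 36) / scale - 90.0, int(lon_s, 36) / scale - 180.0
```

Fewer characters per GCP means a lower-density QR code, which is what makes recovery from high-altitude aerial imagery easier.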
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 130530B (2024) https://doi.org/10.1117/12.3013501
Precision agriculture, an approach that applies scientific and technological advances to enhance agricultural production, usually starts with crop line detection. Crop line detection supports the mapping of crop fields, which is useful for managing agricultural resources (water, fertilizer, pesticides, etc.), crop yield estimation, autonomous harvesting and irrigation management, disease and pest control, weed detection, controlled monitoring by autonomous machines, and so forth. Although the aim of crop line detection in this inquiry is weed detection, which can aid farmers in the optimal usage of herbicides in the field, it can be extended to any precision agriculture study. In this study, two different methods are employed for crop line detection: Hough transformation and pixel/frequency counting. The study was conducted on a 1.2-ha corn field from 2020 to 2023, covering the crop period of corn (April to August). More than 7000 high-spatial-resolution RGB images were collected using a GoPro camera attached to a custom-made unmanned aerial vehicle. Around 10% of these images were randomly selected for this analysis. RGB image frames were extracted from the video files and organized according to their weekly growth timeline. The normalized excess green vegetation index is calculated to convert them into two-level binary images, and a 2D Fourier transform is used to find the average crop line angle. By comparing the crop lines detected by both procedures with the actual crop lines present in the respective image frames, confusion matrix information is constructed for performance evaluation. The average accuracy of crop line detection is 87.79% for the Hough transformation and 95.71% for pixel counting, making both promising choices for crop line detection.
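The binarization step, converting RGB frames to two-level images via the normalized excess green index, can be sketched per pixel; the vegetation threshold below is illustrative, not the study's value:

```python
def excess_green_mask(pixels, threshold=0.1):
    """Binarize RGB pixels with the normalized excess green index,
    ExG = 2g - r - b, computed on chromaticity coordinates (each
    channel divided by the channel sum).  Vegetation pixels (ExG
    above the threshold) map to 1, soil/background to 0."""
    mask = []
    for R, G, B in pixels:
        total = R + G + B
        if total == 0:
            mask.append(0)  # pure black: treat as background
            continue
        r, g, b = R / total, G / total, B / total
        exg = 2 * g - r - b
        mask.append(1 if exg > threshold else 0)
    return mask
```

The resulting binary image is what both the Hough transform and the pixel/frequency counting operate on to recover crop rows.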
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 130530C (2024) https://doi.org/10.1117/12.3021207
Citrus greening disease (HLB) and citrus canker are diseases afflicting Florida citrus groves, causing financial losses through smaller fruits, blemishes, premature fruit drop, and/or eventual tree death. Often, symptoms of these diseases resemble those of other defects or infections. Early detection of HLB and canker via in-grove leaf inspection can permit more effective mitigation tactics and more intelligent management of groves. Autonomous, vision-based disease scouting in a grove offers a financial benefit to the Florida citrus industry. This study investigates the potential of hyperspectral reflectance imagery (HSI) for detecting and classifying these conditions in the presence of other, less consequential leaf defects. Both sides of leaves with visible symptoms of HLB, canker, zinc deficiency, scab, melanose, and greasy spot, along with a control set, were collected and imaged with a line-scan hyperspectral camera. Spectral bands from this imagery were selected using two methods: an unsupervised method based on principal component analysis (PCA), and a supervised method based on linear discriminant analysis (LDA). Using the selected bands, the YOLOv8 network architecture was trained to classify each side of these leaves. LDA-selected bands from the back of the leaves yielded an overall classification accuracy of 84.23%. Leaves with HLB and zinc deficiency were classified most accurately, with F1 scores of 0.977 and 0.953, respectively. On the back side of the leaf, recall of melanose was significantly improved by using the LDA bands. These findings favor the use of supervised band selection for HSI-based in-grove disease detection.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 130530D (2024) https://doi.org/10.1117/12.3013829
This paper investigates the integration of the Kalman filter with fluorescence analysis in biomedical imaging, a synergy that holds the promise of advancing diagnostic accuracy and enhancing research methodologies in the study of biological systems. Employing a rigorous bibliometric analysis through VOSviewer, we explore the key trends, influential research clusters, and seminal publications that have marked the evolution of this interdisciplinary field. The Kalman filter, renowned for its predictive capabilities in real-time signal processing, emerges as a crucial tool for improving the signal-to-noise ratio in fluorescence imaging, thereby facilitating the extraction of more accurate and meaningful data from complex biological phenomena. Our analysis reveals a dynamic and growing research landscape, where methodological advancements and computational challenges intersect with practical applications in biomedical imaging. By highlighting the significant contributions and identifying areas ripe for future investigation, this study underscores the potential of Kalman filter-enhanced fluorescence analysis to revolutionize biomedical diagnostics and imaging, offering new insights into cellular and molecular processes. Through this synthesis, we aim to provide a comprehensive overview of the current state of the art and to chart a course for the next wave of innovations in the field.
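For a slowly varying fluorescence signal, the scalar (random-walk) Kalman filter that underlies this kind of signal-to-noise improvement fits in a few lines. The noise variances below are illustrative values, not taken from any cited study:

```python
def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a random-walk state model, suitable
    for denoising a slowly varying signal.  q is the process noise
    variance, r the measurement noise variance, x0/p0 the initial
    state estimate and its variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: state unchanged, uncertainty grows by q.
        p += q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x += k * (z - x)
        p *= (1 - k)
        estimates.append(x)
    return estimates
```

Fed a noisy intensity trace, the estimate converges toward the underlying signal while damping frame-to-frame measurement noise.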
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX, 130530E (2024) https://doi.org/10.1117/12.3016798
In recent years, there has been significant interest from the scientific community in precision agriculture assisted by Unmanned Aerial Vehicles (UAVs). This paper introduces a novel system that addresses the challenge of charging-station management in a UAV network. It proposes a master/slave model in which a master station is connected to several slave stations that communicate with each other using a lightweight application protocol such as MQTT. Specifically, the paper proposes a smart station design aimed at effectively managing UAV fleets through task allocation and activity scheduling. In this way, the fleet of UAVs can recharge their batteries during the mission, guaranteeing completion of the assigned tasks and facilitating an on-demand energy supply to UAVs that need recharging during the mission.
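The master station's allocation decision, choosing which slave station should serve a UAV that requests a recharge, can be sketched independently of the MQTT transport. The nearest-free-pad rule below is an illustrative policy, not the paper's scheduler:

```python
def assign_station(uav_pos, stations):
    """Sketch of master-station allocation logic: pick the nearest
    slave charging station with a free pad for a UAV requesting a
    recharge.  `stations` maps station id -> ((x, y), free_pads).
    Returns None when every station is fully occupied."""
    best, best_d = None, None
    for sid, ((sx, sy), free) in stations.items():
        if free <= 0:
            continue  # no free pad at this slave station
        d = (sx - uav_pos[0]) ** 2 + (sy - uav_pos[1]) ** 2
        if best is None or d < best_d:
            best, best_d = sid, d
    return best
```

In the proposed system this decision would be published back to the UAV over MQTT (e.g. via a broker the master and slave stations share).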