Browsing by Author "Wang, Jinfei"
Item: Crop field monitoring and damage assessment with unmanned aircraft systems and machine learning (University of New Brunswick, 2021-11)
Vlachopoulos, Odysseas; Leblon, Brigitte; Wang, Jinfei
The purpose of this dissertation is to examine and develop novel Machine Learning (ML) pipelines for crop field monitoring and damage assessment using Unmanned Aircraft Systems (UAS) equipped with a multispectral MicaSense RedEdge optical sensor, for precision agriculture and insurance purposes. The crop fields were prepared for or planted with barley, corn, potato, oat, and soybean. The multispectral imagery from the UAS was radiometrically corrected and mosaicked, and the reflectance orthomosaics from each surveyed field were used as input features in various algorithms, along with associated vegetation index rasters. Firstly, field areas and boundaries were delineated over multiple bare-soil fields with two ML pipelines: a supervised pixel-based Random Forests (RF) classifier and an unsupervised clustering process using the Mean Shift algorithm. Vectorizing the resulting maps gave a mean Area Goodness of Fit (AGoF) greater than 99% and a mean Boundary Mean Positional Error (BMPE) lower than 0.6 m, indicating that both pipelines delineate bare-soil field boundaries accurately. Secondly, fields fully planted with barley, corn, and oat were surveyed to delineate crop areas and boundaries using Pixel-Based Image Analysis (PBIA) and Geographic Object-Based Image Analysis (GEOBIA) with the RF classifier. Both methodologies were highly successful, with a mean AGoF greater than 98% and a mean BMPE lower than 0.8 m. Thirdly, lodging damage on barley fields was mapped from two UAS surveys; an RF model classified lodged and standing barley with an overall validation accuracy of 99.7%, an average AGoF of 97.95%, and an average BMPE of 0.235 m. Finally, crop health status was assessed through the Green Area Index (GAI) for barley and oat fields. Multiple Linear Model, Support Vector Machine, RF, and Artificial Neural Network regression algorithms were used to produce GAI maps of the fields, with RF performing best for GAI prediction. The GAI maps and the regression feature space were then used with an RF classifier to generate health status maps of the crop fields with a mean overall accuracy of 94%.
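The pixel-based RF pipeline summarized above stacks the MicaSense band reflectances with vegetation-index rasters and classifies each pixel of the orthomosaic. The sketch below is a minimal illustration of that idea using scikit-learn with synthetic reflectance data; the band order, the use of NDVI as the only index, and all array names are assumptions for illustration, not the dissertation's actual feature set or code.

```python
# Minimal sketch of a pixel-based Random Forest classification of a
# multispectral orthomosaic, in the spirit of the pipeline above.
# Assumptions: a 5-band MicaSense-like reflectance stack, NDVI as the only
# added vegetation index, and synthetic labels standing in for training data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic 5-band reflectance "orthomosaic": (rows, cols, bands)
rows, cols = 100, 100
bands = rng.uniform(0.0, 0.6, size=(rows, cols, 5))  # assumed order: blue, green, red, red-edge, NIR
red, nir = bands[..., 2], bands[..., 4]

# Vegetation index raster (NDVI) appended as an extra feature layer
ndvi = (nir - red) / (nir + red + 1e-9)
features = np.dstack([bands, ndvi]).reshape(-1, 6)   # one feature row per pixel

# Synthetic per-pixel training labels (0 = background, 1 = field);
# in practice these would come from digitized reference polygons.
labels = (ndvi.reshape(-1) > 0.0).astype(int)

# Train on a random subset of pixels, then predict the full class map
train_idx = rng.choice(features.shape[0], size=2000, replace=False)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(features[train_idx], labels[train_idx])
class_map = rf.predict(features).reshape(rows, cols)
```

In the dissertation's workflow, the predicted class map would then be vectorized so that AGoF and BMPE can be computed against reference field boundaries; the toy labels above stand in for that reference data.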
Item: Detection of cucumber powdery mildew and potato late blight using hyperspectral data and multispectral imagery (University of New Brunswick, 2021-10)
Fernández Espinosa, Claudio Ignacio; Leblon, Brigitte; Wang, Jinfei
Plant pathogens are responsible for considerable yield and quality loss in crop production; therefore, new technologies leading to fast detection of disease presence during the early stages of infestation are necessary to reduce economic loss. This research analyzed the spectral changes due to potato late blight in potato plants (Solanum tuberosum) at the leaf and canopy levels and due to cucumber powdery mildew in cucumber plants (Cucumis sativus) at the leaf level. Daily reflectance spectra were collected between 400 and 900 nm, from the time of inoculation until seven days post-inoculation (DPI), using an Analytical Spectral Device (ASD). For the late blight experiment, data were acquired over 124 healthy and 101 infected leaflets, while at the canopy level we collected 32 spectra from healthy plants and 32 spectra from infected plants. For the powdery mildew experiment, we acquired spectra over 71 healthy and 57 infected leaves. The first symptoms appeared at 3 DPI for late blight and at 6 DPI for powdery mildew. Spectral changes were more substantial for late blight at the leaf level than at the canopy level. The spectra were used to compute MicaSense RedEdge band reflectances to assess whether MicaSense multispectral images can be used for detecting both diseases. The near-infrared (NIR) reflectance was not sensitive to either disease. All other MicaSense RedEdge band reflectances showed a significant spectral ratio variation at 3 DPI for late blight at the leaf level, and at 4 DPI for late blight at the canopy level and for powdery mildew. Among the 16 vegetation indices considered in the study, the best indices for both diseases were the Redness Index and the Red-Edge Chlorophyll Index. For powdery mildew, we defined a new vegetation index that performed better than the Normalized Difference Vegetation Index (NDVI). We also studied two red-edge parameters, the red-well point (RWP) and the red-edge point (REP): the RWP shifted towards longer wavelengths with both diseases, while the REP shifted towards shorter wavelengths. The RWP changes followed a linear model for both diseases, whereas the REP changes were modeled with an exponential model for late blight and a linear model for powdery mildew. The spectral information was then used to classify healthy and infected leaves or plants. During the pre-symptom period of late blight, a trained Partial Least Squares Discriminant Analysis (PLS-DA) classifier separated spectra from healthy and infected samples at 2 DPI with an overall accuracy of 77.77% at the leaf level and 78.12% at the canopy level. A trained Support Vector Machine (SVM) classifier applied to the reflectance at 668, 705, 717, and 740 nm for the post-symptom period achieved an overall accuracy of 93.33% at 6 DPI at the leaf level and 89.06% at 5 DPI at the canopy level. A trained SVM applied to the MicaSense band reflectances for powdery mildew achieved an overall accuracy of 93.75% at 4 DPI for the pre-symptom period and 98.44% at 7 DPI for the post-symptom period. For powdery mildew, we also tested the use of MicaSense images in greenhouse conditions. The unregistered images, acquired at a close distance from the plants, were registered using a Speeded-Up Robust Features (SURF) method that applied an affine geometric transformation. Illumination differences during image acquisition under natural conditions were corrected using a Principal Component Analysis (PCA) color-domain approach. The corrected images were then used to produce red-green-blue (RGB) composites and to compute vegetation indices. These features were input to an SVM classifier trained with 700 healthy and 700 infected pixels and validated on 300 healthy and 300 infected pixels. With the RGB composite, we achieved an overall accuracy of 89% on the validation dataset for classifying healthy pixels and pixels with powdery mildew signs.
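As a rough illustration of the band-simulation and SVM classification steps described in this abstract, the sketch below averages leaf spectra over approximate MicaSense RedEdge band windows and trains a scikit-learn SVM on the resulting band reflectances. The band centres and widths are approximate, and the spectra, labels, and all names are synthetic placeholders rather than the thesis data or code.

```python
# Minimal sketch: derive MicaSense-like band reflectances from leaf-level
# spectra (400-900 nm, as in the study) and classify healthy vs. infected
# samples with an SVM. Band centres/widths are approximate assumptions;
# the spectra and labels are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
wavelengths = np.arange(400, 901)                 # 1 nm sampling over the ASD range used

# Synthetic reflectance spectra: 128 "healthy" and 128 "infected" leaves
healthy = 0.30 + 0.05 * rng.standard_normal((128, wavelengths.size))
infected = 0.27 + 0.05 * rng.standard_normal((128, wavelengths.size))
spectra = np.vstack([healthy, infected])
labels = np.array([0] * 128 + [1] * 128)          # 0 = healthy, 1 = infected

# Approximate MicaSense RedEdge band centres and widths in nm (assumed values)
bands = {"blue": (475, 20), "green": (560, 20), "red": (668, 10),
         "red_edge": (717, 10), "nir": (840, 40)}

def band_reflectance(spec, centre, width):
    """Average each spectrum over a simple rectangular band window."""
    mask = (wavelengths >= centre - width / 2) & (wavelengths <= centre + width / 2)
    return spec[:, mask].mean(axis=1)

features = np.column_stack([band_reflectance(spectra, c, w) for c, w in bands.values()])

# SVM classifier on the simulated band reflectances
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("validation accuracy:", clf.score(X_test, y_test))
```

An RBF-kernel SVC stands in here for the thesis's SVM classifier; the accuracies reported in the abstract come from the actual leaf and canopy datasets, not from this synthetic example.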