AUTHOR=Yadav Pappu Kumar, Burks Thomas, Frederick Quentin, Qin Jianwei, Kim Moon, Ritenour Mark A. TITLE=Citrus disease detection using convolution neural network generated features and Softmax classifier on hyperspectral image data JOURNAL=Frontiers in Plant Science VOLUME=13 YEAR=2022 URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2022.1043712 DOI=10.3389/fpls.2022.1043712 ISSN=1664-462X ABSTRACT=

Identification and segregation of citrus fruit with diseases and peel blemishes are required to preserve market value. Previously developed machine vision approaches could only distinguish cankerous from non-cankerous citrus, whereas this research focused on detecting eight different peel conditions on citrus fruit using hyperspectral imaging (HSI) and an AI-based classification algorithm. The objectives of this paper were: (i) to select the five most discriminating bands among the 92 available using principal component analysis (PCA), (ii) to train and test a custom convolutional neural network (CNN) model for classification with the selected bands, and (iii) to compare the CNN's performance using the five PCA-selected bands against five randomly selected bands. A hyperspectral imaging system from earlier work was used to acquire reflectance images in the spectral region from 450 to 930 nm (92 spectral bands). Ruby Red grapefruits with normal peel, canker, and five other common peel conditions, namely greasy spot, insect damage, melanose, scab, and wind scar, were tested. A novel CNN based on the VGG-16 architecture was developed for feature extraction, with a Softmax layer for classification. The PCA-based bands were found to be 666.15, 697.54, 702.77, 849.24, and 917.25 nm, which resulted in an average accuracy, sensitivity, and specificity of 99.84%, 99.84%, and 99.98%, respectively. However, ten trials with five randomly selected bands resulted in only slightly lower performance, with accuracy, sensitivity, and specificity of 98.87%, 98.43%, and 99.88%, respectively. These results demonstrate that an AI-based algorithm can successfully classify eight different peel conditions. The findings reported herein can be used as a precursor to develop a machine vision-based, real-time peel condition classification system for citrus processing.
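
As a rough illustration of the band-selection step described in the abstract, the sketch below ranks 92 simulated spectral bands by the magnitude of their PCA loadings and keeps the top five. The synthetic reflectance cube, the loading-magnitude ranking heuristic, and the array shapes are assumptions for illustration only, not the authors' exact procedure.

```python
# Illustrative sketch: PCA-based selection of discriminating bands from a
# hyperspectral cube. Synthetic data and the ranking heuristic are assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical reflectance cube: height x width x 92 bands (450-930 nm).
cube = rng.random((64, 64, 92))
wavelengths = np.linspace(450, 930, 92)

# Flatten spatial dimensions so each pixel spectrum is one observation.
pixels = cube.reshape(-1, cube.shape[-1])

# Fit PCA on the pixel spectra and score each band by the summed magnitude
# of its loadings on the leading principal components.
pca = PCA(n_components=5)
pca.fit(pixels)
scores = np.abs(pca.components_).sum(axis=0)   # one importance score per band
top5 = np.sort(np.argsort(scores)[-5:])        # indices of the 5 highest-scoring bands

print("Selected bands (nm):", np.round(wavelengths[top5], 2))
```

Likewise, a minimal VGG-style CNN feeding a Softmax output over eight peel-condition classes could be sketched in Keras as below, assuming 64x64 image patches with the five selected bands as input channels. The layer widths, patch size, and training settings are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch of a VGG-style CNN with a Softmax classifier for eight
# peel-condition classes; layer sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(64, 64, 5), n_classes=8):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Two VGG-like blocks: stacked 3x3 convolutions followed by pooling.
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        # Fully connected feature layer, then Softmax over the classes.
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
model.summary()
```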