SCIENTIFIC REPORTS, pp. 741-765, 2025 (SCI-Expanded, Scopus)
Olive production holds a significant position in global agricultural
trade. Beyond seasonal climatic conditions, olive yield is also affected
by agricultural diseases and pests.
Peacock spot disease and the olive bud mite (Aculus olearius) are the
primary threats to olive production; both cause characteristic lesions
on the leaves of olive trees. Artificial intelligence approaches such as
deep learning and machine learning are increasingly used for early
detection of such adverse conditions.
However, the high computational cost of training and running deep
learning models for classification and detection limits the
accessibility of these algorithms for many businesses. To address this
challenge, this study proposes a hybrid framework that combines the
robust feature-extraction capabilities of deep learning models with the
computational efficiency of machine-learning classifiers. Specifically,
the study analyzes the performance of this combined approach and
compares the results with those of existing deep learning studies in the
literature.
literature. As feature extraction deep learning models, MobileNetV2,
DenseNet121, EfficientNetV2B0, and ConvNext Tiny were selected, while
AdaBoost, XGBoost, LightGBM, CatBoost, and Gradient Boosting algorithms
from the Boosting family were included as classifiers. In the model
training, a dataset consisting of 3,400 images of olive leaves belonging
to three classes—healthy, olive_peack_spot, and aculus_olearius—was
used. The experimental results showed that the DenseNet121 + XGBoost
combination achieved a baseline accuracy of 92%. After a data
augmentation phase enriched the training data, model performance
improved markedly, reaching a final accuracy of 94% and a macro-average
F1-score of 94%. The Wilcoxon signed-rank test revealed that
the DenseNet121 + XGBoost combination significantly outperformed the
second-best model (p < 0.05). This performance is attributed
to the dense connectivity of DenseNet, which promotes effective feature
reuse and improves gradient flow. Furthermore, the study demonstrates
that a higher number of parameters does not always guarantee better
performance; rather, architectural efficiency plays a crucial role in
avoiding overfitting and ensuring model robustness, as evidenced by
DenseNet121 outperforming the larger ConvNeXt Tiny model.