Supporting Smart Agriculture through Crop Health Monitoring Using Deep Learning Techniques (CNN)

Authors

  • Aiman Ahmad, Department of Software Engineering, College of Information Technology, University of Gharyan, Gharyan, Libya
  • Hatim Almabrouk, Department of Software Engineering, College of Information Technology, University of Gharyan, Gharyan, Libya
  • Kais Rhoma, Department of Software Engineering, College of Information Technology, University of Gharyan, Gharyan, Libya
  • Ahmed Brany, Department of Software Engineering, College of Information Technology, University of Gharyan, Gharyan, Libya

DOI:

https://doi.org/10.65421/jshd.v2i1.136

Keywords:

Diagnosis of Plant Diseases, MobileNetV3-Large, EfficientNet-B3, Transfer Learning, Flutter, Smart Agriculture

Abstract

This study develops an intelligent plant disease diagnosis system using deep learning and transfer learning techniques within the PyTorch framework. Models were trained on 21,481 images from the PlantVillage dataset covering 19 crop categories, including grapes, peppers, potatoes, and tomatoes. Two convolutional neural network architectures, EfficientNet-B3 and MobileNetV3-Large, were compared in terms of accuracy and computational efficiency. MobileNetV3-Large achieved the best performance, reaching 99.31% accuracy with fewer parameters and a shorter training time, making it better suited to mobile deployment. The final model was integrated into a Flutter-based mobile application that provides instant plant disease diagnosis, treatment recommendations, preventative measures, and detailed disease information, making it an effective digital tool for supporting smart agriculture and enhancing agricultural productivity.

Published

2026-04-02

Issue

Section

Articles

How to Cite

Supporting Smart Agriculture through Crop Health Monitoring Using Deep Learning Techniques (CNN). (2026). Journal of Scientific and Human Dimensions, 2(1), 882-896. https://doi.org/10.65421/jshd.v2i1.136