INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XIV, Issue X, October 2025
www.ijltemas.in Page 76
Classification of Corn Leaf Diseases Using Convolutional Neural
Network
Adrales, Lorelyn F., Agreda, Dale Anthony B., Estores, Ivan James G.*, Provido, Olsen John Gabriel M., Wata, Dustin T.
College of Engineering, Architecture and Technology, Notre Dame of Dadiangas University, General Santos City,
Philippines
*Corresponding Author
DOI: https://doi.org/10.51583/IJLTEMAS.2025.1410000010
Received: 02 Oct. 2025; Accepted: 08 Oct. 2025; Published: 28 October 2025
Abstract: The study employs a deep learning model based on a Convolutional Neural Network (CNN), specifically the ResNet34 model,
for detecting common corn leaf diseases in the municipality of Polomolok and General Santos City, Philippines. Corn, a vital staple
crop and key economic commodity, is highly susceptible to diseases that threaten productivity and food security, making accurate
and efficient detection crucial for effective management and yield optimization. The dataset used in the study comprises 27,146
images categorized into five classifications: Brown Spot, Corn Rust, Leaf Blight, Maize Streak Virus, and Healthy leaves. To
improve model generalization and address class imbalance, data augmentation techniques of Basic Image Manipulation such as
rotation, scaling, flipping, shearing, and color transformation were applied. The proposed model achieved an overall classification
accuracy of 98.67%, with consistently high precision, recall, and F1-scores across all categories, as further validated through
confusion matrix analysis that confirmed its strong performance in distinguishing between disease classes. To extend practical
utility, an initial version of a mobile application called LeafScan was developed, which allows users to take or upload corn leaf
images for disease prediction. These findings demonstrate the effectiveness of deep learning in agriculture, offering a reliable and
scalable tool for disease classification and proactive crop management. The study provides a foundation for implementing AI-
driven agricultural solutions in the Philippines. Future work could focus on developing a fully functional, user-tested mobile
expert system application for real-time disease detection and on expanding the model to include additional corn diseases for broader
applicability in precision farming.
I. Introduction
Agriculture is a foundation of the global economy, providing livelihoods for billions while ensuring food security. However,
challenges such as climate change, pests, and plant diseases threaten crop yields, with plant diseases alone causing yield losses
ranging from approximately 5% to over 40%, depending on crop type, disease severity, and management effectiveness [1][2]. As
the global population increases, sustainable food production becomes progressively vital, necessitating innovative strategies to
optimize yields and minimize environmental impact. Smart agricultural practices, particularly early disease detection and
management, play a crucial role in enhancing productivity and securing global food supplies [3].
Corn, a vital staple crop, is highly susceptible to pathogens, affecting yield at all growth stages. Beyond direct consumption, it is
used in products like oil, starch, flour, and biofuel. Disease outbreaks can cause significant losses, leading to food shortages and
financial harm [4]. To address this, researchers have leveraged deep learning models, particularly Convolutional Neural Networks
(CNNs), for disease detection in corn leaf images. Studies have explored various CNN architectures, including AlexNet, VGG16,
VGG19, GoogleNet, Inception-V3, ResNet50, and ResNet101, alongside machine learning methods for classifying corn leaf
diseases [5].
Agriculture is vital to the Philippine economy, employing 40% of the workforce and contributing 20% to the GDP. However, crop
prediction remains a challenge due to diseases, pests, climate change, and environmental factors, affecting farmers' decisions and
livelihoods [6].
Corn is cultivated across 2.5 million hectares, yielding about eight million metric tons yearly, and serves as food, livestock feed,
and raw material for industrial products [7]. To enhance agricultural sustainability and increase farmers' incomes, the Philippines
is integrating advanced technologies, including the Internet of Things (IoT) and Wireless Sensor Networks. Climate-smart practices
such as planting climate-resistant rice, Sloping Agricultural Land Technology (SALT) for soil conservation, and the System of Rice
Intensification (SRI) are being adopted to improve productivity [8].
Region 12 ranks third in white corn production, contributing 11% of the national output, and second in yellow corn production at
18% [7]. However, regional crop analysis shows that irrigated palay remains the dominant crop, with success rates of 52.18%
compared to 32.67% for yellow corn and 8.63% for white corn [6]. These findings emphasize the need for further research to
enhance agricultural practices and improve production efficiency.
This study aims to develop an artificial intelligence solution using CNNs for classifying corn leaf diseases. Focusing on Polomolok
and General Santos City, the research seeks to empower farmers with a valuable tool to enhance disease management, ultimately
increasing crop yields and promoting agricultural sustainability.
II. Literature Review
Convolutional Neural Network (CNN)
CNN has proven to be a highly effective deep learning architecture for image classification, particularly in the detection of corn
leaf diseases. By utilizing convolutional layers for feature extraction, pooling layers to reduce complexity, and fully connected
layers for final predictions, CNNs excel in recognizing intricate visual patterns in agricultural applications [9]. Studies employing
CNN-based models like ResNet34 have demonstrated impressive classification accuracy, reaching up to 97.6% when identifying
diseases such as blight, common rust, and gray leaf spot [10]. Similarly, applications like "MaizeCheck" leverage CNNs to provide
farmers with a rapid and precise method for monitoring crop health, enhancing disease management, and improving agricultural
productivity [11].
To further refine CNN-based disease detection, researchers have explored transfer learning, which enables models to apply
knowledge acquired from extensive datasets to specific agricultural challenges [12]. This technique effectively mitigates data
scarcity and overfitting issues, as demonstrated by studies that fine-tuned pre-trained CNNs on the PlantVillage dataset, achieving
improved classification performance [13]. These advancements further strengthen CNN-based models, improving their reliability
and adaptability in agricultural applications. CNNs continue to play a crucial role in deep learning-driven agriculture, providing
scalable and efficient solutions for automated disease identification and supporting sustainable farming practices.
Corn Leaf Diseases
Different regions where crops are grown have unique water and soil environments and exhibit differences in temperature and
humidity during growing seasons, all of which can affect the pathogen attack process and eventual symptom expression. Hence, the
diseases most commonly encountered differ with local environmental factors [14]. The following are the diseases most commonly
found in the municipality of Polomolok and in General Santos City:
Brown Spot is caused by the fungus Physoderma maydis. Symptoms of brown spot usually appear on mid-canopy leaves. Leaf
lesions are numerous, tiny, round to oval, yellowish to brown in color, and usually occur in broad bands across the leaf [15].
Corn Rust, or common rust, caused by the fungus Puccinia sorghi, is characterized by small, tan spots that develop into elongated,
brick-red to cinnamon-brown pustules. The disease is favored by cool temperatures (16-23°C) and high relative humidity (near
100%). Corn rust can reduce the functional leaf area and photosynthesis of the corn plant [16].
Leaf Blight, a significant fungal foliar disease affecting corn globally, is caused by either Exserohilum turcicum (Northern Corn
Leaf Blight) or Cochliobolus heterostrophus (Southern Corn Leaf Blight). Characterized by large, cigar-shaped, grayish-tan lesions
on leaves, the disease thrives in cool to moderate temperatures and high humidity. In advanced stages, lesions produce numerous
spores, giving them a darker appearance. This disease directly impacts photosynthesis, leading to reduced grain filling and potential
yield losses of 15-20%, with severe cases exceeding 40% loss [17][18].
Maize Streak Virus (MSV) is a prevalent plant disease that primarily affects corn crops, causing significant yield losses. It is
transmitted by leafhoppers, specifically the species Cicadulina mbila. MSV can also infect over 80 other grass species, highlighting
its broad host range and potential impact on various agricultural systems [19].
Mobile Application Development
Studies have optimized ResNet34 for mobile applications, achieving validation accuracies of 86-95% while enhancing
computational efficiency. For tea leaf disease recognition, an improved ResNet34 reached 94.67% test accuracy with a 37.21%
reduction in iteration time [20], while a CBAM-ResNet34 model for strawberry classification achieved 92.36% validation and
87.56% testing accuracy [21].
III. Methodology
The objective of this study is to utilize the ResNet34 model for the classification of corn leaves into five categories: Brown Spot,
Corn Rust, Healthy, Leaf Blight, and Maize Streak Virus (MSV). The pretrained model is fine-tuned and regularized to optimize accuracy
and prevent overfitting. Data augmentation techniques are applied to enhance variability, and the model is trained on a dataset of
27,146 images. Methods and architecture are discussed below.
Data Augmentation
Data augmentation techniques were implemented to simulate real-world variations in image orientation, scale, and lighting. Further,
oversampling was used to expand and balance the dataset, ensuring fair representation across all five classifications. The
preprocessing and augmentation methods included the following techniques:
Resizing. A preprocessing technique that adjusts image dimensions while maintaining aspect ratio or fitting a required size. In deep
learning, resizing ensures images match the network’s expected input dimensions, such as 224 × 224 pixels, standard for CNNs
[22]. It standardizes input data, improves computational efficiency, and ensures consistency.
Rotation. Rotating images at ±20 degrees enhances model robustness against real-life variations [23].
Shift Transformation. Pans images in any direction, filling boundaries with edge pixels or zero values. The applied translation range
was ±15% of image dimensions [24].
Shear Transformation. Alters image geometry along a specific direction, simulating perspective changes and object deformations.
The applied shear range was ±12 degrees [25].
Zoom Transformation. Enlarges or shrinks images to modify scale, simulating distance variations. The scaling factor ranged from
0.85 to 1.15 [26].
Flip Transformation. Reflects images along vertical or horizontal axes, improving generalization. The flipping probability was 50%
for both directions [27].
Color Transformation. Adjusts brightness (±40%), contrast and saturation (±20%), and hue (±10%) to simulate varied lighting
conditions, enhancing model robustness [28].
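Assuming a standard PyTorch/torchvision workflow (the paper does not publish its code), the augmentation parameters above could be expressed as a transform pipeline roughly as follows. This is an illustrative sketch, not the authors' implementation; the mapping of each reported value to a torchvision parameter is an assumption.

```python
import torchvision.transforms as T

# Sketch of the augmentation pipeline using the parameters reported in the text:
# ±20° rotation, ±15% translation, ±12° shear, 0.85-1.15 zoom,
# 50% flip probability, and brightness/contrast/saturation/hue jitter.
train_transforms = T.Compose([
    T.Resize((224, 224)),                      # standard CNN input size
    T.RandomAffine(
        degrees=20,                            # rotation: ±20 degrees
        translate=(0.15, 0.15),                # shift: ±15% of image dimensions
        shear=12,                              # shear: ±12 degrees
        scale=(0.85, 1.15),                    # zoom: 0.85x to 1.15x
    ),
    T.RandomHorizontalFlip(p=0.5),             # flips applied with 50% probability
    T.RandomVerticalFlip(p=0.5),
    T.ColorJitter(brightness=0.4, contrast=0.2,
                  saturation=0.2, hue=0.1),    # color transformation ranges
    T.ToTensor(),
])
```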
Figure 1. Original and Augmented Images Comparison
Figure 1 presents a comparison between the original images and their augmented counterparts using various Basic Image
Manipulation (BIM) techniques previously described. The augmentations are applied randomly to introduce diverse variations in
the training dataset, enhancing the model's ability to generalize. This visualization highlights how each augmentation contributes
to increasing data diversity, which is crucial for improving the robustness and accuracy of the CNN model.
Table I. Dataset Total Distribution Per Classification
Classes Brown Spot Corn Rust Healthy Leaf Blight Maize Streak
Count 5,290 5,506 5,374 5,506 5,470
Total 27,146 Corn Leaf Images
Table I shows how the collected raw images were augmented to generate 5,290 Brown Spot images, 5,506 Corn Rust images, 5,374
Healthy images, 5,506 Leaf Blight images, and 5,470 Maize Streak images. This augmentation process resulted in a total of 27,146
corn leaf images in the dataset.
ResNet34
ResNet34 offers a balanced approach between depth and computational efficiency. It consists of 16 residual blocks arranged into
four groups with increasing feature channels (64, 128, 256, and 512). The network starts with a 7×7 convolutional layer followed
by a 3×3 max pooling layer for initial feature extraction, and each residual block uses two 3×3 convolutional layers with batch
normalization and ReLU activation. With about 21.8 million trainable parameters, ResNet34 delivers strong performance without
high computational demands, making it well-suited for image classification tasks like corn leaf disease detection [29].
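A minimal sketch of this setup, assuming the torchvision implementation of ResNet34 (the paper does not name a specific library): the backbone is loaded and its 1000-class ImageNet head replaced with a five-class head.

```python
import torch.nn as nn
from torchvision.models import resnet34

# Load the 34-layer residual network described above
# (16 residual blocks in four groups of 64, 128, 256, and 512 channels).
model = resnet34(weights=None)  # weights="IMAGENET1K_V1" for transfer learning

# With the original 1000-class head, the network has roughly 21.8M
# parameters, matching the figure cited in the text.
n_params = sum(p.numel() for p in model.parameters())

# Replace the ImageNet head with a five-class head for
# Brown Spot, Corn Rust, Healthy, Leaf Blight, and MSV.
model.fc = nn.Linear(model.fc.in_features, 5)
```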
Figure 2. ResNet34 Model Architecture [30]
The process started with data preprocessing, where a varied dataset of corn leaf images was prepared by resizing, normalizing, and
augmenting the images [28]. For model training, the ResNet34 model was employed, and the data was divided into training, validation,
and testing subsets with a 70%, 15%, and 15% distribution, respectively [31]. The training process consisted of 20 epochs.
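The 70/15/15 split arithmetic can be sketched as follows (a minimal illustration; the exact counts per subset and any stratification strategy are not reported in the paper, so the flooring behavior here is an assumption):

```python
def split_sizes(n_images, train_frac=0.70, val_frac=0.15):
    """Compute train/validation/test counts for a 70/15/15 split.

    The test subset receives the remainder, so the three subsets
    always sum exactly to the dataset size.
    """
    n_train = int(n_images * train_frac)
    n_val = int(n_images * val_frac)
    n_test = n_images - n_train - n_val
    return n_train, n_val, n_test

# Applied to the 27,146-image dataset used in this study:
train_n, val_n, test_n = split_sizes(27146)
```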
Fine Tuning
Fine-tuning is a machine learning technique that refines a pre-trained model using a specific dataset, allowing it to adapt to new
tasks or improve performance on specialized problems. Instead of training from scratch, fine-tuning adjusts all model parameters
based on a smaller, task-focused dataset [32]. In this study, a pre-trained ResNet34 model was fully fine-tuned on corn leaf images,
updating every layer to capture unique disease patterns. This comprehensive approach enhanced both low-level feature extraction
and high-level classification, leading to improved generalization and more accurate disease detection [33].
Regularization
The researchers applied the following methods to counter overfitting, beginning with balancing the dataset through oversampling and
data augmentation. AdamW, an optimizer, improves model performance, accelerates training, and mitigates instability [34]. In this
study, AdamW was configured with a learning rate of 0.001 and a weight decay of 0.01, effectively constraining large weight values
in the model.
CosineAnnealingLR, a learning rate scheduler, dynamically adjusts the learning rate to guide the model toward an optimal solution
[28]. In the experiment, it was implemented over 20 epochs with a minimum learning rate of 1e-6.
Dropout, a method that deactivates a portion of neurons in each iteration to prevent reliance on specific features, improves
generalization in deep learning and enhances performance on unseen images [35]. This study applied a 0.4 dropout rate.
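As a sketch of this regularization setup, assuming a standard PyTorch training script (the authors' code is not published, and placing the dropout layer immediately before the final classifier is an assumption on our part):

```python
import torch.nn as nn
import torch.optim as optim
from torchvision.models import resnet34

# Pre-trained backbone, fully fine-tuned: all parameters remain trainable.
model = resnet34(weights="IMAGENET1K_V1")

# 0.4 dropout ahead of the five-class head, per the Regularization section.
model.fc = nn.Sequential(nn.Dropout(p=0.4), nn.Linear(512, 5))

# AdamW with lr=0.001 and weight decay 0.01 to constrain large weights.
optimizer = optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

# Cosine annealing over the 20 training epochs down to a floor of 1e-6.
scheduler = optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=20, eta_min=1e-6
)
```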
BIM techniques play a crucial role in enhancing the performance of CNNs by increasing the diversity of training data. Techniques
such as cropping, flipping, rotation, scaling, and color transformation introduce variations that help the model generalize better to
unseen data [23]. In addition, the researchers employed an oversampling technique to address the imbalanced class distribution
[36]. The raw dataset initially contained 1,576 images, distributed unevenly across the five classifications; it was expanded to
27,146 images, with each class containing approximately 5,290 to 5,506 images, ensuring a more balanced and diverse training set.
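A minimal sketch of the oversampling idea (illustrative only; the study pairs oversampling with augmentation rather than duplicating raw images verbatim, and the common target size used here is an assumption): each class's image list is cycled until it reaches the target count.

```python
from itertools import cycle, islice

def oversample(items, target):
    """Repeat items cyclically until the list reaches the target size."""
    return list(islice(cycle(items), target))

# Raw class counts from Table II, padded toward the augmented totals
# reported in Table I (every class expanded to a common target here).
raw_counts = {"Brown Spot": 269, "Corn Rust": 289, "Healthy": 215,
              "Leaf Blight": 404, "Maize Streak": 399}
balanced = {cls: oversample(list(range(n)), 5290)
            for cls, n in raw_counts.items()}
```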
This study presents a systematic approach to agricultural image classification using a pre-trained ResNet34 model, demonstrating
the power of transfer learning in addressing domain-specific challenges. By implementing sophisticated techniques such as
comprehensive data augmentation, advanced regularization, and strategic fine-tuning, the study overcomes critical limitations of
small agricultural datasets. The framework dramatically expanded the original dataset from 1,576 to 27,146 images through
intelligent augmentation and oversampling, ensuring balanced representation across disease classes.
The methodology leverages ResNet34's residual learning architecture to enable deep feature extraction with minimal computational
resources. Key optimizations including dropout, AdamW optimization, and CosineAnnealingLR were employed to enhance model
generalizability and mitigate overfitting. By fine-tuning all model layers and exposing the network to a diverse range of image
variations, the approach significantly improves classification accuracy and robustness, positioning deep learning as a promising
solution for agricultural monitoring and disease detection.
Datasets
Table II. Total Corn Leaves Images Collected
Classes Brown Spot Corn Rust Healthy Leaf Blight Maize Streak
Olympog 268 121 154 16 221
Lagao 0 0 0 130 0
Glamang 0 0 0 0 177
Landan 1 26 0 258 1
Mabuhay 0 142 61 0 0
Total 269 289 215 404 399
Table II shows the number of images collected for the five classifications. In General Santos City, Barangay Olympog gathered 268 Brown
Spot, 121 Corn Rust, 154 Healthy, 16 Leaf Blight, and 221 Maize Streak images. Barangay Lagao gathered 130 Leaf Blight images;
Barangay Mabuhay has 142 Corn Rust and 61 Healthy images. In Polomolok, Barangay Glamang gathered 177 Maize Streak
images, and Barangay Landan gathered 1 Brown Spot, 26 Corn Rust, 258 Leaf Blight, and 1 Maize Streak image. There are 1,576
raw images gathered: 269 Brown Spot, 289 Corn Rust, 215 Healthy, 404 Leaf Blight, and 399 Maize Streak.
Figure 3. The Five Corn Leaf Classifications
Figure 3 presents sample images of the five corn leaf disease classifications used in this study: Brown Spot, Corn Rust, Healthy,
Leaf Blight, and Maize Streak Virus (MSV). Each image illustrates distinct visual symptoms associated with the respective disease,
such as rust-colored pustules, gray lesions, and streak-like patterns. These images serve as reference for model training and
validation in the classification of diseased and healthy corn leaves.
The data collection resulted in 1,576 images that represent a diverse distribution of corn leaf diseases across various locations in
General Santos City and Polomolok. Notably, images for Banded Leaf and Sheath Blight (BLSB) and Downy Mildew were absent,
despite being commonly reported by agricultural experts. This absence could be due to factors such as seasonality, environmental
conditions, effective crop management practices like fungicide application and crop rotation, or the possibility that these diseases
are more localized in areas not included in the survey [37][38].
The image distribution reveals geographical variability, with certain diseases dominating specific areas—such as high counts of
Brown Spot and Maize Streak in Olympog, Leaf Blight in Barangay Landan of Polomolok, and Maize Streak Virus in Glamang. A
key concern is the slight imbalance in the dataset, with fewer Healthy samples compared to classes such as Leaf Blight and Maize
Streak, which could introduce bias in the CNN's performance. This imbalance was mitigated through a rigorous data augmentation
process, thereby enhancing the model's ability to generalize across all categories.
Ethical Considerations
This study involving farmers and agricultural experts has been conducted in accordance with established ethical guidelines and
principles. Prior to data collection, explicit informed consent was obtained from all participants. Participants were provided with
comprehensive information regarding the research objectives, data collection procedures, and the intended use of images and
findings. All participants were assured of their right to withdraw from the study at any time without penalty. Additionally, all
personal information collected from respondents has been maintained as strictly confidential and accessible only to the research
team.
The researchers declare that there are no conflicts of interest that could bias or compromise the integrity of this research. The
research is conducted independently and objectively, with no financial, personal, or professional interests that could influence the
findings or recommendations provided to participants. All sources and data from other researchers and experts have been
appropriately cited and credited to maintain intellectual property standards and avoid plagiarism.
IV. Results And Discussion
Table III. Classification Report of the Results
Classes Brown Spot Corn Rust Healthy Leaf Blight Maize Streak
Accuracy 98.67%
Precision 99.74% 98.76% 99.75% 99.88% 95.57%
Recall 98.97% 99.66% 96.45% 98.30% 99.88%
F1-Score 99.35% 99.21% 98.07% 99.09% 97.68%
Table III presents the classification report of the ResNet34 model, which achieved 98.67% accuracy in classifying corn leaves. Brown Spot attained
99.74% precision and 98.97% recall, resulting in a 99.35% F1-score. Corn Rust attained 98.76% precision and 99.66% recall,
resulting in a 99.21% F1-score. Healthy leaves attained 99.75% precision but 96.45% recall, resulting in a 98.07% F1-score,
indicating some misclassification. Leaf Blight was almost perfect with 99.88% precision and 98.30% recall, resulting in a 99.09%
F1-score. Finally, Maize Streak Virus (MSV) attained 95.57% precision and 99.88% recall, resulting in a 97.68% F1-score,
indicating very good recall with slightly lower precision.
Figure 4. Model’s Confusion Matrix
Figure 4 presents the evaluation of the ResNet34 model’s accuracy in classifying corn leaf diseases into five categories: Brown
Spot, Corn Rust, Healthy, Leaf Blight, and MSV. Out of 4,218 classification tests, the model correctly identified 4,162 cases,
achieving an overall accuracy of 98.67%. Most misclassifications involved MSV, with a few cases of Brown Spot, Corn Rust, and
Leaf Blight being incorrectly labeled. Despite these occasional errors, the model demonstrated strong performance across all disease
categories, effectively distinguishing between different corn leaf conditions.
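The headline numbers above can be cross-checked directly from the reported counts and per-class scores, for instance the overall accuracy from the confusion matrix and the Corn Rust F1-score from its precision and recall:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall (values in percent)."""
    return 2 * precision * recall / (precision + recall)

# Overall accuracy from the confusion matrix counts in Figure 4:
accuracy = 4162 / 4218 * 100  # about 98.67%

# Corn Rust F1 from Table III's precision (98.76%) and recall (99.66%):
corn_rust_f1 = f1_score(98.76, 99.66)  # about 99.21%
```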
The high accuracy of 98.67% achieved by ResNet34 aligns with previous studies using similar methodologies and datasets. For
instance, a similar study reported a comparable 97.6% accuracy on corn leaf disease classification using ResNet34 and the
PlantVillage dataset. This reinforces ResNet34’s effectiveness in complex image classification tasks due to its deep residual
architecture and mitigation of the vanishing gradient problem.
Transfer learning, leveraging pre-trained ImageNet weights, plays a crucial role in enhancing performance, especially with limited
data. Additionally, robust data augmentation techniques—rotation, scaling, and color transformations—improve model
generalization. Fine-tuning all ResNet34 layers further strengthens its robustness, aligning with best practices. Prior research
highlights that synthetic augmentation and preprocessing significantly contribute to high model accuracy [10]. Given these factors,
the study’s results confirm ResNet34’s reliability for corn leaf disease classification.
Table IV. Comparison of Model Performance
Model Accuracy Precision Recall F1 Score
ResNet34 98.67% 98.74% 98.65% 98.68%
EfficientNet 98.20% 98.17% 95.25% 98.20%
AlexNet 95.76% 95.82% 95.78% 95.70%
VGG19 96.37% 96.65% 96.23% 96.34%
MobileNetV2 96.18% 96.28% 96.06% 96.13%
Table IV compares model performance and shows that ResNet34 is the best-performing architecture, with the highest accuracy
(98.67%), precision (98.74%), recall (98.65%), and F1-score (98.68%). This suggests that ResNet34 is the most reliable model for
predicting corn leaf diseases, showing a strong balance between precision and recall that minimizes both false positives
and false negatives. EfficientNet performs closely at 98.20% accuracy but has a lower recall of 95.25%, implying that it misses
certain true cases and is therefore less reliable despite its high precision.
Among the remaining models, VGG19 (96.37% accuracy) and MobileNetV2 (96.18% accuracy) perform similarly and are thus viable
alternatives, although they still trail ResNet34. AlexNet, with the lowest accuracy (95.76%), performs the worst.
Although EfficientNet and MobileNetV2 are well-suited for low-resource environments on account of their lightweight architectures,
ResNet34 remains the optimal choice for high-accuracy corn leaf disease classification.
The ResNet34 model's exceptional 98.67% accuracy in corn leaf disease classification represents a significant breakthrough for
agricultural technology and food security. By enabling early and precise disease detection, the model offers transformative potential
for sustainable farming practices. Its remarkable performance, particularly in identifying Brown Spot (F1-score 99.35%) and Corn
Rust (F1-score 99.21%), provides farmers with a powerful tool for targeted intervention, potentially reducing crop losses and
minimizing unnecessary pesticide use.
The model demonstrated strong versatility across disease categories, with high recall (96.45% to 99.88%). ResNet34 outperformed
other architectures, but the competitive accuracy of EfficientNet (98.20%) and MobileNetV2 (96.18%) highlights their potential for
mobile-based disease detection in resource-limited settings. With 99.75% precision in identifying healthy leaves and stable
convergence, the model offers a scalable AI-driven solution. Its deployment in agriculture, particularly in developing regions, could
enhance disease management and promote sustainable farming practices.
The LeafScan Mobile Application
Following the evaluation process, the researchers developed an initial mobile application to deploy the trained model. The
application integrates the trained ResNet34 model, allowing users to classify corn leaf diseases from captured or uploaded
images. The following figures illustrate key functionalities of the system:
Figure 5. Home Screen of the Mobile Application
Figure 5 presents the home screen of the mobile application. At the top, the application name “LeafScan” is displayed, followed
by an image input box where users can upload an image for disease classification. Below the input box, users can choose between
two image input methods: “Take Photo” and “Select from Gallery”. At the bottom, the application displays a “Recent Scans” section,
showing the history of recently scanned images.
Figure 6. Prediction Result Screen
Figure 6 displays the prediction result screen after the user selects and submits an image for analysis. The application processes
the image and provides the closest classification result along with the model’s confidence score. Additionally, the screen presents
a brief description of the identified disease, along with suggested treatments and preventive measures to assist the user in managing
the disease.
The LeafScan mobile application prototype successfully implements the ResNet34 model for corn leaf disease classification in a
field-deployable format. Developed using Flutter for cross-platform compatibility and optimized with PyTorch Lite, it achieves
efficient processing (1 to 5 seconds) and a storage size of 233 MB [39]. Its offline functionality ensures accessibility for farmers in
rural areas with limited internet connectivity [40].
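One plausible path for the PyTorch Lite optimization mentioned above can be sketched as follows; this assumes TorchScript tracing plus the mobile optimizer and Lite Interpreter format, since the paper does not detail its export pipeline, and the output filename is hypothetical.

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile
from torchvision.models import resnet34

# Rebuild the five-class ResNet34 (trained weights would be loaded here).
model = resnet34(weights=None)
model.fc = torch.nn.Linear(512, 5)  # five corn leaf classes
model.eval()

# Trace the model with a dummy 224x224 input and optimize for mobile.
example = torch.rand(1, 3, 224, 224)
scripted = torch.jit.trace(model, example)
mobile_model = optimize_for_mobile(scripted)

# Save in the Lite Interpreter format consumed by PyTorch Mobile/Lite.
mobile_model._save_for_lite_interpreter("leafscan_resnet34.ptl")
```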
The intuitive interface, featuring image capture, gallery selection, and manipulation tools, enhances usability across different
technological literacy levels [41]. While formal User Acceptance Testing (UAT) is pending, preliminary tests confirm the
application's core functionality, demonstrating the feasibility of integrating ResNet34 into a mobile platform for agricultural disease
diagnosis. Additionally, by incorporating disease descriptions, treatment recommendations, and preventive measures, LeafScan
evolves beyond a classification tool into a comprehensive decision support system for sustainable farming [42].
LeafScan represents a significant advancement in corn disease management by integrating a trained ResNet34 model into a Flutter-
based mobile platform designed for offline use, addressing challenges faced by rural farmers with limited internet access and low-
end devices. Its intuitive interface, with multiple image input methods and adjustments, enhances accessibility across different
technological literacy levels. However, the absence of User Acceptance Testing (UAT) limits insight into its real-world viability,
leaving uncertainties about its precision in diverse field conditions and alignment with agricultural workflows. While its technical
feasibility is evident, practical validation through user testing is necessary. Despite this, LeafScan’s successful model integration
into a mobile application offers a promising tool for agricultural education and sustainable farming.
V. Conclusions
Corn leaf diseases such as Brown Spot, Corn Rust, Leaf Blight, and Maize Streak Virus pose significant threats to crop yields,
underscoring the need for advanced diagnostic methods. This study evaluated a CNN model based on ResNet34, trained on 27,146
corn leaf images, and achieved an impressive 98.67% classification accuracy. The model’s deep 34-layer architecture and residual
connections enhanced its ability to recognize intricate disease patterns, improving classification performance. Furthermore, the
successful integration of the trained model into a mobile application demonstrated its feasibility for real-world deployment,
confirming its potential for practical agricultural use. These findings highlight the effectiveness of deep learning in plant disease
detection and provide a foundation for future research to address challenges such as dataset variability and real-world
implementation in farming environments.
Recommendations
Based on the findings of this study, several areas for further research and improvement are recommended to enhance corn leaf
disease classification. First, a fully functional expert system integrating the trained model should be developed and validated
through a thoroughly conducted user acceptance test (UAT). Expanding the classification model to include additional corn leaf
diseases, such as Banded Leaf and Sheath Blight and Downy Mildew, would allow it to diagnose a broader range of infections.
Additionally, deeper collaboration
with agricultural experts, plant pathologists, and local farmers would facilitate more extensive data collection and controlled
experiments on disease progression. Addressing these recommendations will help bridge the gap between experimental research
and practical applications, ultimately improving disease detection accuracy and enhancing disease management in corn farming.
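Evaluations of an expanded model can reuse the per-class metrics reported in this study. A minimal sketch of deriving precision, recall, and F1-score from a confusion matrix (the 3-class matrix values below are hypothetical, for illustration only):

```python
def per_class_metrics(cm):
    """Compute (precision, recall, F1) for each class from a square
    confusion matrix cm, where cm[true][pred] counts samples."""
    n = len(cm)
    metrics = []
    for c in range(n):
        tp = cm[c][c]
        fp = sum(cm[r][c] for r in range(n)) - tp  # predicted c, actually other
        fn = sum(cm[c]) - tp                       # actually c, predicted other
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        metrics.append((precision, recall, f1))
    return metrics

# Hypothetical 3-class confusion matrix (rows = true, cols = predicted)
cm = [[50, 2, 0],
      [1, 47, 2],
      [0, 3, 45]]
for cls, (p, r, f1) in enumerate(per_class_metrics(cm)):
    print(f"class {cls}: precision={p:.3f} recall={r:.3f} f1={f1:.3f}")
```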