INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XV, Issue II, February 2026
In contrast, the proposed system integrates machine learning classifiers with intelligent optimization techniques
including Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Grid Search. These optimization
methods enhance feature selection and hyperparameter tuning, resulting in improved model generalization and
reduced computational complexity. The experimental results clearly demonstrate that optimized models
outperform non-optimized versions across all evaluation metrics, including accuracy, precision, recall, and F1-
score. Among all evaluated models, the Random Forest classifier optimized using Grid Search achieved the
highest accuracy of 98.5%, significantly surpassing traditional and non-optimized machine learning approaches.
Furthermore, unlike existing systems that focus on a single detection strategy, the proposed framework combines
lightweight detection, hybrid detection, and deep learning-based detection mechanisms within a layered
architecture. This multi-stage approach improves scalability and adaptability for real-time smart grid
environments. Overall, the comparative analysis confirms that the proposed intelligent optimization-based
framework provides superior detection performance, lower false alarm rates, and better scalability compared to
existing smart grid cyber threat detection methods.
Data Flow and Processing Pipeline
The data flow and processing pipeline of the proposed intelligent optimization-based smart grid cyber threat
detection system follows a structured and layered approach to ensure efficient, accurate, and real-time detection
of cyberattacks. The process begins at the field layer, where raw data is generated from smart grid components
such as smart meters, sensors, SCADA systems, and network communication devices. This raw data includes
operational measurements, network traffic logs, voltage and frequency readings, and other system parameters
that may indicate normal or malicious behavior. Since the collected data is heterogeneous and high-dimensional,
it must undergo systematic processing before being used for threat detection.
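As a concrete illustration, one raw field-layer observation might be modeled as follows. The field names, units, and protocol labels here are illustrative assumptions for exposition only, not a schema taken from the proposed system:

```python
from dataclasses import dataclass
import time

@dataclass
class GridTelemetry:
    """One raw observation from a field-layer device (illustrative schema)."""
    device_id: str      # e.g. smart meter or SCADA RTU identifier (hypothetical)
    timestamp: float    # Unix epoch seconds
    voltage: float      # line voltage reading (V)
    frequency: float    # grid frequency reading (Hz)
    packet_rate: float  # network packets/s observed at the device interface
    protocol: str       # e.g. "DNP3", "Modbus", "IEC 61850"

sample = GridTelemetry("meter-017", time.time(), 229.8, 49.98, 112.0, "Modbus")
```

A record like this mixes physical measurements with network statistics, which is precisely why the later preprocessing and feature selection stages are needed.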
The raw data is transmitted to the edge or gateway layer, where an initial lightweight detection stage is
performed. This stage quickly filters obvious anomalies and known attack signatures using computationally
efficient techniques. The purpose of this early-stage filtering is to reduce unnecessary processing load on higher
layers and enable faster response to critical threats. After preliminary screening, the filtered data is forwarded to
the control or cloud layer for deeper analysis.
At the control layer, the data undergoes preprocessing steps including data cleaning, normalization, feature
encoding, and dataset splitting. Missing values are handled, redundant attributes are removed, and numerical
features are scaled to ensure consistent model training.
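These preprocessing steps can be sketched as follows for numeric features (categorical encoding is omitted for brevity); the mean-imputation strategy, min-max scaling, and 80/20 split are common defaults assumed here, not parameters stated in the paper:

```python
import random

def preprocess(rows, split=0.8, seed=42):
    """Impute missing values with the column mean, min-max scale each
    column, then shuffle and split into train/test sets."""
    cols = list(zip(*rows))
    scaled = []
    for col in cols:
        present = [v for v in col if v is not None]
        mean = sum(present) / len(present)            # mean imputation
        col = [mean if v is None else v for v in col]
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0                       # guard constant columns
        scaled.append([(v - lo) / span for v in col]) # min-max scaling to [0, 1]
    rows = [list(r) for r in zip(*scaled)]
    random.Random(seed).shuffle(rows)                 # reproducible shuffle
    cut = int(len(rows) * split)
    return rows[:cut], rows[cut:]

train, test = preprocess([[1.0, None], [2.0, 10.0], [3.0, 20.0], [4.0, 30.0]])
```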
Once preprocessing is completed, feature selection is performed using optimization techniques such as Genetic
Algorithm (GA) and Particle Swarm Optimization (PSO). These algorithms identify the most relevant features
that contribute to accurate classification, thereby reducing computational complexity and improving detection
performance.
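The wrapper-style feature selection described above can be sketched as a minimal genetic algorithm over binary feature masks. Everything below (population size, truncation selection, one-point crossover, bit-flip mutation, and the toy fitness function) is an illustrative assumption rather than the paper's actual GA configuration; in practice the fitness function would wrap a classifier's cross-validated score:

```python
import random

def ga_select(fitness, n_features, pop=20, gens=30, mut=0.1, seed=1):
    """Minimal GA over binary feature masks; fitness(mask) scores a mask,
    higher is better. Returns the best mask found."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=fitness, reverse=True)
        survivors = scored[: pop // 2]              # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_features)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mut:                  # bit-flip mutation
                i = rng.randrange(n_features)
                child[i] ^= 1
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

# Toy fitness: rewards features 0 and 2 while penalising mask size.
best = ga_select(lambda m: 2 * (m[0] + m[2]) - sum(m), n_features=5)
```

PSO-based selection follows the same wrapper pattern, with particle velocities over the mask space replacing crossover and mutation.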
The optimized feature set is then fed into multiple classification models including K-Nearest Neighbors (KNN),
Decision Tree, Support Vector Machine (SVM), and Random Forest. Hyperparameter tuning is carried out using
Grid Search and other optimization strategies to ensure that each model operates under optimal conditions. In
addition, deep learning-based detection mechanisms may be applied to identify complex and previously
unseen attack patterns. Automated model selection (AutoML), together with model compression techniques
such as pruning, quantization, and knowledge distillation, further enhances model efficiency and
scalability for real-time deployment.
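Grid Search itself is conceptually simple: every combination in a hyperparameter grid is evaluated and the best-scoring one is retained. The sketch below shows the generic procedure with a toy objective standing in for cross-validated accuracy; the parameter names and grid values are illustrative, not the configuration used in the experiments:

```python
from itertools import product

def grid_search(score, grid):
    """Exhaustively evaluate every hyperparameter combination and
    return the best parameters with their score."""
    keys = list(grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        s = score(params)                     # e.g. cross-validated accuracy
        if s > best_score:
            best_params, best_score = params, s
    return best_params, best_score

# Toy objective peaking at n_estimators=200, max_depth=10
# (hypothetical Random Forest grid).
toy = lambda p: -abs(p["n_estimators"] - 200) - 5 * abs(p["max_depth"] - 10)
params, _ = grid_search(toy, {"n_estimators": [50, 100, 200],
                              "max_depth": [5, 10, 20]})
```

The cost grows multiplicatively with grid size, which is why GA and PSO are attractive alternatives when the search space is large.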
After model inference, the classification results are evaluated using performance metrics such as accuracy,
precision, recall, and F1-score. If a cyber threat is detected, the system forwards the information to the threat
intelligence module, which stores attack patterns for future reference and continuous learning. Finally, in the
response or orchestration layer, alerts are generated and sent to grid operators, enabling immediate mitigation
actions. This structured data flow ensures efficient processing, reduced latency, improved detection accuracy,
and reliable protection of smart grid infrastructures against modern cyber threats.
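The evaluation metrics named above have standard definitions in terms of confusion-matrix counts; a small self-contained helper for the binary (attack vs. normal) case:

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall, and F1-score for binary labels."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == p == positive for t, p in pairs)        # true positives
    fp = sum(t != positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    acc = sum(t == p for t, p in pairs) / len(pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": acc, "precision": precision, "recall": recall, "f1": f1}

m = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
```

For intrusion detection, recall on the attack class is especially important, since a missed attack (false negative) is typically costlier than a false alarm.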