
Software-defined networking (SDN) is a network architecture that can be structured into three layers: the forwarding plane, the control plane, and the application plane. By separating the data plane from the control plane, SDN becomes more flexible and programmable, which improves the flexibility, real-time performance, and scalability of the network, optimizes in-network communication and management, and improves the efficiency of the entire industrial automation system [1]. Because SDN consolidates traditional network functions and simplifies network configuration, it is more vulnerable to network attacks, and intrusion detection is an effective means of addressing SDN security issues [2]. With the development of machine learning, researchers have begun to use traditional machine learning algorithms (such as decision trees and support vector machines) to build more capable intrusion detection systems (IDS), improving the accuracy and efficiency of intrusion detection. For traditional networks, researchers have proposed many intrusion detection models (IDM) with good results, and many now apply traditional network intrusion detection methods to SDNs. Ali et al. [3] designed a lightweight distributed denial of service (DDoS) attack detection and mitigation system based on an improved decision tree, which employs impurity culling and error pruning strategies to detect DDoS attack streams in the traffic and incorporates a dynamic whitelisting mechanism to block attack traffic from entering the SDN. Madathi et al. [4] used a feature subset of the attack traffic to train the K-nearest neighbor (KNN) algorithm, which is then used to identify anomalous attacks by aggregating and classifying the attack traffic. Maheshwari et al. 
[5] proposed a novel optimized weighted-voting ensemble model incorporating support vector machine (SVM), random forest (RF), and gradient boosting machine (GBM) for detecting DDoS attacks in SDN environments. Elsayed et al. [6] proposed a secured automatic two-level intrusion detection system (SATIDS) based on an improved long short-term memory (LSTM) network, which distinguishes attack traffic from benign traffic and identifies the attack class. Al-Zewairi et al. [7] classified attacks into type A and type B according to their characteristics and used shallow and deep artificial neural network (ANN) classifiers to detect these two classes of unknown attacks; the results showed that the approach can effectively classify unknown attack types. Wang et al. [8] proposed an improved Naive Bayes (NB) model combined with an attribute-addition algorithm, and the results showed that the classification accuracy of the new model is significantly improved.
However, both standalone and combined machine learning detection methods are often ineffective because the feature data they rely on are engineered manually. To improve detection efficiency, researchers usually combine them with feature selection techniques to select the optimal features. In network intrusion detection systems, feature selection is one of the key steps in data preprocessing and directly affects the efficacy and accuracy of the classifier. Feature selection aims to identify and select the most useful subset of features from the original dataset to improve model performance and reduce the risk of overfitting.
Cui et al. [9] proposed a feature selection method for high-dimensional SDN traffic data that extracts features contributing positively to model decisions and constructed a multiclass intrusion detection model based on multiple output nodes, improving the model's accuracy on emerging intrusions. Wang et al. [10] used RF information gain to select the optimal feature subset and a support vector machine for classification, constructing a detection model with a high detection rate for DDoS. Sarica and Angin [11] proposed a solution for real-time intrusion detection and mitigation in SDN based on automatic flow feature extraction and flow classification, using a random forest classifier at the SDN application layer; experimental results showed that the proposed security method is promising for SDN-managed IoT networks. Zhang and Wang [12] proposed a network intrusion detection system (NIDS) based on feature selection and deep learning; the model filters out non-IPv4 packets and consecutive identical packet-in messages and recovers missing information to remove a large number of redundant features and improve detection accuracy, and it is deployed in the SDN network to detect abnormal traffic. However, these methods suffer from drawbacks such as large models, complex algorithms, low detection accuracy, and data that do not reflect the characteristics of the SDN environment.
Our prior work [13,14] integrated classical metaheuristic optimization algorithms for feature selection. However, traditional metaheuristic algorithms suffer from issues such as low search efficiency. Our later work [15,16] applied newer heuristic algorithms that integrate multiple strategies to intrusion detection and SDN networks, which effectively improved the performance of the feature selection models and the efficiency of intrusion detection. That work focused on traditional network intrusion detection, with only preliminary experiments on SDN intrusion detection.
The spider wasp optimizer (SWO) [17] is an emerging metaheuristic algorithm proposed by Abdel-Basset et al. in 2023. The SWO algorithm offers several distinct update strategies suited to optimization problems with different exploration and exploitation needs. It has been successfully applied to real-world optimization problems: one class is constrained engineering design problems such as welded beam design (WBD) and pressure vessel design; the second is estimating the unknown parameters of photovoltaic (PV) models, including the single diode model (SDM), dual diode model (DDM), and triple diode model (TDM). In addition, Shtayat et al. [18] proposed an improved binary spider wasp optimization algorithm (IBSWO), which was experimentally shown to have a significant performance advantage over traditional heuristic algorithms and was applied to IIoT intrusion detection. IBSWO primarily converts the continuous SWO algorithm into a binary one; it only improves the flat crossover in SWO and does not address SWO's shortcomings in exploitation and exploration. Mohamed et al. [19] proposed an enhanced binary spider wasp algorithm (BSWO) for high-dimensional feature selection, which effectively reduces data redundancy and improves classification accuracy; compared to SWO, however, BSWO has higher time complexity and increases the time overhead. Although the SWO algorithm has strong global search and local exploitation capabilities, it suffers from unbalanced exploitation, slow convergence in later stages, and the risk of getting stuck in local optima. To overcome these problems, to meet the challenges posed by high-dimensional data, and to improve the detection efficiency for anomalous traffic, this paper proposes a multi-strategy enhanced ESWO algorithm for feature selection and constructs an intrusion detection model that optimizes SDN intrusion detection.
The tent chaotic map and the elite opposition learning strategy increase population diversity and expand the search range. The Lévy flight strategy is introduced to keep the algorithm from falling into local optima early in the iteration. The dynamic parameter adjustment strategy balances the exploration and exploitation of the algorithm. The ESWO algorithm is applied to feature selection in intrusion detection to select the optimal subset, reduce data redundancy, and improve the detection efficiency of SDN intrusion detection.
The main contributions of this paper are as follows:
a) To address the limitations of the SWO algorithm when applied to SDN intrusion detection, enhanced strategies are introduced into the ESWO algorithm.
b) An intrusion detection model based on ESWO for SDN is proposed, which uses the ESWO algorithm to solve the problem of data set feature redundancy in intrusion detection and improves the detection efficiency for abnormal attacks.
c) Since the ESWO algorithm is used for feature selection in intrusion detection, the binary version of ESWO is utilized to search for the optimal subset, and the intrusion detection model based on the ESWO algorithm is compared with other algorithms to verify the superiority of the proposed model in detecting abnormal attacks in SDN.
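Because the binary ESWO searches over 0/1 feature masks while wasp positions are continuous, a transfer function is needed to binarize positions. The sketch below uses an S-shaped (sigmoid) transfer function, a common choice in binary metaheuristics; the sigmoid form and thresholding rule are illustrative assumptions, not necessarily the exact mapping used in ESWO.

```python
import math
import random

def binarize(position, r=None):
    """Map a continuous wasp position to a binary feature mask.

    Each coordinate is passed through a sigmoid (S-shaped transfer
    function) to get a selection probability, then thresholded by a
    random draw; `r` can fix the threshold for reproducibility.
    This is a generic sketch, not the paper's exact scheme."""
    mask = []
    for x in position:
        prob = 1.0 / (1.0 + math.exp(-x))   # sigmoid in (0, 1)
        rnd = random.random() if r is None else r
        mask.append(1 if rnd < prob else 0)
    return mask
```

A mask value of 1 keeps the corresponding feature in the candidate subset; fitness is then evaluated on the reduced dataset.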
The remainder of this paper is organized as follows: Section 2 introduces the basic algorithm. Section 3 presents enhanced strategies based on the SWO algorithm. Section 4 presents the intrusion detection model based on the ESWO algorithm. Section 5 provides experiments and results. Section 6 summarizes the research results and future development directions of this paper.
The SWO algorithm mainly simulates the behaviors of female spider wasps, such as hunting, nesting, and laying an egg on the abdomen of the spider for parasitism, and uses mathematical models to simulate this behavior in various scenarios.
In the SWO algorithm, each spider wasp (female) represents a solution of the current generation, and the mathematical expression is described as follows:
$ \overrightarrow{SW} = \left[{x}_{1}{, x}_{2}{, x}_{3}, \dots , {x}_{D}\right] $ | (1) |
$ {\overrightarrow{SW}}_{i}^{t} = \overrightarrow{L}+\overrightarrow{r}\times \left(\overrightarrow{H}-\overrightarrow{L}\right) $ | (2) |
where t denotes the generation index, i indicates the population index (i = 1, 2, …, N), $ \overrightarrow{r} $ is a D-dimensional vector of numbers randomly initialized between 0 and 1, $ \overrightarrow{H} $ and $ \overrightarrow{L} $ are the pre-set upper and lower bounds, respectively, and $ {\overrightarrow{SW}}_{i}^{t} $ represents the ith spider wasp in the tth iteration.
In the initial stage, the female spider wasps randomly explore the search space with a constant step length to search for spider prey suitable for their offspring. In iteration t + 1 of this stage, the position of the ith spider wasp is updated as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}_{i}^{t}+{\mu }_{1}*\left({\overrightarrow{SW}}_{a}^{t}-{\overrightarrow{SW}}_{b}^{t}\right) $ | (3) |
where a and b are two indices selected at random from the population to determine the exploration direction, and $ {\mu }_{1} $ is the coefficient vector using the following formula:
$ {\mu }_{1} = \left|rn\right|*{r}_{1} $ | (4) |
where $ {r}_{1} $ is a random number in the interval [0, 1], and $ rn $ is a random number drawn from the normal distribution.
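The exploration move of Eqs (3) and (4) can be sketched in a few lines; this is a minimal scalar-loop illustration rather than the vectorized form:

```python
import random

def search_step(sw_i, sw_a, sw_b):
    """One exploration move of a female wasp per Eq (3):
    step from sw_i along the direction (sw_a - sw_b) between two
    randomly chosen wasps, scaled by mu1 = |rn| * r1 from Eq (4),
    where rn ~ N(0, 1) and r1 ~ U(0, 1)."""
    mu1 = abs(random.gauss(0.0, 1.0)) * random.random()
    return [x + mu1 * (xa - xb) for x, xa, xb in zip(sw_i, sw_a, sw_b)]
```

When the two reference wasps coincide, the step vanishes and the wasp stays put, so population diversity is what drives this phase.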
Spider wasps sometimes search the entire area around where the spider has fallen, with mathematical expressions as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}_{c}^{t}+{\mu }_{2}*\left(\overrightarrow{L}+\overrightarrow{{r}_{2}}*\left(\overrightarrow{H}-\overrightarrow{L}\right)\right) $ | (5) |
$ {\mu }_{2} = B*\mathit{cos}\left(2\pi l\right) $ | (6) |
$ B = \frac{1}{1+{e}^{l}} $ | (7) |
where c is an index randomly selected from the population, and l is a number randomly generated in the interval [−2, 1].
The mathematical expression of the position update of the spider wasp at this stage is as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = \left\{\begin{array}{ll}Eq\;(3), & {r}_{3} < {r}_{4}\\ Eq\;(5), & otherwise\end{array}\right. $ | (8) |
where $ {r}_{3} $ and $ {r}_{4} $ are two random numbers in [0, 1].
After the spider wasp catches the target spider, the spider sometimes escapes, and the wasp pursues it. At first, the distance between the wasp and the spider is very small; it then increases or decreases according to their relative speeds. The expression for this behavior is as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}_{i}^{t}+C*\left|2*\overrightarrow{{r}_{5}}*{\overrightarrow{SW}}_{a}^{t}-{\overrightarrow{SW}}_{i}^{t}\right| $ | (9) |
$ C = \left(2-2*\left(\frac{t}{{t}_{max}}\right)\right)*{r}_{6} $ | (10) |
where a is an index randomly selected from the population, t and $ {t}_{max} $ indicate the current and maximum number of function evaluations, respectively, $ \overrightarrow{{r}_{5}} $ is a vector of values randomly generated in the interval [0, 1], and $ {r}_{6} $ is a random number in the interval [0, 1].
When the spider escapes the hunt of the spider wasp, the distance between the spider and the spider wasp gradually increases. The expression of this behavior is as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}_{i}^{t}*\overrightarrow{vc} $ | (11) |
$ k = 1-\left(\frac{t}{{t}_{max}}\right) $ | (12) |
where $ \overrightarrow{vc} $ is a vector generated between k and −k according to the normal distribution.
The two behaviors above are performed randomly with the expression as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = \left\{\begin{array}{ll}Eq\;(9), & {r}_{3} < {r}_{4}\\ Eq\;(11), & otherwise\end{array}\right. $ | (13) |
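The following-versus-escape logic of Eqs (9)–(13) can be sketched as below. One simplification is hedged explicitly: the paper draws $ \overrightarrow{vc} $ between −k and k according to the normal distribution, while this sketch uses a uniform draw in [−k, k].

```python
import random

def follow_step(sw_i, sw_a, t, t_max):
    """Chasing move, Eqs (9)-(10): the factor C shrinks linearly
    from about 2 to 0 over the run, scaled by a random r6."""
    C = (2 - 2 * t / t_max) * random.random()
    return [x + C * abs(2 * random.random() * xa - x)
            for x, xa in zip(sw_i, sw_a)]

def escape_step(sw_i, t, t_max):
    """Escape move, Eqs (11)-(12): multiply each coordinate by a
    draw in [-k, k] with k = 1 - t/t_max (uniform here, as a
    simplification of the paper's normal draw)."""
    k = 1 - t / t_max
    return [x * random.uniform(-k, k) for x in sw_i]

def hunt_or_escape(sw_i, sw_a, t, t_max):
    """Eq (13): choose one of the two behaviors at random."""
    if random.random() < random.random():
        return follow_step(sw_i, sw_a, t, t_max)
    return escape_step(sw_i, t, t_max)
```

Note that at the final iteration k reaches 0, so the escape move contracts the position toward the origin.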
At the start of the optimization process, the spider wasps perform a global search in the search phase, and in subsequent iterations the algorithm explores and exploits the areas around the current wasps. The expression for switching between the two stages is as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = \left\{\begin{array}{ll}Eq\;(8), & p < k\\ Eq\;(13), & otherwise\end{array}\right. $ | (14) |
where p is a random number in [0, 1].
After the spider wasp catches the target prey, it drags the prey into a pre-prepared nest. Spider wasps exhibit different nesting behaviors, expressed as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}^{*}+\mathit{cos}\left(2\pi l\right)*\left({\overrightarrow{SW}}^{*}-{\overrightarrow{SW}}_{i}^{t}\right) $ | (15) |
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}_{a}^{t}+{r}_{3}*\left|\gamma \right|*\left({\overrightarrow{SW}}_{a}^{t}-{\overrightarrow{SW}}_{i}^{t}\right)+\left(1-{r}_{3}\right)*\overrightarrow{U}*\left({\overrightarrow{SW}}_{b}^{t}-{\overrightarrow{SW}}_{c}^{t}\right) $ | (16) |
$ \overrightarrow{U} = \left\{\begin{array}{ll}1, & \overrightarrow{{r}_{4}} > \overrightarrow{{r}_{5}}\\ 0, & otherwise\end{array}\right. $ | (17) |
$ {\overrightarrow{SW}}_{i}^{t+1} = \left\{\begin{array}{ll}Eq\;(15), & {r}_{3} < {r}_{4}\\ Eq\;(16), & otherwise\end{array}\right. $ | (18) |
where $ {\overrightarrow{SW}}^{\mathrm{*}} $ represents the best-so-far solution, $ {r}_{3} $ and $ {r}_{4} $ are random numbers created in the interval [0, 1], $ \gamma $ is a number generated according to the Lévy flight, a, b, and c are indices of three solutions randomly selected from the population, $ \overrightarrow{U} $ is a binary vector used to determine when a step size is applied, to avoid building two nests at the same position, and $ \overrightarrow{{r}_{4}} $ and $ \overrightarrow{{r}_{5}} $ are two vectors of random values in the interval [0, 1]; if an element of $ \overrightarrow{{r}_{4}} $ is larger than the corresponding element of $ \overrightarrow{{r}_{5}} $, the corresponding element of $ \overrightarrow{U} $ is 1; otherwise it is 0.
The flowchart of the hunting and nesting behavior is shown in Figure 1. In summary, the expression of the whole behavior is as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = \left\{\begin{array}{ll}Eq\;(14), & i < N*k\\ Eq\;(18), & otherwise\end{array}\right. $ | (19) |
One of the main characteristics of spider wasps is their ability to determine gender. Gender is determined based on the size of the host in which an egg is laid. In this algorithm, each spider wasp is the solution of the current generation, and the wasp egg represents the potential solution. The expressions for generating the wasp egg are as follows.
$ {\overrightarrow{SW}}_{i}^{t+1} = Crossover\left({\overrightarrow{SW}}_{i}^{t}, {\overrightarrow{SW}}_{m}^{t}, CR\right) $ | (20) |
$ {\overrightarrow{SW}}_{m}^{t+1} = {\overrightarrow{SW}}_{i}^{t}+{e}^{l}*\left|\beta \right|*\overrightarrow{{v}_{1}}+\left(1-{e}^{l}\right)*\left|{\beta }_{1}\right|*\overrightarrow{{v}_{2}} $ | (21) |
$ \overrightarrow{{v}_{1}} = \left\{\begin{array}{ll}\overrightarrow{{x}_{a}}-\overrightarrow{{x}_{i}}, & f\left(\overrightarrow{{x}_{a}}\right) < f\left(\overrightarrow{{x}_{i}}\right)\\ \overrightarrow{{x}_{i}}-\overrightarrow{{x}_{a}}, & otherwise\end{array}\right. $ | (22) |
$ \overrightarrow{{v}_{2}} = \left\{\begin{array}{ll}\overrightarrow{{x}_{b}}-\overrightarrow{{x}_{c}}, & f\left(\overrightarrow{{x}_{b}}\right) < f\left(\overrightarrow{{x}_{c}}\right)\\ \overrightarrow{{x}_{c}}-\overrightarrow{{x}_{b}}, & otherwise\end{array}\right. $ | (23) |
where Crossover indicates the uniform crossover operator that is applied between solutions $ {\overrightarrow{SW}}_{i}^{t} $ and $ {\overrightarrow{SW}}_{m}^{t} $ with a probability known as crossover rate (CR), and $ {\overrightarrow{SW}}_{m}^{t} $ and $ {\overrightarrow{SW}}_{i}^{t} $ are two vectors that represent the male and female spider wasps, respectively. $ \beta $ and $ {\beta }_{1} $ are two numbers randomly generated according to the normal distribution, e is the exponential constant, and a, b, and c are indices of three solutions randomly selected from the population; a, b, c and i are all different.
When the spider wasp lays an egg in the host, the nest is closed, which indicates that this wasp's role in the optimization process is complete. In the remaining optimization process, function evaluations are handed over to the other spider wasps, which is conducive to better results. The population size is updated by the following formula.
$ N = {N}_{min}+\left(N-{N}_{min}\right)\times k $ | (24) |
where $ {N}_{min} $ indicates the minimum population size, employed to avoid getting stuck in local minima during the different stages of the optimization process.
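The population-reduction rule of Eq (24), with k = 1 − t/t_max as in Eq (12), can be sketched in one line:

```python
def reduce_population(n0, n_min, t, t_max):
    """Eq (24): the population size decays linearly with
    k = 1 - t/t_max and never drops below n_min."""
    k = 1 - t / t_max
    return int(n_min + (n0 - n_min) * k)
```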
The SWO algorithm has multiple unique exploration strategies and is characterized by fast search speed and high solution accuracy. However, because of its multiple random exploration strategies, the algorithm converges slowly in later stages and can fall into local optima. Therefore, in this paper, a tent chaotic map is introduced in the initialization stage, which distributes the initial population more widely and expands the search range. At the beginning of each iteration, an elite opposition learning strategy is introduced, which enhances the quality of the population throughout the iterations by retaining the better individuals. In the capture and escape phase of the algorithm, the Lévy flight strategy is introduced to prevent the algorithm from falling into a local optimum in the early stages. The control parameters of the algorithm are also adjusted to enhance its overall optimization ability.
Chaos has randomness and ergodicity, which can accelerate the convergence of the algorithm [20]. The chaotic sequence is generated by a tent map, so that the initial solution is evenly distributed in the search space. The expression of the tent map is shown as follows:
$ {X}_{i+1} = \left\{\begin{array}{ll}{X}_{i}/a, & {X}_{i} < a\\ \left(1-{X}_{i}\right)/\left(1-a\right), & {X}_{i}\ge a\end{array}\right. $ | (25) |
where i is the number of iterations, $ a\in \left(\mathrm{0, 1}\right) $.
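A direct transcription of the tent map of Eq (25), used to spread initial values over (0, 1) before scaling into the search bounds:

```python
def tent_sequence(x0, a, n):
    """Generate n successive values of the tent chaotic map, Eq (25):
    x_{i+1} = x_i / a           if x_i < a
            = (1 - x_i)/(1 - a) otherwise, with a in (0, 1)."""
    xs, x = [], x0
    for _ in range(n):
        x = x / a if x < a else (1 - x) / (1 - a)
        xs.append(x)
    return xs
```

Each dimension of each individual can be seeded from such a sequence and then mapped to [lb, ub], giving a more even initial coverage than independent uniform draws.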
In the population, elite individuals carry more effective information than ordinary individuals, and generating elite individuals can increase the diversity of the population. The strategy first performs opposition learning on each individual $ {X}_{i} $ in the population to obtain the opposite solution $ {OP}_{i} $ [21]. The formula for generating the opposite solution is as follows:
$ {OP}_{i} = k*\left(ub-lb\right)-{X}_{i} $ | (26) |
where $ k\in \left(0, 1\right) $ and lb and ub are the lower and upper bounds of the search space, respectively. By comparing the fitness values of $ {X}_{i} $ and its opposite solution $ {OP}_{i} $, the elite individuals are obtained by retaining the one with the smaller fitness value. The formula is as follows:
$ {X}_{i} = \left\{\begin{array}{ll}{X}_{i}, & fitness\left({X}_{i}\right) < fitness\left({OP}_{i}\right)\\ {OP}_{i}, & otherwise\end{array}\right. $ | (27) |
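Eqs (26) and (27) together can be sketched as follows; here k is drawn per individual, which is an assumption (the text only states k ∈ (0, 1)), and the opposite point follows Eq (26) as written:

```python
import random

def elite_opposition(pop, fitness, lb, ub):
    """Elite opposition learning, Eqs (26)-(27): for each individual X,
    form the opposite solution OP = k*(ub - lb) - X with k ~ U(0, 1),
    then keep whichever of X and OP has the smaller fitness."""
    new_pop = []
    for X in pop:
        k = random.random()
        OP = [k * (ub - lb) - x for x in X]
        new_pop.append(X if fitness(X) < fitness(OP) else OP)
    return new_pop
```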
In the early stage of the pursuit and escape phase of the SWO algorithm, the speed control factor C is large, so the algorithm performs a global search; as the number of iterations increases, C gradually decreases and the algorithm shifts to local exploitation around the current solutions. The Lévy flight strategy [22] is therefore introduced at this stage: it enhances the spatial search ability of the algorithm and prevents it from falling into a local optimum in the early stages. The mathematical expressions after the introduction of Lévy flight are as follows:
$ {\overrightarrow{SW}}_{i}^{t+1} = {\overrightarrow{SW}}_{i}^{t}+C*\left|2*levy\left(\alpha \right)*{\overrightarrow{SW}}_{a}^{t}-{\overrightarrow{SW}}_{i}^{t}\right| $ | (28) |
$ levy\left(\alpha \right) = 0.05*\frac{u}{{\left|v\right|}^{1/\beta }} $ | (29) |
$ u~N(0, {\sigma }_{u}^{2}) $ | (30) |
$ v~N(0, {\sigma }_{v}^{2}) $ | (31) |
$ {\sigma }_{u} = {\left\{\frac{\mathrm{\Gamma }\left(1+\beta \right)sin\left(\frac{\pi \beta }{2}\right)}{\mathrm{\Gamma }\left(\frac{1+\beta }{2}\right)\times \beta \times {2}^{\frac{\beta -1}{2}}}\right\}}^{\frac{1}{\beta }}, {\sigma }_{v} = 1 $ | (32) |
where $ levy\left(\alpha \right) $ is the step length, u and v follow the normal distribution with mean of 0 and variance of $ {\sigma }_{u}^{2} $ and $ {\sigma }_{v}^{2} $, and the value of β is generally 1.5.
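The Lévy step of Eqs (29)–(32) follows Mantegna's algorithm; a sketch with β = 1.5 and the 0.05 scale from Eq (29):

```python
import math
import random

def levy_step(beta=1.5, scale=0.05):
    """Mantegna's algorithm for a Levy-stable step, Eqs (29)-(32):
    u ~ N(0, sigma_u^2), v ~ N(0, 1), step = scale * u / |v|^(1/beta)."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)
    v = random.gauss(0.0, 1.0)
    return scale * u / abs(v) ** (1 / beta)
```

Most steps are small, but the heavy tail occasionally produces long jumps, which is what lets Eq (28) escape local optima.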
The basic algorithm has two important control parameters, TR and CR. TR controls the balance between hunting-and-nesting behavior and mating behavior. Sensitivity measurements of TR show that increasing TR improves performance on unimodal test functions, with an optimal range of 0.3–0.5, while decreasing TR improves performance on multimodal test functions, with an optimal range of 0.1–0.3; the basic algorithm therefore sets TR to 0.3. To improve the overall performance of the algorithm, TR is changed into a dynamic parameter that varies with the number of iterations [23]. The expression is as follows:
$ TR = 0.3*\left(1-\frac{t}{{t}_{max}}\right)+0.2 $ | (33) |
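Eq (33) in code form: TR starts at 0.5 and decays linearly to 0.2, so hunting-and-nesting behavior dominates early iterations while mating behavior becomes more likely later.

```python
def dynamic_tr(t, t_max):
    """Eq (33): TR decays linearly from 0.5 at t = 0 to 0.2 at
    t = t_max, shifting weight from hunting/nesting to mating."""
    return 0.3 * (1 - t / t_max) + 0.2
```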
The ESWO algorithm is constructed based on the SWO algorithm and consists of four main components. In the initialization phase, the tent chaotic map strategy is introduced to increase population diversity. The global search capability is improved by applying an elite opposition learning strategy at the beginning of each iteration. The Lévy flight strategy in the pursuit and escape phase prevents the algorithm from converging prematurely. Finally, the dynamic adjustment of the control parameter balances exploration and exploitation. The pseudo-code of the ESWO algorithm is shown in Algorithm 1.
Algorithm 1. Pseudo-code of the ESWO |
Input: $ N, {N}_{min}, {t}_{max} $    Output: $ {\overrightarrow{SW}}^{*} $
1  Initialize N female wasps $ {\overrightarrow{SW}}_{i}^{t}\left(i = 1, 2, \dots, N\right) $ using Eq (2)
2  Evaluate each $ {\overrightarrow{SW}}_{i}^{t} $ and store the one with the best fitness in $ {\overrightarrow{SW}}^{*} $
3  t = 1
4  while $ \left(t < {t}_{max}\right) $
5    Update the population using Eqs (26) and (27)
6    Update TR using Eq (33)
7    $ {r}_{6} $: generate a random number between 0 and 1
8    if $ \left({r}_{6} < TR\right) $
9      for i = 1:N do
10       if i < N*k then
11         if p < k then
12           if $ {r}_{3} < {r}_{4} $ then
13             Apply Eq (3)
14           else
15             Apply Eq (5)
16           end if
17         else
18           if $ {r}_{3} < {r}_{4} $ then
19             Apply Eq (28)
20           else
21             Apply Eq (11)
22           end if
23         end if
24       else
25         if $ {r}_{3} < {r}_{4} $ then
26           Apply Eq (15)
27         else
28           Apply Eq (16)
29         end if
30       end if
31       Compute $ f\left(\overrightarrow{{SW}_{i}}\right) $
32       t = t + 1
33     end for
34   else
35     for i = 1:N do
36       Apply Eq (20)
37       t = t + 1
38     end for
39   end if
40   Apply memory saving and update N using Eq (24)
41 end while
Time complexity is a critical metric of an algorithm's operational efficiency, and it is crucial for the timely detection of attacks in intrusion detection [24]. The computing speed of the algorithm affects the detection efficiency of the whole intrusion detection system, so performance enhancements must keep any increase in time complexity to a minimum. The time complexity of the ESWO algorithm is determined by the population size N, the maximum number of function evaluations $ {t}_{max} $, and the number of dimensions D. The time complexity of SWO is as follows:
$ T\left(SWO\right) = O\left(Hunting\ and\ nesting\ behaviors\right)+O\left(Mating\ behaviors\right) $ | (34) |
$ T\left(SWO\right) = O\left({t}_{max}DN\right)+O\left({t}_{max}DN\right) = O\left({t}_{max}DN\right) $ | (35) |
Compared to SWO, the extra time overhead of the ESWO algorithm is concentrated in the elite opposition learning strategy. The time complexity of ESWO is as follows:
$ T\left(ESWO\right) = O\left(Elite\ opposition\ learning\ strategy \right)+O\left(Hunting\ and\ nesting\ behaviors\right)+O\left(Mating\ behaviors\right) $ | (36) |
$ T\left(ESWO\right) = O\left({t}_{max}DN\right)+O\left({t}_{max}DN\right)+O\left({t}_{max}DN\right) = O\left({t}_{max}DN\right) $ | (37) |
As shown above, the proposed ESWO algorithm has the same asymptotic time complexity as the SWO algorithm. The time complexity of the particle swarm optimization (PSO) algorithm used in the subsequent experimental section is $ O\left({t}_{max}DN\right) $ [25]; similarly, the time complexity of the whale optimization algorithm (WOA) is $ O\left({t}_{max}DN\right) $. The population sizes $ {N}_{PSO} $ and $ {N}_{WOA} $ remain constant during the iteration process, whereas $ {N}_{SWO} $ and $ {N}_{ESWO} $ become progressively smaller due to population reduction and memory saving. Hence, the overall running time of ESWO and SWO is lower than that of PSO and WOA.
This section discusses the proposed IDM for detecting attacks in SDN. In this paper, the ESWO algorithm performs feature selection on the data to select the optimal subset of features, and the resulting intrusion detection model for SDN is named ESWO-IDM. Construction of the intrusion detection model can be divided into four phases: data preprocessing, feature selection, training, and classification. The model construction is shown in Figure 2.
1) Data preprocessing phase
Generally, datasets contain a large number of redundant features, and each data entry contains attributes that represent all the fields of the complete data package. These attributes are available in various forms, such as character types, numeric types, etc. Therefore, it is necessary to transform the unprocessed data into useful information. The data preprocessing stage mainly includes data cleaning, label encoding, and normalization. After the data is preprocessed, the processed data is divided into a training set and a test set in a ratio of 7:3.
Data cleaning: Data cleaning is the process of eliminating or correcting inaccurate, duplicate, or incomplete records and filling in missing values in the data set provided, which is used to improve the accuracy of predictions.
Label encoding: The dataset usually contains categorical variables, which cannot be processed directly by machine learning algorithms, so they need to be transformed into numerical variables.
Normalization: In the original dataset, attributes differ in scale, resulting in significant differences in the magnitude of attribute values. Normalization scales all attribute values in the dataset to the range 0 to 1 and mitigates the adverse effects of these differences on model training.
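The preprocessing steps above can be sketched with stdlib-only helpers; real pipelines typically use library equivalents, so these minimal functions are illustrative:

```python
def label_encode(values):
    """Label encoding: map categorical values to integer codes
    (codes assigned in sorted order for determinism)."""
    codes = {v: i for i, v in enumerate(sorted(set(values)))}
    return [codes[v] for v in values]

def min_max_normalize(column):
    """Normalization: scale a numeric column into [0, 1]."""
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)
    return [(x - lo) / (hi - lo) for x in column]

def train_test_split(rows, train_ratio=0.7):
    """Split preprocessed rows into the 7:3 train/test partition
    used in this model (assumes rows are already shuffled)."""
    cut = int(len(rows) * train_ratio)
    return rows[:cut], rows[cut:]
```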
2) Feature selection phase
Crawling datasets from the web and cleaning the data already provides a degree of basic filtering. However, since network data are usually very large, many redundant features remain that are not readily apparent. Feature selection further processes the dataset before intrusion detection, effectively reducing redundant features and data volume and improving detection accuracy.
In this phase, feature selection is performed on the preprocessed dataset using an intelligent optimization algorithm. Spider wasps are constantly engaged in hunting, nesting, and mating behaviors. When the iteration ends, an optimal subset is finally obtained. The specific flowchart of the ESWO algorithm for feature selection is shown in Figure 3.
3) Training phase
In the training phase, a suitable classifier needs to be chosen. In [26], the authors measured the performance of many classifiers, such as SVM, KNN, XGBoost, and RF. The KNN algorithm offers high accuracy, insensitivity to outliers, and fast computation, so KNN is selected in this paper to establish the classification model. The training set is used as input to train the KNN model.
4) Classification phase
Based on the results of the training phase, the final trained model is used to predict the test set and identify the benign types and various attack types. Finally, the results related to SDN intrusion detection are outputted.
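The KNN decision rule used by the classifier can be sketched as a plain majority vote over Euclidean nearest neighbors (the paper's KNN hyperparameters are not restated here, so k = 3 is an assumed default):

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify sample x by majority vote among its k nearest
    training samples under squared Euclidean distance."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, x)), label)
        for row, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```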
The experimental tests are carried out in a single environment to ensure the objectivity and fairness of the experiments. The experiments were run on an Intel Core i7-13650HX CPU @ 3.00 GHz under Windows 11, using MATLAB R2022b. Eight common benchmark functions were selected, including four unimodal functions (F1–F4) and four multimodal functions (F5–F8). Since the experiments involve a degree of randomness, each test function is run independently 30 times. The benchmark test functions are shown in Table 1.
Function expressions | Dimension | Range | Min |
$ {F}_{1}\left(X\right)={\sum }_{i=1}^{n}{\left({\sum }_{j=1}^{i}{x}_{j}\right)}^{2} $ | 30 | $ \left[-100, 100\right] $ | 0 |
$ {F}_{2}\left(X\right)=\mathrm{max}\left\{\left|{x}_{i}\right|, 1\le i\le n\right\} $ | 30 | $ \left[-100, 100\right] $ | 0 |
$ {F}_{3}\left(X\right)={\sum }_{i=1}^{n-1}\left[100{\left({x}_{i+1}-{x}_{i}^{2}\right)}^{2}+{\left({x}_{i}-1\right)}^{2}\right] $ | 30 | $ \left[-30, 30\right] $ | 0 |
$ {F}_{4}\left(X\right)={\sum }_{i=1}^{n}{\left(\left[{x}_{i}+0.5\right]\right)}^{2} $ | 30 | $ \left[-100, 100\right] $ | 0 |
$ {F}_{5}\left(X\right)={\sum }_{i=1}^{n}\left[{x}_{i}^{2}-10\mathrm{cos}\left(2\pi {x}_{i}\right)+10\right] $ | 30 | $ \left[-5.12, 5.12\right] $ | 0 |
$ {F}_{6}\left(X\right)=-20\mathrm{exp}\left(-0.2\sqrt{\frac{1}{n}{\sum }_{i=1}^{n}{x}_{i}^{2}}\right)-\mathrm{exp}\left(\frac{1}{n}{\sum }_{i=1}^{n}\mathrm{cos}\left(2\pi {x}_{i}\right)\right)+20+e $ | 30 | $ \left[-32, 32\right] $ | 0 |
$ {F}_{7}\left(X\right)=\frac{1}{4000}{\sum }_{i=1}^{n}{x}_{i}^{2}-{\prod }_{i=1}^{n}\mathrm{cos}\left(\frac{{x}_{i}}{\sqrt{i}}\right)+1 $ | 30 | $ \left[-600, 600\right] $ | 0 |
$ {F}_{8}\left(X\right)=\frac{\pi }{n}\left\{10{\mathrm{sin}}^{2}\left(\pi {y}_{1}\right)+{\sum }_{i=1}^{n-1}{\left({y}_{i}-1\right)}^{2}\left[1+10{\mathrm{sin}}^{2}\left(\pi {y}_{i+1}\right)\right]+{\left({y}_{n}-1\right)}^{2}\right\}+{\sum }_{i=1}^{n}u\left({x}_{i}, 10, 100, 4\right) $, $ {y}_{i}=1+\frac{1}{4}\left({x}_{i}+1\right) $, $ u\left({x}_{i}, a, k, m\right)=\left\{\begin{array}{ll}k{\left({x}_{i}-a\right)}^{m}, & {x}_{i} > a\\ 0, & -a\le {x}_{i}\le a\\ k{\left(-{x}_{i}-a\right)}^{m}, & {x}_{i} < -a\end{array}\right. $ | 30 | $ \left[-50, 50\right] $ | 0 |
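As a quick sanity check on Table 1, two of the benchmark functions translate directly into Python (an illustrative translation; the experiments themselves were run in MATLAB):

```python
import numpy as np

def f1(x):
    """F1: rotated hyper-ellipsoid, sum over i of (sum of the first i components)^2."""
    return float(np.sum(np.cumsum(x) ** 2))

def f7(x):
    """F7: Griewank function, a multimodal test with many regular local minima."""
    i = np.arange(1, len(x) + 1)
    return float(np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)

x0 = np.zeros(30)
print(f1(x0), f7(x0))   # both attain their minimum 0 at the origin
```

Evaluating such functions at the known optimum is a cheap way to verify a benchmark implementation before comparing optimizers on it.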
In this paper, the ESWO algorithm is compared with the basic SWO algorithm, WOA, and PSO. For a fair comparison, the iteration counting in SWO and ESWO is changed from per-individual to per-population iterations. The population size is set to 30 and the number of iterations to 1000, and the algorithms are tested on the unimodal and multimodal functions. The final test results are shown in Figure 4 and Table 2.
Function | Type | PSO | WOA | SWO | ESWO |
F1 | Min | 3.39 × 10+02 | 3.87 × 10+03 | 8.47 × 10–140 | 3.77 × 10–160 |
Ave | 3.13 × 10+03 | 1.97 × 10+04 | 1.52 × 10–76 | 1.87 × 10–106 | |
Std | 5.53 × 10+03 | 9.55 × 10+03 | 8.30 × 10–76 | 1.02 × 10–105 | |
F2 | Min | 6.45 × 10+00 | 1.37 × 10+00 | 8.41 × 10–70 | 2.70 × 10–88 |
Ave | 1.49 × 10+01 | 3.71 × 10+01 | 1.62 × 10–42 | 2.55 × 10–58 | |
Std | 4.38 × 10+00 | 2.47 × 10+01 | 8.86 × 10–42 | 1.40 × 10–57 | |
F3 | Min | 2.13 × 10+03 | 2.64 × 10+01 | 2.31 × 10+01 | 2.27 × 10+01 |
Ave | 3.67 × 10+04 | 2.71 × 10+01 | 2.39 × 10+01 | 2.35 × 10+01 | |
Std | 3.80 × 10+04 | 5.91 × 10–01 | 4.33 × 10–01 | 4.14 × 10–01 | |
F4 | Min | 1.46 × 10+02 | 5.17 × 10–03 | 3.98 × 10–08 | 5.50 × 10–08 |
Ave | 5.84 × 10+02 | 6.08 × 10–02 | 8.32 × 10–07 | 1.70 × 10–06 | |
Std | 2.90 × 10+02 | 5.85 × 10–02 | 7.91 × 10–07 | 2.17 × 10–06 | |
F5 | Min | 6.89 × 10+01 | 0.00 × 10+00 | 0.00 × 10+00 | 0.00 × 10+00 |
Ave | 1.14 × 10+02 | 7.58 × 10–15 | 0.00 × 10+00 | 0.00 × 10+00 | |
Std | 2.40 × 10+01 | 2.47 × 10–14 | 0.00 × 10+00 | 0.00 × 10+00 | |
F6 | Min | 5.60 × 10+00 | 4.44 × 10–16 | 4.44 × 10–16 | 4.44 × 10–16 |
Ave | 8.49 × 10+00 | 3.52 × 10–15 | 4.44 × 10–16 | 4.44 × 10–16 | |
Std | 1.42 × 10+00 | 2.42 × 10–15 | 0.00 × 10+00 | 0.00 × 10+00 | |
F7 | Min | 2.35 × 10+00 | 0.00 × 10+00 | 0.00 × 10+00 | 0.00 × 10+00 |
Ave | 5.80 × 10+00 | 6.64 × 10–03 | 0.00 × 10+00 | 0.00 × 10+00 | |
Std | 2.83 × 10+00 | 2.05 × 10–02 | 0.00 × 10+00 | 0.00 × 10+00 | |
F8 | Min | 3.06 × 10+00 | 5.40 × 10–04 | 2.08 × 10–09 | 1.21 × 10–09 |
Ave | 1.26 × 10+01 | 7.15 × 10–03 | 4.07 × 10–08 | 2.18 × 10–08 | |
Std | 7.37 × 10+00 | 6.45 × 10–03 | 4.14 × 10–07 | 1.06 × 10–08 |
The experimental results show that the fitness curve of ESWO improves progressively, avoiding the premature convergence to local optima seen with PSO and WOA. According to the detailed data in Table 2, the standard deviation of ESWO on functions F5–F7 is 0, which indicates that ESWO has strong stability. On the remaining functions, the optimization effect of ESWO is also significantly better than that of the other algorithms. Taken over all eight functions, ESWO achieves a comprehensive performance improvement, and its optimization accuracy improves substantially over the original algorithm.
In summary, the benchmark function tests show that, compared with SWO, ESWO improves the algorithm's exploration ability, overcomes SWO's slow convergence in the later iterations, and further strengthens its optimization-seeking ability.
The quality of a feature subset is determined by the number of selected features and the classification error rate. Therefore, the fitness function is expressed as follows:
$ fitness = \alpha \cdot error+\beta \cdot \frac{\left|Nf\right|}{T} $ | (38) |
where fitness represents the fitness value of an individual candidate solution in the population, error indicates the classification error rate, Nf denotes the number of selected features, T denotes the total number of features, and α and β are weighting parameters set to $ \alpha $ = 0.99 and $ \beta $ = 0.01.
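Eq. (38) is a one-liner in code. The sketch below assumes T is the total feature count, consistent with its role as the denominator of the subset-size ratio:

```python
def fitness(error_rate, n_selected, n_total, alpha=0.99, beta=0.01):
    """Eq. (38): weighted sum of classification error and the
    selected-to-total feature ratio (smaller is better)."""
    return alpha * error_rate + beta * n_selected / n_total

# Example: a subset keeping 6 of 48 features with a 1.2% error rate.
print(fitness(0.012, 6, 48))   # 0.99*0.012 + 0.01*(6/48) = 0.01313
```

With α = 0.99 and β = 0.01, the error term dominates, so the optimizer shrinks the feature subset only when doing so barely affects classification accuracy.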
The SWO algorithm was designed for continuous optimization problems, whereas in feature selection the search space is binary: each feature is either selected or unselected. We therefore use the sigmoid function to convert the continuous solution into a binary one. Each individual spider wasp is denoted as $ {SW}_{i} = \left\{{x}_{i1}, {x}_{i2}{, \dots, x}_{iD}\right\} $, where D is the feature dimension and $ d\in \left\{1, 2, \dots, D\right\} $ indexes the dimensions. When $ {x}_{id} = 1 $, the d-th feature is selected; otherwise, it is not selected. The sigmoid formula is as follows:
$ sigmoid\left({x}_{id}\right) = \frac{1}{1+{e}^{-{x}_{id}}} $ | (39) |
$ {x}_{id} = \left\{\begin{array}{ll}1, & rand > sigmoid\left({x}_{id}\right)\\ 0, & rand\le sigmoid\left({x}_{id}\right)\end{array}\right. $ | (40) |
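Eqs. (39)–(40) amount to thresholding each dimension against a sigmoid-transformed random draw. An illustrative sketch, with the inequality direction taken from Eq. (40):

```python
import math
import random

def sigmoid(x):
    """Eq. (39): squash a continuous coordinate into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def binarize(position, rng=random.random):
    """Eq. (40): dimension d becomes 1 (feature selected) when a uniform
    random number exceeds sigmoid(x_id), and 0 otherwise."""
    return [1 if rng() > sigmoid(x) else 0 for x in position]

mask = binarize([0.5, -2.0, 3.1])   # a binary feature mask, e.g. [0, 1, 0]
```

Passing a fixed `rng` makes the mapping deterministic, which is convenient for testing; in the optimizer itself, a fresh random number is drawn per dimension.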
The classifier is KNN. Classification performance is evaluated by accuracy, recall, precision, and F1-score, all expressed in %.
$ Accuracy = \frac{TP+TN}{TP+TN+FP+FN} $ | (41) |
$ Recall = \frac{TP}{TP+FN} $ | (42) |
$ Precision = \frac{TP}{TP+FP} $ | (43) |
$ F1-score = \frac{2*TP}{2*TP+FP+FN} $ | (44) |
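Eqs. (41)–(44) can be computed directly from the binary confusion-matrix counts; a small sketch:

```python
def metrics(tp, tn, fp, fn):
    """Eqs. (41)-(44): accuracy, recall, precision, and F1-score (in %)
    from true/false positive and negative counts."""
    return {
        "accuracy":  100 * (tp + tn) / (tp + tn + fp + fn),
        "recall":    100 * tp / (tp + fn),
        "precision": 100 * tp / (tp + fp),
        "f1":        100 * 2 * tp / (2 * tp + fp + fn),
    }

m = metrics(tp=90, tn=85, fp=10, fn=15)
print({k: round(v, 3) for k, v in m.items()})
```

Note that Eq. (44) is algebraically the harmonic mean of precision and recall, so F1 always lies between the two.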
In this paper, UCI datasets are used to evaluate the effectiveness of ESWO in dimensionality reduction. In this section, four datasets are selected, namely Ionosphere, WDBC, Heartstatlog, and Sonar. The measurement results of the four datasets are shown in Table 3.
Algorithms | Metrics | Ionosphere | WDBC | Heartstatlog | Sonar |
PSO | Accuracy (%) | 92.381 | 97.647 | 83.951 | 88.710 |
Recall (%) | 96.807 | 95.238 | 86.667 | 93.103 | |
Precision (%) | 90.411 | 98.361 | 84.783 | 84.375 | |
F1-score (%) | 94.286 | 96.774 | 85.714 | 88.525 | |
Computational time (s) | 9.124 | 10.934 | 12.013 | 10.730 | |
WOA | Accuracy (%) | 90.476 | 97.647 | 85.185 | 88.710 |
Recall (%) | 95.522 | 96.825 | 86.667 | 82.759 | |
Precision (%) | 90.141 | 96.825 | 84.783 | 92.308 | |
F1-score (%) | 92.754 | 96.825 | 85.714 | 87.273 | |
Computational time (s) | 4.278 | 5.343 | 5.892 | 4.942 | |
SWO | Accuracy (%) | 92.381 | 97.647 | 83.951 | 90.323 |
Recall (%) | 95.522 | 95.238 | 88.889 | 96.552 | |
Precision (%) | 92.754 | 98.361 | 83.333 | 84.848 | |
F1-score (%) | 94.118 | 96.774 | 86.022 | 90.323 | |
Computational time (s) | 3.318 | 3.959 | 4.149 | 3.612 | |
ESWO | Accuracy (%) | 93.333 | 99.412 | 87.654 | 96.774 |
Recall (%) | 98.507 | 98.413 | 91.111 | 93.103 | |
Precision (%) | 91.667 | 100 | 87.234 | 100 | |
F1-score (%) | 94.964 | 99.200 | 89.130 | 96.429 | |
Computational time (s) | 3.342 | 4.307 | 4.543 | 3.967 |
Based on the measurement results of the four datasets, ESWO is superior to the other algorithms in all four evaluation indicators on the WDBC and Heartstatlog datasets. On the Ionosphere dataset, the accuracy, recall, and F1-score of the ESWO algorithm are higher than those of the other algorithms; although its precision is lower than that of SWO, it is still about 1% higher than that of PSO and WOA. In a comprehensive comparison, ESWO is still the best of the four algorithms. On the Sonar dataset, the recall of ESWO is about 3% lower than that of SWO, but the other indicators are more than 5% higher. In addition, according to the computation time results, the SWO algorithm is faster than PSO and WOA on all four datasets. The computation time of the ESWO algorithm increases slightly relative to SWO but remains faster than PSO and WOA.
In summary, the ESWO algorithm has the best comprehensive performance on the four datasets, and its accuracy and F1-score values are higher than those of the other algorithms, indicating that ESWO can classify the samples accurately and has better classification performance.
The InSDN dataset is a public dataset specifically for SDN intrusion detection, which aims to improve the performance evaluation and research of IDS in the SDN environment [27,28]. The InSDN dataset not only contains data on different attack types but also covers the key components of the SDN platform, making it suitable for a comprehensive assessment of IDS performance.
This section uses the intrusion detection model based on the ESWO algorithm to perform binary and multiclassification experiments on InSDN. The InSDN dataset consists of three files: one contains normal traffic and the other two contain attack traffic. In [27], the authors selected a subset of 48 features from InSDN that are important for detecting attacks, so we use the same 48 features for feature selection. The binary classification experiment is performed first: redundant feature columns are deleted, the label "normal" is mapped to 0 and all attack-type labels to 1, the data is normalized, and more than 10,000 data entries are randomly selected from the dataset for experimental testing.
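The binary labelling step described above can be sketched as follows (an illustrative helper; the function name is ours):

```python
def binarize_labels(labels):
    """Two-class labelling for the binary InSDN experiment:
    'Normal' traffic -> 0, any attack type -> 1."""
    return [0 if str(label).strip().lower() == "normal" else 1 for label in labels]

print(binarize_labels(["Normal", "DoS", "DDoS", "Probe", "Normal"]))  # [0, 1, 1, 1, 0]
```

Collapsing all attack classes into a single positive label is what lets the binary experiment use the standard two-class metrics of Eqs. (41)–(44).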
Table 4 provides the binary classification results for the InSDN dataset. The results reveal that ESWO-IDM outperforms the other IDMs with 98.833% accuracy, 98.976% precision, and 98.946% F1-score. Accuracy and precision improve by 0.2–0.3% over the other algorithms, and F1-score by 0.1–0.25%. Across the four indicators, ESWO is optimal among the four algorithms, and it selects the fewest features, which effectively reduces redundant features and workload and improves detection efficiency. Table 4 also shows the computational time (CT) of binary classification. Due to the multiple strategies incorporated into the ESWO algorithm, its computation time is slightly higher than that of SWO, but it is still 70 seconds faster than PSO and 13 seconds faster than WOA. Therefore, ESWO-IDM retains a fast operation speed. Table 5 shows the model construction and prediction time for different data volumes after the feature selection phase. Because feature selection eliminates many redundant features, ESWO-IDM has the fastest prediction time, and the overall time savings grow as the amount of data increases. These experimental results demonstrate the advantages of the ESWO algorithm for feature selection and show that ESWO-IDM improves the detection efficiency of intrusion detection.
Metrics | PSO | WOA | SWO | ESWO |
Accuracy (%) | 98.667 | 98.567 | 98.683 | 98.833 |
Recall (%) | 99.037 | 98.916 | 98.856 | 98.916 |
Precision (%) | 98.562 | 98.501 | 98.767 | 98.976 |
F1-score (%) | 98.799 | 98.708 | 98.811 | 98.946 |
Number of features | 11 | 10 | 8 | 6 |
Computational time (s) | 106.687 | 49.561 | 34.579 | 36.296 |
Data volumes | PSO | WOA | SWO | ESWO |
3000 | 0.03269 | 0.01511 | 0.01302 | 0.01079 |
6000 | 0.05890 | 0.27954 | 0.02319 | 0.01976 |
10,000 | 0.15484 | 0.03281 | 0.03551 | 0.03002 |
Figures 5–7 show the experimental results on the InSDN dataset. Although the ROC curves and AUC values of the four algorithms are very similar and all close to excellent, ESWO is still slightly better than the others, reflecting the reliability of the detection method. The fitness curve of ESWO also improves steadily, showing an excellent ability to find the optimum: even in the later iterations it still discovers new optimal values, and its final value is the best of the four algorithms. WOA, PSO, and SWO fall into local optima in the later iterations, which indicates the better optimization ability of the ESWO algorithm.
For the multiclassification experiments on the InSDN dataset, the labels normal, DoS, DDoS, probe, Brute Force Attack (BFA), web-attack, Botnet, and U2R are replaced with 0, 1, 2, 3, 4, 5, 6, and 7, respectively. Since the BFA, web-attack, Botnet, and U2R labels have few instances in the dataset, all of them are used for testing; the other labels are sampled proportionally at random, for a total of 10,000 instances. Table 6 shows the results of the InSDN multiclassification experiments. In recall, ESWO-IDM gives the best results in the Normal, DoS, DDoS, and BFA categories, improving on the other models by 0.2–0.5%, and ranks first overall on this metric. In precision, ESWO-IDM gives the best results in the DoS, DDoS, and BFA categories, improving by 0.3–0.8%; SWO-IDM is superior in the other categories, and overall ESWO-IDM is close to SWO-IDM, with both outperforming the other models. In F1-score, ESWO-IDM has the best results in the Normal, DoS, DDoS, and BFA categories, with an improvement of 0.3–0.4%, and its overall results are again the best of the four models. For accuracy, ESWO-IDM reaches 96.759%, 0.15% higher than the second-ranked SWO-IDM. Table 7 shows the CT of the four IDMs: SWO-IDM has the shortest CT, followed closely by ESWO-IDM, while the CT of PSO-IDM and WOA-IDM is much higher. Thus, ESWO-IDM improves overall performance at the cost of a slight increase in CT.
Metrics | Methods | Normal | DoS | DDoS | Probe | BFA | Web-Attack | Botnet | U2R |
Recall (%) | PSO | 98.241 | 96.013 | 99.344 | 93.352 | 91.395 | 76.744 | 96.0 | 0 |
Recall (%) | WOA | 97.918 | 96.013 | 99.558 | 93.37 | 91.608 | 76.744 | 96.0 | 0 |
Recall (%) | SWO | 97.99 | 97.651 | 99.559 | 93.407 | 92.217 | 78.571 | 96.0 | 0 |
Recall (%) | ESWO | 98.308 | 97.987 | 99.78 | 93.132 | 92.272 | 76.744 | 96.0 | 0 |
Precision (%) | PSO | 98.562 | 95.695 | 99.342 | 91.576 | 95.157 | 60.0 | 100 | NaN |
Precision (%) | WOA | 98.366 | 95.695 | 98.684 | 91.848 | 95.157 | 60.0 | 100 | NaN |
Precision (%) | SWO | 98.758 | 96.358 | 99.123 | 92.391 | 94.673 | 60.0 | 100 | NaN |
Precision (%) | ESWO | 98.758 | 96.689 | 99.561 | 92.12 | 95.4 | 60.0 | 100 | NaN |
F1-score (%) | PSO | 98.401 | 95.854 | 99.670 | 92.455 | 93.238 | 67.347 | 97.959 | NaN |
F1-score (%) | WOA | 98.142 | 95.854 | 99.119 | 92.603 | 93.349 | 67.347 | 97.959 | NaN |
F1-score (%) | SWO | 98.372 | 97.0 | 99.341 | 92.896 | 93.429 | 68.041 | 97.959 | NaN |
F1-score (%) | ESWO | 98.533 | 97.333 | 99.671 | 92.623 | 93.810 | 67.347 | 97.959 | NaN |
Accuracy (%) | PSO | 96.476 |
Accuracy (%) | WOA | 96.287 |
Accuracy (%) | SWO | 96.602 |
Accuracy (%) | ESWO | 96.759 |
Methods | PSO | WOA | SWO | ESWO |
Computational time (s) | 87.980 | 46.033 | 26.758 | 30.253 |
In summary, the IDM constructed with the ESWO algorithm improves all four metrics, with significant gains in accuracy, recall, and F1-score, showing that ESWO performs better on multiclassification problems, with fewer classification errors and more accurate predictions. For the normal, DoS, and DDoS categories, which have large amounts of data in the original dataset, ESWO-IDM achieves the best accuracy, recall, and precision, demonstrating its advantage in detecting high-volume attacks. For the BFA, web-attack, and Botnet categories, which have little data and therefore demand high F1-score and recall, ESWO-IDM again presents higher values than the other algorithmic models on these two metrics. ESWO-IDM can also accurately detect BFA and Botnet attacks, which have fewer instances and more complex features. Compared with the other models, the optimization effect is evident, effectively improving the efficiency of intrusion detection in SDN.
In order to improve the detection efficiency of intrusion detection in SDN networks, this paper proposes the ESWO algorithm, which incorporates multiple strategies, applies it to feature selection to pick the optimal subset of features, and constructs the ESWO-IDM for SDN intrusion detection. First, the global search ability of ESWO is verified using benchmark test functions. Subsequently, the superior classification ability of the ESWO algorithm in feature selection is verified using UCI datasets. Finally, the constructed ESWO-IDM is subjected to binary and multiclassification experiments on the InSDN dataset, in which it is compared with intrusion detection models based on SWO, PSO, and WOA. In the binary classification experiments, the proposed ESWO-IDM achieves 98.833% accuracy, 98.976% precision, and 98.946% F1-score, all higher than those of the compared IDMs. In the multiclassification experiments, the results of ESWO-IDM are higher than those of the other IDMs in the normal, DoS, DDoS, and BFA categories, and its comprehensive detection ability is optimal. The experimental results indicate that the proposed ESWO-IDM improves SDN intrusion detection. As the InSDN dataset is relatively small, further research and improvement are needed in subsequent work. Future work will conduct experimental simulations in an SDN environment to further test and improve the detection capability of the proposed ESWO-IDM.
The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.
This work has been supported by the National Natural Science Foundation of China under Grant 61602162 and the Hubei Provincial Science and Technology Plan Project under Grant 2023BCB041.
The authors declare there is no conflict of interest.
Function expressions | Dimension | Range | Min |
$ {F}_{1}\left(X\right)={\sum }_{i=1}^{n}{\left({\sum }_{j=1}^{i}{X}_{j}\right)}^{2} $ | 30 | $ \left[\mathrm{100,100}\right] $ | 0 |
$ {F}_{2}\left(X\right)=\mathrm{m}\mathrm{a}\mathrm{x}\left\{{X}_{i}\right|, 1\ll i\ll n\} $ | 30 | $ \left[-\mathrm{100,100}\right] $ | 0 |
$ {F}_{3}\left(X\right)={\sum }_{i=1}^{n-1}\left[100{\left({x}_{i+1}-{x}_{i}^{2}\right)}^{2}+{\left({x}_{i}-1\right)}^{2}\right] $ | 30 | $ [-\mathrm{30, 30}] $ | 0 |
$ {F}_{4}\left(X\right)={\sum }_{i=1}^{n}{\left([{x}_{i}+0.5]\right)}^{2} $ | 30 | $ \left[-\mathrm{100,100}\right] $ | 0 |
$ {F}_{5}\left(X\right)=\left[{x}_{i}^{2}-10cos\left(2\pi {x}_{i}\right)+10\right] $ | 30 | $ [-\mathrm{5.12, 5.12}] $ | 0 |
$ {F}_{6}\left(X\right)=-20exp\left(-0.2\sqrt{\frac{1}{n}{\sum }_{i=1}^{n}{x}_{i}^{2}}\right)-exp\left(\frac{1}{n}{\sum }_{i=1}^{n}cos\left(2\pi {x}_{i}\right)\right)+20+e $ | 30 | $ [-\mathrm{32, 32}] $ | 0 |
$ {F}_{7}\left(X\right)=\frac{1}{4000}{\sum }_{i=1}^{n}{x}_{i}^{2}-{\prod }_{i=1}^{n}cos\left(\frac{{x}_{i}}{\sqrt{i}}\right)+1 $ | 30 | $ [-\mathrm{600,600}] $ | 0 |
$ {F}_{8}\left(X\right)=\frac{\pi }{n}\{10{sin}^{2}\left(\pi {y}_{i}\right)+{\sum }_{i=1}^{n-1}{\left({y}_{i}-1\right)}^{2}\left[1+10{sin}^{2}\left(\pi {y}_{i+1}\right)\right]+{({y}_{n}-1)}^{2}\}+{\sum }_{i=1}^{n}u\left({X}_{i}, \mathrm{10,100, 4}\right), $ $ {y}_{i}=1+\frac{1}{4}\left({X}_{i}+1\right) $ $ u\left({X}_{i}, a, k, m\right)=\left\{k(Xi−a)m,Xi>a0,−a≤Xi≤ak(−Xi−a)m,Xi<−a \right. $ |
30 | $ \left[-\mathrm{50, 50}\right] $ | 0 |
Function | Type | PSO | WOA | SWO | ESWO |
F1 | Min | 3.39 × 10+02 | 3.87 × 10+03 | 8.47 × 10–140 | 3.77 × 10–160 |
Ave | 3.13 × 10+03 | 1.97 × 10+04 | 1.52 × 10–76 | 1.87 × 10–106 | |
Std | 5.53 × 10+03 | 9.55 × 10+03 | 8.30 × 10–76 | 1.02 × 10–105 | |
F2 | Min | 6.45 × 10+00 | 1.37 × 10+00 | 8.41 × 10–70 | 2.70 × 10–88 |
Ave | 1.49 × 10+01 | 3.71 × 10+01 | 1.62 × 10–42 | 2.55 × 10–58 | |
Std | 4.38 × 10+00 | 2.47 × 10+01 | 8.86 × 10–42 | 1.40 × 10–57 | |
F3 | Min | 2.13 × 10+03 | 2.64 × 10+01 | 2.31 × 10+01 | 2.27 × 10+01 |
Ave | 3.67 × 10+04 | 2.71 × 10+01 | 2.39 × 10+01 | 2.35 × 10+01 | |
Std | 3.80 × 10+04 | 5.91 × 10–01 | 4.33 × 10–01 | 4.14 × 10–01 | |
F4 | Min | 1.46 × 10+02 | 5.17 × 10–03 | 3.98 × 10–08 | 5.50 × 10–08 |
Ave | 5.84 × 10+02 | 6.08 × 10–02 | 8.32 × 10–07 | 1.70 × 10–06 | |
Std | 2.90 × 10+02 | 5.85 × 10–02 | 7.91 × 10–07 | 2.17 × 10–06 | |
F5 | Min | 6.89 × 10+01 | 0.00 × 10+00 | 0.00 × 10+00 | 0.00 × 10+00 |
Ave | 1.14 × 10+02 | 7.58 × 10–15 | 0.00 × 10+00 | 0.00 × 10+00 | |
Std | 2.40 × 10+01 | 2.47 × 10–14 | 0.00 × 10+00 | 0.00 × 10+00 | |
F6 | Min | 5.60 × 10+00 | 4.44 × 10–16 | 4.44 × 10–16 | 4.44 × 10–16 |
Ave | 8.49 × 10+00 | 3.52 × 10–15 | 4.44 × 10–16 | 4.44 × 10–16 | |
Std | 1.42 × 10+00 | 2.42 × 10–15 | 0.00 × 10+00 | 0.00 × 10+00 | |
F7 | Min | 2.35 × 10+00 | 0.00 × 10+00 | 0.00 × 10+00 | 0.00 × 10+00 |
Ave | 5.80 × 10+00 | 6.64 × 10–03 | 0.00 × 10+00 | 0.00 × 10+00 | |
Std | 2.83 × 10+00 | 2.05 × 10–02 | 0.00 × 10+00 | 0.00 × 10+00 | |
F8 | Min | 3.06 × 10+00 | 5.40 × 10–04 | 2.08 × 10–09 | 1.21 × 10–09 |
Ave | 1.26 × 10+01 | 7.15 × 10–03 | 4.07 × 10–08 | 2.18 × 10–08 | |
Std | 7.37 × 10+00 | 6.45 × 10–03 | 4.14 × 10–07 | 1.06 × 10–08 |
Algorithms | Metrics | Ionosphere | WDBC | Heartstatlog | Sonar
PSO | Accuracy (%) | 92.381 | 97.647 | 83.951 | 88.710
| Recall (%) | 96.807 | 95.238 | 86.667 | 93.103
| Precision (%) | 90.411 | 98.361 | 84.783 | 84.375
| F1-score (%) | 94.286 | 96.774 | 85.714 | 88.525
| Computational time (s) | 9.124 | 10.934 | 12.013 | 10.730
WOA | Accuracy (%) | 90.476 | 97.647 | 85.185 | 88.710
| Recall (%) | 95.522 | 96.825 | 86.667 | 82.759
| Precision (%) | 90.141 | 96.825 | 84.783 | 92.308
| F1-score (%) | 92.754 | 96.825 | 85.714 | 87.273
| Computational time (s) | 4.278 | 5.343 | 5.892 | 4.942
SWO | Accuracy (%) | 92.381 | 97.647 | 83.951 | 90.323
| Recall (%) | 95.522 | 95.238 | 88.889 | 96.552
| Precision (%) | 92.754 | 98.361 | 83.333 | 84.848
| F1-score (%) | 94.118 | 96.774 | 86.022 | 90.323
| Computational time (s) | 3.318 | 3.959 | 4.149 | 3.612
ESWO | Accuracy (%) | 93.333 | 99.412 | 87.654 | 96.774
| Recall (%) | 98.507 | 98.413 | 91.111 | 93.103
| Precision (%) | 91.667 | 100 | 87.234 | 100
| F1-score (%) | 94.964 | 99.200 | 89.130 | 96.429
| Computational time (s) | 3.342 | 4.307 | 4.543 | 3.967
Metrics | PSO | WOA | SWO | ESWO |
Accuracy (%) | 98.667 | 98.567 | 98.683 | 98.833 |
Recall (%) | 99.037 | 98.916 | 98.856 | 98.916 |
Precision (%) | 98.562 | 98.501 | 98.767 | 98.976 |
F1-score (%) | 98.799 | 98.708 | 98.811 | 98.946 |
Number of features | 11 | 10 | 8 | 6 |
Computational time (s) | 106.687 | 49.561 | 34.579 | 36.296
Data volumes | PSO | WOA | SWO | ESWO |
3000 | 0.03269 | 0.01511 | 0.01302 | 0.01079 |
6000 | 0.05890 | 0.27954 | 0.02319 | 0.01976 |
10,000 | 0.15484 | 0.03281 | 0.03551 | 0.03002 |
Metrics | Algorithms | Normal | DoS | DDoS | Probe | BFA | Web-Attack | BOTNET | U2R
Recall (%) | PSO | 98.241 | 96.013 | 99.344 | 93.352 | 91.395 | 76.744 | 96.0 | 0
| WOA | 97.918 | 96.013 | 99.558 | 93.37 | 91.608 | 76.744 | 96.0 | 0
| SWO | 97.99 | 97.651 | 99.559 | 93.407 | 92.217 | 78.571 | 96.0 | 0
| ESWO | 98.308 | 97.987 | 99.78 | 93.132 | 92.272 | 76.744 | 96.0 | 0
Precision (%) | PSO | 98.562 | 95.695 | 99.342 | 91.576 | 95.157 | 60.0 | 100 | NaN
| WOA | 98.366 | 95.695 | 98.684 | 91.848 | 95.157 | 60.0 | 100 | NaN
| SWO | 98.758 | 96.358 | 99.123 | 92.391 | 94.673 | 60.0 | 100 | NaN
| ESWO | 98.758 | 96.689 | 99.561 | 92.12 | 95.4 | 60.0 | 100 | NaN
F1-score (%) | PSO | 98.401 | 95.854 | 99.670 | 92.455 | 93.238 | 67.347 | 97.959 | NaN
| WOA | 98.142 | 95.854 | 99.119 | 92.603 | 93.349 | 67.347 | 97.959 | NaN
| SWO | 98.372 | 97.0 | 99.341 | 92.896 | 93.429 | 68.041 | 97.959 | NaN
| ESWO | 98.533 | 97.333 | 99.671 | 92.623 | 93.810 | 67.347 | 97.959 | NaN
Accuracy (%, overall) | PSO | 96.476
| WOA | 96.287
| SWO | 96.602
| ESWO | 96.759
Methods | PSO | WOA | SWO | ESWO |
Computational time (s) | 87.980 | 46.033 | 26.758 | 30.253 |