
Numerous limitations of shot-based and content-based key-frame extraction approaches have encouraged the development of cluster-based algorithms. This paper proposes an Optimal Threshold and Maximum Weight (OTMW) clustering approach that enables accurate and automatic video summarization. First, the video content is analyzed using image color, texture and information complexity, and a video feature dataset is constructed. A Golden Section method is then proposed to determine the optimal solution of the threshold function. The initial cluster centers and the cluster number k are obtained automatically by the improved clustering algorithm, and k clusters of video frames are produced with the K-MEANS algorithm. The representative frame of each cluster is extracted using the Maximum Weight method, yielding an accurate video summarization. The proposed approach is tested on 16 multi-type videos; the averages of the key-frame quality evaluation indices Fidelity and Ratio are 96.11925 and 97.128, respectively, and the extracted key-frames are consistent with human visual judgement. Compared with several state-of-the-art cluster-based algorithms, Fidelity is increased by 12.49721, 10.86455, 10.62984 and 10.4984375, respectively, and Ratio is increased by 1.958 on average with small fluctuations. The experimental results demonstrate the advantage of the proposed solution over several related baselines on sixteen diverse videos and validate that the approach can accurately extract video summaries from multi-type videos.
Citation: Yunyun Sun, Peng Li, Zhaohui Jiang, Sujun Hu. Feature fusion and clustering for key frame extraction[J]. Mathematical Biosciences and Engineering, 2021, 18(6): 9294-9311. doi: 10.3934/mbe.2021457
The traveling salesman problem (TSP) is one of the most intensively studied combinatorial optimization problems in graph theory and operations research. It is believed that the TSP was first studied by Hamilton and Kirkman [1]. Since then, a wide variety of exact and approximation algorithms have been proposed.
Given a set of nodes (cities) and the pairwise costs (or travel distances), the TSP is to find the cheapest cycle that visits every city and returns to the starting point. Any known algorithm that guarantees the shortest tour has exponential worst-case complexity, and no efficient exact algorithm is known to date, since the TSP is NP-hard.
The TSP naturally finds many applications in transportation and logistics, such as arranging school bus routes to pick up children [2], scheduling stacker cranes in warehouses [3], and delivering ordered meals to customers [4]. Slightly modified, the TSP can also be applied to problems that do not look like path-finding problems at first sight. For example, in the DNA sequencing problem [5], the nodes represent DNA fragments, and the edge cost is defined as a similarity measure between fragments.
Various methods have been proposed to solve the TSP, such as brute-force enumeration, dynamic programming (DP) [6], and the Hopfield neural network (HNN) [7]. Brute-force enumeration, the simplest method, has time complexity O(n!). DP decomposes the problem into subproblems, reducing the time complexity to O(n^2 2^n) [6]. As an intelligent method, the HNN [7] operates on a permutation matrix that is mapped to a legal solution of the TSP, and minimizes an energy function. The drawback of this method is that it may suffer from slow convergence and low accuracy when used to solve the TSP [8].
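The O(n^2 2^n) dynamic program is the classic Held-Karp algorithm. As an illustration (not taken from this paper), a minimal sketch follows; the 4-city cost matrix is a hypothetical instance chosen only for the demo.

```python
from itertools import combinations

def held_karp(cost):
    """Held-Karp dynamic program for the TSP: O(n^2 * 2^n) time.

    cost[i][j] is the travel cost from city i to city j.
    Returns the length of the shortest Hamiltonian cycle that starts
    and ends at city 0.
    """
    n = len(cost)
    # dp[(S, j)] = cheapest way to start at 0, visit every city in the
    # frozenset S exactly once, and end at city j (j in S, 0 not in S).
    dp = {(frozenset([j]), j): cost[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in S:
                dp[(S, j)] = min(dp[(S - {j}, k)] + cost[k][j]
                                 for k in S if k != j)
    full = frozenset(range(1, n))
    return min(dp[(full, j)] + cost[j][0] for j in range(1, n))

# Hypothetical symmetric 4-city instance.
cost = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
print(held_karp(cost))  # 80 (tour 0-1-3-2-0)
```

Even at this size the exponential growth of the subset table is visible, which motivates the metaheuristics discussed next.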
Using swarm intelligence [9,10] to address the TSP has become a new trend in the past decades. Many metaheuristics have been reported to excel at solving the TSP, such as the genetic algorithm (GA) [11,12], particle swarm optimization (PSO) [13,14], and ant colony optimization (ACO) [15]. ACO was proposed by Marco Dorigo in 1992, inspired by the food-searching behavior of ant colonies.
ACO is one of the most frequently studied algorithms for solving the TSP. Fevrier Valdez et al. [16] implemented the elitist ant system (EAS) and the rank-based ant system (ASrank); Gao Shupeng et al. [17] proposed a pheromone initialization strategy; Zhang Yuru et al. [18] developed an improved 2-opt and a hybrid algorithm based on ACO; Dewantoro et al. [19] used the tabu search algorithm as the local search in ACO.
In recent years, as researchers made slow progress in developing better methods for the TSP, some shifted their interest to extended problems, such as the vehicle routing problem (VRP) and the multiple traveling salesman problem (MTSP). In the MTSP, multiple salesmen each visit a certain number of cities once and only once and then return to their original cities, so that the total cost is minimized. The MTSP is closely related to optimization problems such as task assignment [20].
Optimization problems with multiple solutions widely exist in the real world, and the TSP is no exception. In the TSP, the salesman prefers to have several optimal solutions at hand in case some of them become unavailable. Ting Huang et al. [21] proposed a benchmark to study the performance of algorithms for the multi-solution TSP (MSTSP), though few algorithms had been proposed at that time. NMA [22] integrates niching strategies with the memetic algorithm to address the MSTSP. NGA [21] groups related individuals and performs GA operations, including crossover and mutation, within each group. NACS [23] is based on ACO, in which ants are guided to search different areas through multiple pheromone matrices.
Since ACO is among the most successful metaheuristics reported for the TSP, we also use it in this paper as the base optimization method for the MSTSP. We use the 2-opt strategy to improve the local search ability: it provides an effective search operation while requiring less computation than 3-opt or higher. We propose a new distance metric named the "circular Jaccard distance" to replace other frequently used distance metrics, which brings a considerable improvement in optimization performance. The circular Jaccard distance is defined as the minimal Jaccard distance between two paths over transformations such as flipping and circular shifting. We test the proposed method on the benchmark of 25 MSTSP instances, with two evaluation criteria designed for the MSTSP. Experimental results indicate that the proposed method substantially improves the performance of multi-solution optimization.
The rest of the article is organized as follows. Section 2 introduces the TSP, ACO, and how to solve the TSP using ACO. In Section 3, we propose a novel niching strategy, and introduce the 2-opt algorithm and the circular Jaccard distance into it, thus obtaining the CJD-MSO algorithm. Section 4 describes the MSTSP benchmarks and evaluates CJD-MSO experimentally. Section 5 concludes the paper.
In this section, we first introduce TSP and the ACO algorithm. Then we introduce how to solve the TSP using ACO algorithm. Finally, we introduce the basic multi-solution optimization algorithm for MSTSP.
Let G=(V,E) be a graph with vertex set V and edge set E={(x,y)|x,y∈V}. Each edge e∈E is assigned a cost ce. If the graph is undirected, the cost of (x,y) is equal to the cost of (y,x) for all (x,y)∈E; for a directed graph this equality need not hold.
Without loss of generality, we suppose graph G is a complete graph. Let the set of vertices be V={v1,v2,...,vn}. The cost matrix is given by C=(cij)n×n, where the cost of the edge from vi to vj is denoted cij.
In the context of the traveling salesman problem, the vertexes correspond to cities and the edges correspond to the paths between those cities. Let H be the set of all Hamiltonian cycles in G, i.e., cycles that visit each vertex exactly once except for the starting point. The traveling salesman problem (TSP) is to find a Hamiltonian cycle h∈H such that the total cost of the cycle is minimized.
The TSP may be formulated as an integer linear programming (ILP) problem [24]. Let
$$x_{ij}=\begin{cases}1 & \text{the path goes from city } i \text{ to city } j\\ 0 & \text{otherwise.}\end{cases}$$
For i=0,1,⋯,n, define ui as an artificial variable. Then TSP can be rewritten as the following ILP problem:
$$\begin{aligned}
\min\quad & \sum_{i=0}^{n}\sum_{j=0,\,j\neq i}^{n} c_{ij}x_{ij}\\
\text{s.t.}\quad & 0\le x_{ij}\le 1, && i,j=0,\dots,n\\
& u_i\in\mathbb{Z}, && i=0,\dots,n\\
& \sum_{i=0,\,i\neq j}^{n} x_{ij}=1, && j=0,\dots,n\\
& \sum_{j=0,\,j\neq i}^{n} x_{ij}=1, && i=0,\dots,n\\
& u_i-u_j+nx_{ij}\le n-1, && 1\le i\neq j\le n
\end{aligned} \tag{2.1}$$
The first set of equalities requires that each city is arrived at from exactly one other city; the second set requires that each city is the departure point to exactly one other city. The last constraints, with the artificial variables ui, ensure that the solution is a single Hamiltonian cycle. The ILP can be relaxed and solved as a linear program (LP) using the simplex method. Other exact solution methods include the cutting plane method [25] and branch-and-cut [26].
The exact solution methods suffer from high computational complexity. Therefore, metaheuristics such as ACO and GA have been suggested for addressing the TSP. The ant colony optimization (ACO) algorithm is a swarm intelligence optimization algorithm inspired by the foraging behavior of ants [15].
When the ants search for food, they wander around the nest randomly. If an ant finds food, it carries the food back to the nest and lays down pheromones along the road. Other ants nearby can perceive the pheromone and follow the path, thus increasing the pheromone density. At the same time, the pheromone continually evaporates. The longer an ant takes to travel along a path, the more the pheromone evaporates; hence the pheromone density becomes higher on the shorter path than on the longer one. Other ants tend to take the shorter path and lay down new pheromones. Pheromones thus attract more and more ants onto the shorter paths, until the colony finally finds the shortest path.
Figure 1 illustrates two scenarios of how ants choose paths through pheromones. Figure 1(a) contains two paths with the same length; it can be observed that most ants chose the upper path. The main reason is that once the upper path is first taken by some ant, that ant deposits pheromone on it, which attracts other ants to choose the upper path. As the number of ants choosing the upper path increases, the pheromone density on it becomes higher than on the lower path, and still more ants tend to choose the upper path.
Figure 1(b) shows that most ants prefer the shorter path. Since the accumulation rate of pheromone on the short path is relatively high, the higher density of pheromone on the short path encourages more ants to choose the short path between the nest and the food.
ACO is a popular metaheuristic for addressing the TSP [27]. Next, we show how to use ACO to solve the TSP. At the beginning, m ants are randomly placed in n cities, and the initial pheromone values on all paths between cities are equal. Before choosing the next city, ant k (k=1,2,⋯,m) calculates the transition probability according to Eq (2.2).
$$P_k(i,j)=\begin{cases}\dfrac{[\tau(i,j)]^{\alpha}[\eta(i,j)]^{\beta}}{\sum_{u\in J_k(i)}[\tau(i,u)]^{\alpha}[\eta(i,u)]^{\beta}} & j\in J_k(i)\\[2mm] 0 & \text{otherwise,}\end{cases} \tag{2.2}$$
where Pk(i,j) is the probability that ant k moves from city i to city j; α controls the influence of the pheromone; β controls the role of the heuristic information; τ(i,j) is the pheromone on edge (i,j); η(i,j) is the heuristic factor for moving from city i to city j; and Jk(i) is the set of candidate cities that ant k is allowed to visit next.
The set Rk records the vertexes that ant k has visited. When the m ants have completed the tour and returned to the departure vertex, the total cost of the n edges traveled by each ant is calculated. Then the minimum cost is saved, and the pheromone on each edge is updated. During the iteration, the amount of pheromone remaining on a path after evaporation is calculated as follows:
$$\tau(i,j)=(1-\rho)\cdot\tau(i,j)+\sum_{k=1}^{m}\Delta\tau_k(i,j), \tag{2.3}$$
where ρ, between 0 and 1, is the pheromone volatilization coefficient, and Δτk(i,j) is the pheromone released by the k-th ant during this tour, defined as:
$$\Delta\tau_k(i,j)=\begin{cases}(C_k)^{-1} & (i,j)\in R_k\\ 0 & \text{otherwise}\end{cases} \tag{2.4}$$
where Ck represents the length of the path traveled by ant k.
Algorithm 1 presents the pseudocode.
Algorithm 1 Solving TSP using the ACO algorithm
Input: Number of ants m, pheromone factor α, heuristic function factor β, pheromone volatilization coefficient ρ, and maximum number of iterations
Output: An optimal solution
1: Initialize ant colony parameters and pheromone matrix
2: while the stopping criteria are not met do
3:   Randomly place each ant at a different starting point
4:   For each ant k (k=1,2,⋯,m), choose the next vertex to visit using the transition probability formula, until all ants have visited all the vertexes
5:   Calculate the path length L traveled by each ant, and save the optimal solution (shortest path)
6:   Update the pheromone concentration on all paths
7: end while
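As a concrete illustration of Algorithm 1 and Eqs (2.2)-(2.4), a minimal sketch follows. The parameter defaults are assumptions chosen for the demo, not values from the paper, and the cost matrix is assumed symmetric.

```python
import random

def aco_tsp(cost, n_ants=20, n_iters=100, alpha=1.0, beta=3.0, rho=0.5, seed=0):
    """Minimal ACO for the TSP following Eqs (2.2)-(2.4).

    cost[i][j] is the (symmetric) travel cost between cities i and j.
    Returns (best_length, best_tour).
    """
    rng = random.Random(seed)
    n = len(cost)
    tau = [[1.0] * n for _ in range(n)]            # pheromone matrix
    eta = [[0.0 if i == j else 1.0 / cost[i][j]    # heuristic factor 1/d
            for j in range(n)] for i in range(n)]
    best_len, best_tour = float("inf"), None

    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                cand = list(unvisited)
                # Transition probabilities, Eq (2.2).
                weights = [tau[i][j] ** alpha * eta[i][j] ** beta for j in cand]
                j = rng.choices(cand, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            length = sum(cost[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        # Evaporation, Eq (2.3), then pheromone deposit, Eq (2.4).
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1.0 - rho
        for length, tour in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best_len, best_tour
```

A production implementation would add candidate lists and elitist updates; this sketch only mirrors the equations in the text.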
Most swarm intelligence optimization algorithms are developed to solve the single-solution optimization problems. It is necessary to introduce some strategies to make these algorithms suitable for solving multi-solution optimization problems.
The niching method is one of the most widely used strategies. Niching methods are a category of techniques that prevent the population from converging to a single optimum by maintaining multiple niches. It should be mentioned that some improved ACO variants and various other metaheuristics may have good exploration mechanisms. Although good exploration ability promotes diversity of the solutions, it is still preferable to use a niching method to maintain multiple niches.
The classic niching methods were developed in the early 70s such as crowding method [28]. After that, various niching methods were developed including fitness sharing [29], clearing [30], RTS [31], speciation [32] and clustering [33,34].
Fitness sharing [29] is inspired by the resource-sharing phenomenon in nature. The resources in a niche are limited, so individuals have to share them with the other individuals occupying the same niche. When there are too many individuals in a niche, each individual's fitness is reduced, thus reducing the number of individuals; conversely, when a niche contains few individuals, their fitness increases and the niche attracts more individuals. Fitness is thus adjusted by the population size in a niche, which facilitates finding multiple optimal solutions.
Crowding [28] is a technique that maintains diversity during the iterations through careful population replacement. First, a number of individuals, determined by a crowding factor (CF), are selected from the parent generation. Then each new individual replaces the most similar of the selected parent individuals. Since the population is initialized with an even distribution, this distribution can be maintained during the iterations. Algorithm 2 depicts the crowding procedure.
Algorithm 2 Crowding
Input: Population NP
Output: A set of optimal solutions
1: Randomly generate individuals of NP and a set of CF values
2: while the stopping criteria are not met do
3:   Calculate the fitness of each individual
4:   Generate candidate individuals through the basic operations of the algorithm
5:   Randomly select CF individuals from the parents
6:   Compare the candidate individual with the closest of the CF individuals, and replace that individual if the candidate has better fitness
7: end while
Clearing [30] is a strategy that is easy to implement. In each iteration, the current best individual in the population is selected as a center, and all individuals within a specified clearing radius are removed from the population. The procedure repeats until the population is empty, finally yielding a set of optimal solutions.
The principle of speciation is to divide the population into several niches, and the subpopulation in each niche evolves in parallel. Algorithm 3 presents the pseudocode of speciation.
Algorithm 3 Speciation
Input: Population and niching radius σ
Output: A set of optimal solutions
1: Initialize population
2: Calculate the fitness of the population and sort individuals by fitness value
3: while the population is not empty do
4:   Find the best individual (seed) in the population
5:   Find all individuals whose distance to the seed is less than σ, and form a subpopulation with the seed
6:   Remove the subpopulation from the population
7: end while
The goal of clustering [33] is to group individuals into clusters such that individuals within a cluster are more similar to each other than to individuals in other clusters. The K-means algorithm is a typical clustering algorithm used in niching, where K is the number of clusters [34]. Algorithm 4 gives the pseudocode of the K-means algorithm for niching.
Algorithm 4 K-means algorithm for niching
Input: Population NP and cluster number K
Output: K species
1: Initialize population
2: Randomly select K individuals from the population as the centers
3: while the stopping criteria are not met do
4:   Calculate the distance between the other individuals in the population and the K centers
5:   Assign each individual to the nearest cluster according to distance
6:   Recalculate and update the centers
7: end while
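Algorithm 4 can be sketched as follows for a population of real-valued points, assuming Euclidean distance as the similarity metric (the paper applies clustering to individuals in general; the function name and interface here are ours).

```python
import random

def kmeans_niching(population, k, n_iters=50, seed=0):
    """Group a population of points into k species (cf. Algorithm 4).

    population: list of equal-length tuples of floats.
    Returns a list of k clusters (lists of points).
    """
    rng = random.Random(seed)
    # Step 2: randomly pick k individuals as initial centers.
    centers = [list(p) for p in rng.sample(population, k)]

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    for _ in range(n_iters):
        # Steps 4-5: assign every individual to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in population:
            nearest = min(range(k), key=lambda c: dist2(p, centers[c]))
            clusters[nearest].append(p)
        # Step 6: recompute each center as the mean of its cluster.
        for c in range(k):
            if clusters[c]:
                centers[c] = [sum(col) / len(col) for col in zip(*clusters[c])]
    return clusters
```

With two well-separated groups of points, the clusters returned correspond to the two niches regardless of which individuals are sampled as initial centers.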
The similarity metric plays an essential role in multi-solution optimization. As shown in Algorithm 5, whether a new individual is discarded depends on the similarity between the new path and the old one. Next, we introduce several similarity metrics that may be used in multi-solution optimization.
Algorithm 5 The multi-solution optimization algorithm with distancing
Input: The number of ants, iteration bound, α, β, and ρ
Output: A set of optimal solutions
1: Initialize population and pheromone matrix
2: Construct a path through pseudo-random selection rules to obtain the first-generation parent population
3: while the stopping criteria are not met do
4:   Discard new individuals that have a high similarity to the previous one, and form a new offspring population from the remaining individuals
5:   Update the pheromone matrix
6:   Calculate the fitness of the parent population together with the offspring population, and sort them
7:   Select a set of good individuals from the queue head to form a new parent population
8: end while
The Pearson correlation coefficient (PCC) [35], or the bivariate correlation, is a widely used measure of the similarity between vectors. For two variables X and Y, the PCC can be formulated as follows:
$$\rho(X,Y)=\frac{E[(X-\mu_X)(Y-\mu_Y)]}{\sqrt{\sum_{i=1}^{n}(X_i-\mu_X)^2}\,\sqrt{\sum_{i=1}^{n}(Y_i-\mu_Y)^2}} \tag{2.5}$$
where E is the mathematical expectation; μX is the expectation of the n-dimensional vector X; μY is the expectation of Y.
The greater the absolute value of the correlation coefficient, the stronger the correlation; a PCC of 0 indicates the weakest correlation.
Distance correlation coefficient (DCC) [36], or distance covariance, is a measure of dependence between two arbitrary vectors, not necessarily equal in dimension. The DCC is zero if and only if the vectors are independent. The formula for DCC is as follows:
$$\rho_{XY}=\frac{E[(X-E(X))(Y-E(Y))]}{\sqrt{D(X)}\,\sqrt{D(Y)}}, \tag{2.6}$$
where E is the mathematical expectation, D is the variance. The similarity between X and Y is closer if the DCC is greater.
Distance correlation (DC), defined in [36], is the complement of the DCC. It can be formulated as follows:
DXY=1−ρXY | (2.7) |
Therefore, the similarity between X and Y is closer if their DC is smaller.
Two vectors with more common edges should have stronger similarity. Suppose P1 and P2 represent two vectors; the common edge similarity (CES) is defined as follows:
$$\text{similarity}(P_1,P_2)=\frac{\text{number of shared edges}}{\text{number of cities}} \tag{2.8}$$
The CES similarity is between 0 and 1; only when it equals 1 are the two vectors identical.
The Jaccard index [37], or the Jaccard similarity coefficient, is widely used for comparing the similarity or difference between limited sample sets. Given two sets A and B, the Jaccard coefficient is defined as the ratio of the intersection to the union of A and B, namely,
$$J(A,B)=\frac{|A\cap B|}{|A\cup B|}=\frac{|A\cap B|}{|A|+|B|-|A\cap B|}. \tag{2.9}$$
The Jaccard index has value between 0 and 1. Specifically, when both A and B are empty sets, their Jaccard index is defined as 1.
Jaccard distance [38] is also a metric used to measure the difference between two sets. It is the complement index of Jaccard similarity coefficient. The formula of the Jaccard distance is as follows:
$$d_J(A,B)=1-J(A,B)=\frac{|A\,\Delta\,B|}{|A\cup B|}, \tag{2.10}$$
where the symmetric difference satisfies |AΔB|=|A∪B|−|A∩B|.
When the Jaccard distance is used to measure similarity, completely identical sets give a distance of 0, and completely different sets give a distance of 1.
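Eqs (2.9) and (2.10) translate directly to code. In this sketch, paths are represented by their sets of directed edges, which is one possible choice of representation, not necessarily the paper's.

```python
def jaccard_index(a, b):
    """Jaccard index J(A, B) = |A ∩ B| / |A ∪ B|, Eq (2.9)."""
    a, b = set(a), set(b)
    if not a and not b:      # both empty: defined as 1
        return 1.0
    return len(a & b) / len(a | b)

def jaccard_distance(a, b):
    """Jaccard distance d_J(A, B) = 1 - J(A, B), Eq (2.10)."""
    return 1.0 - jaccard_index(a, b)

def path_edges(path):
    """Directed edges of a Hamiltonian cycle, including the closing edge."""
    n = len(path)
    return {(path[i], path[(i + 1) % n]) for i in range(n)}

# Paths 1 and 2 from the text traverse the same cities in opposite
# directions, so their directed edge sets are disjoint.
p1 = [0, 6, 5, 3, 7, 8, 2, 4, 1]
p2 = [0, 1, 4, 2, 8, 7, 3, 5, 6]
print(jaccard_distance(path_edges(p1), path_edges(p2)))  # 1.0
```

The example reproduces the limitation discussed next: two representations of the same cycle can have the maximum possible Jaccard distance.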
In this section, we first propose a novel niching strategy for multi-solution optimization. Next, we propose a circular Jaccard distance as the similarity metric which plays an important role in our algorithm. Then, we introduce the 2-opt strategy which is used as the local search method. Finally, we summarize the whole multi-solution optimization algorithm.
We develop a novel niching method based on the "radius" idea used in the clearing strategy. After randomly initializing a population of paths, we select a set of paths from the population with a given ratio, which is regarded as the parent population. In each iteration, a batch of new individuals is generated following the rules of the base metaheuristic. Each new individual that has a high similarity to the previous one is discarded; otherwise, it is saved in the population pool as an offspring candidate. We calculate the fitness of the parent population together with the offspring population and sort them in ascending order. Then a number of good individuals are chosen from the head of the queue to form the new parent population. The strategy requires each individual in the next generation to keep a certain distance from the current generation. As shown in Figure 2, let 1, 2 and 3 (or 3' or 3") be the possible positions of an individual in the 1st, 2nd and 3rd generation. Position 2 is one step from 1, and 3 (or 3' or 3") is one step from 2, and so on. With high probability, the distance from 3 (or 3' or 3") to 1 is greater than the distance from 2 to 1. Hence diversity is maintained during the course of the search.
The procedure repeats until the stopping criteria are met, finally yielding the optimal set of individuals. The proposed niching method differs from previously proposed niching methods such as crowding, fitness sharing, and clearing, and it is easier to implement than clearing. We name it "distancing".
Algorithm 5 gives the pseudocode of the proposed distancing method, with ACO as the base metaheuristic.
The similarity metrics mentioned previously, such as the Jaccard distance, are used to promote diversity of the population during the iterations; similarity is also the key measure of the quality of the multiple solutions. A valid path in the TSP is always a Hamiltonian cycle, so if we apply plain vector similarity metrics to the MSTSP, the cyclic property is not fully exploited.
We take an example to reveal the limitation of the Jaccard distance. Suppose there are three paths: "0-6-5-3-7-8-2-4-1" (path 1), "0-1-4-2-8-7-3-5-6" (path 2) and "2-8-7-3-5-6-0-1-4" (path 3). The Jaccard distance between path 1 and path 2 equals 1, the distance between path 2 and path 3 also equals 1, and the distance between path 1 and path 3 equals 0.889. However, paths 1, 2 and 3 are all the same for the TSP. Therefore, we propose a novel metric named the Circular Jaccard Distance (CJD) to overcome this limitation.
The CJD respects the invariance of the path: paths transformed by circular shifting or flipping are regarded as the same as the original path. To calculate the CJD, one of the two paths is first transformed by all circular shifts and flips; then the Jaccard distance between each transformed path and the other path is calculated; finally, the CJD is the minimum of these distances.
For example, suppose the original path is "0-1-4-2-8-7-3-5-6" and the newly generated path is "6-5-3-8-2-4-1-0-7". By transforming the second path, one can obtain "6-5-3-8-2-4-1-0-7", "5-3-8-2-4-1-0-7-6", "3-8-2-4-1-0-7-6-5", "8-2-4-1-0-7-6-5-3", "2-4-1-0-7-6-5-3-8", "4-1-0-7-6-5-3-8-2", "1-0-7-6-5-3-8-2-4", "0-7-6-5-3-8-2-4-1", "7-6-5-3-8-2-4-1-0", "6-7-0-1-4-2-8-3-5", "5-6-7-0-1-4-2-8-3", "3-5-6-7-0-1-4-2-8", "8-3-5-6-7-0-1-4-2", "2-8-3-5-6-7-0-1-4", "4-2-8-3-5-6-7-0-1", "1-4-2-8-3-5-6-7-0", "0-1-4-2-8-3-5-6-7" and "7-0-1-4-2-8-3-5-6". As shown in Figure 3, Figure 3(a) is the original path, Figure 3(b) is a path obtained by clockwise or counter-clockwise rotation, and Figure 3(c) is a path obtained by flipping. The Jaccard distances between the transformed paths and the original path are 1, 0.78, 1, 0.67, 1, 0.78, 1, 1, 0.89, 1, 1, 1, 1, 0.89, 1, 1, 0.5 and 0.67, respectively, so the CJD is their minimum, 0.5.
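The CJD computation can be sketched as follows, under the assumption that paths are compared via the Jaccard distance of their directed consecutive-edge sets (closing edge excluded); the exact distance values depend on this representation choice and may differ from the numbers quoted in the text.

```python
def path_edges(path):
    """Directed edges between consecutive cities (closing edge excluded)."""
    return {(path[i], path[i + 1]) for i in range(len(path) - 1)}

def jaccard_distance(a, b):
    """Eq (2.10) on two non-empty edge sets."""
    return 1.0 - len(a & b) / len(a | b)

def circular_jaccard_distance(p1, p2):
    """Minimum Jaccard distance over all circular shifts and flips of p2."""
    edges1 = path_edges(p1)
    best = 1.0
    for q in (p2, p2[::-1]):                 # original and flipped copy
        for s in range(len(q)):              # all circular shifts
            shifted = q[s:] + q[:s]
            best = min(best, jaccard_distance(edges1, path_edges(shifted)))
    return best

# Rotated and flipped copies of a path are at CJD 0 from it.
p = [0, 1, 4, 2, 8, 7, 3, 5, 6]
print(circular_jaccard_distance(p, p[3:] + p[:3]))  # 0.0
print(circular_jaccard_distance(p, p[::-1]))        # 0.0
```

The invariance property is what matters for the distancing strategy: any representation of the same cycle is recognized as identical, while genuinely different cycles keep a positive distance.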
In optimization, n-opt is a simple local search method for solving combinatorial optimization problems [39]. An n-opt move deletes n edges of a tour and reconnects the path in some other possible way.
Take 2-opt as an example, as shown in Figure 4, where Figure 4(a) is the original path. In each 2-opt move, two edges of the solution are deleted and two new edges are added to obtain a new path; if the new path is better, it replaces the original. Specifically, since d(V1,V3)+d(V2,V4)>d(V1,V2)+d(V3,V4), the edges (V1,V3) and (V2,V4) are removed and the edges (V1,V2) and (V3,V4) are added instead, forming a new path, see Figure 4(b). Here d represents the distance. Again, since d(V1,V7)+d(V4,V5)>d(V1,V5)+d(V4,V7), the edges (V1,V7) and (V4,V5) are removed and the edges (V1,V5) and (V4,V7) are added instead, see Figure 4(c).
The 3-opt strategy is able to create more choices of new paths. However, its computational complexity is O(n^3), which is not affordable for most applications. Therefore, we use 2-opt as the main local search method in this paper.
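A minimal 2-opt local search for a symmetric cost matrix can be sketched as follows (a first-improvement variant, repeated until no improving move remains; helper names are ours).

```python
def tour_length(tour, cost):
    """Total length of a closed tour under cost matrix cost."""
    n = len(tour)
    return sum(cost[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_opt(tour, cost):
    """Improve a tour by reversing segments until no 2-opt move helps.

    A move deletes edges (tour[i-1], tour[i]) and (tour[j], tour[j+1]),
    then reconnects the cycle by reversing the segment tour[i..j].
    Assumes a symmetric cost matrix.
    """
    tour = list(tour)
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % n]
                # Length change if edges (a,b),(c,d) become (a,c),(b,d).
                delta = (cost[a][c] + cost[b][d]) - (cost[a][b] + cost[c][d])
                if delta < 0:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour
```

Each accepted move strictly shortens the tour, so the loop terminates at a 2-opt local optimum.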
In this paper, we propose a circular Jaccard distance based multi-solution optimization (CJD-MSO) algorithm to improve the performance for solving the traveling salesman problem.
First, we initialize the pheromone matrix, randomly place the ants in their initial cities, and choose each next city according to the transition probabilities, thus obtaining the first-generation parent population. In the iterative process, the 2-opt algorithm is used to improve solution quality, and the distancing strategy is used to maintain diversity. The circular Jaccard distance between each new individual and the latest individual path is calculated; if the distance is higher than a preset threshold, which means the two paths are significantly different, the new individual is retained; otherwise, it is discarded. The parent and offspring populations are mixed together, and a set of good individuals with higher fitness is chosen as the next generation of the parent population. Algorithm 6 shows the entire process of CJD-MSO.
Algorithm 6 The CJD-MSO algorithm
Input: The number of ants m, iteration bound, and the parameters of CJD-MSO (α, β and ρ)
Output: A set of optimal solutions
1: Initialize population and pheromone matrix
2: Randomly place each ant at a different starting point
3: For each ant k (k=1,2,…,m), choose the next city to visit according to the transition probability formula via the roulette method, until all ants have visited all the cities
4: Obtain the first-generation parent population through the 2-opt strategy
5: while the stopping criteria are not met do
6:   Randomly place each ant at a different starting point
7:   For each ant k (k=1,2,…,m), choose the next city to visit according to the transition probability formula via the roulette method, until all ants have visited all the cities
8:   For each new population, use the 2-opt algorithm to improve the quality of solutions
9:   Calculate the circular Jaccard distance between the new individual path and the latest individual path
10:  if the circular Jaccard distance is higher than a preset value then
11:    Save the new individual path
12:  else
13:    Discard the new individual path
14:  end if
15:  Obtain the offspring population, and update the pheromone matrix
16:  Mix the parent population with the offspring population
17:  Sort them and select good individuals from the queue head to form the next parent population
18:  Save the optimal paths obtained so far
19: end while
In this section, we first describe the benchmark MSTSPs on which we test our proposed algorithm. Next, we introduce the evaluation criteria. Then, we present our experimental results, including the performance of CJD-MSO and comparison details.
Ting Huang [21] proposed a benchmark set for MSTSPs in 2018, which contains 25 MSTSP instances, as shown in Table 1. The benchmark set classifies MSTSPs into three categories: simple MSTSPs (MSTSP1 to MSTSP6), geometry MSTSPs (MSTSP7 to MSTSP12), and composite MSTSPs (MSTSP13 to MSTSP25). The number of cities ranges from 9 to 66, and the number of optimal solutions ranges from 2 to 196. We test our algorithm on this benchmark set.
Name | City nums | Optimal solutions nums | Shortest path length | Name | City nums | Optimal solutions nums | Shortest path length |
MSTSP1 | 9 | 3 | 680 | MSTSP14 | 34 | 16 | 3575 |
MSTSP2 | 10 | 4 | 1265 | MSTSP15 | 22 | 72 | 9455 |
MSTSP3 | 10 | 13 | 832 | MSTSP16 | 33 | 64 | 8761 |
MSTSP4 | 11 | 4 | 803 | MSTSP17 | 35 | 10 | 9061 |
MSTSP5 | 12 | 2 | 754 | MSTSP18 | 39 | 20 | 23763 |
MSTSP6 | 12 | 4 | 845 | MSTSP19 | 42 | 20 | 14408 |
MSTSP7 | 10 | 56 | 130 | MSTSP20 | 45 | 20 | 10973 |
MSTSP8 | 12 | 110 | 1344 | MSTSP21 | 48 | 4 | 6767 |
MSTSP9 | 10 | 4 | 72 | MSTSP22 | 55 | 9 | 10442 |
MSTSP10 | 10 | 4 | 72 | MSTSP23 | 59 | 10 | 24451 |
MSTSP11 | 10 | 14 | 78 | MSTSP24 | 60 | 36 | 9614 |
MSTSP12 | 15 | 196 | 130 | MSTSP25 | 66 | 26 | 9521 |
MSTSP13 | 28 | 70 | 3055 |
We use two evaluation criteria: Fβ and the diversity indicator (DI) [21].
Fβ is an indicator calculated from the precision value P and the recall value R. The formula is as follows:
\[ F_\beta = \frac{(1+\beta^2)\cdot P\cdot R}{\beta^2\cdot P + R}, \tag{4.1} \]
where P denotes precision, the proportion of obtained solutions that are optimal, and R denotes recall, the proportion of known optimal solutions that are found. When β is set to 1, P and R have the same importance. We set β to 0.3 to attach more importance to precision.
The formulas for P and R are as follows:
\[ P = \frac{TP}{TP+FP}, \tag{4.2} \]
\[ R = \frac{TP}{TP+FN}. \tag{4.3} \]
The true positive count, TP, is the number of optimal solutions obtained by the algorithm; the false positive count, FP, is the number of non-optimal solutions obtained by the algorithm; and the false negative count, FN, is the number of known optimal solutions that the algorithm did not find.
P, R, and Fβ are real values between 0 and 1. If all solutions obtained by the algorithm are optimal, then P equals 1. If the algorithm finds all known optimal solutions, then R equals 1. Fβ equals 1 if both P and R are 1. A larger Fβ means better performance.
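A minimal sketch of Eqs (4.1)-(4.3), computed directly from the TP, FP, and FN counts (the zero guard for an undefined score is an added convention, not part of the paper):

```python
def f_beta(tp, fp, fn, beta=0.3):
    """F_beta from true positives, false positives, and false negatives,
    following Eqs (4.2), (4.3) and (4.1); returns 0 when no optimum is found."""
    if tp == 0:
        return 0.0
    p = tp / (tp + fp)   # precision, Eq (4.2)
    r = tp / (tp + fn)   # recall, Eq (4.3)
    return (1 + beta ** 2) * p * r / (beta ** 2 * p + r)  # Eq (4.1)
```

With β = 0.3 as in the experiments, perfect precision and recall give Fβ = 1, and when P = R the score reduces to that common value.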
The DI assesses the diversity of the obtained solutions. It is defined as the average, over the ground-truth solutions, of the maximum similarity between each ground-truth solution and the obtained solutions. The formula of DI is as follows:
\[ DI(P,S) = \frac{\sum_{i=1}^{|P|} \max_{j=1,\ldots,|S|} S(P_i, S_j)}{|P|}, \tag{4.4} \]
where P represents the known optimal solution set; S represents the solution set obtained by the algorithm; Pi represents the i-th solution in P; Sj represents the j-th solution in S; and S(Pi, Sj) represents their similarity. A larger DI means that more of the known optimal solutions have been found.
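Eq (4.4) can be computed directly once a similarity measure is chosen. The sketch below takes the similarity function as a parameter (in this paper it would be a path similarity such as one minus the circular Jaccard distance); the exact-match similarity in the usage note is only a toy illustration:

```python
def diversity_indicator(ground_truth, found, sim):
    """DI of Eq (4.4): average over the known optimal solutions of the
    maximum similarity to any solution the algorithm obtained."""
    if not found:
        return 0.0  # no solutions found, no diversity
    return sum(max(sim(p, s) for s in found)
               for p in ground_truth) / len(ground_truth)
```

For instance, with an exact-match similarity (1 if two paths are identical, else 0), finding one of two known optima yields DI = 0.5, and finding both yields DI = 1.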
In this section, we conduct a series of numerical experiments to evaluate the proposed algorithm. First, we test the performance of ACO in finding a single optimal solution for the MSTSP. Second, we compare the algorithmic performance of various similarity metrics, including PCC, DC, CES and JD. Third, we compare the performance with the circular Jaccard distance to that with JD. Finally, we compare CJD-MSO with NACS [23], NGA [21] and NMA [22], and report the values of Fβ and DI.
The parameter settings are as follows: α=1.0, β=2.0, ρ=0.5. The threshold value of the CJD is set to 0.4. All algorithms are implemented in Python and run on macOS Catalina.
We use the MSTSP instances to test the ACO algorithm. Although the MSTSP instances are designed for testing multi-solution optimization algorithms, they are also suitable for testing single-solution optimization algorithms.
Figure 5 presents the convergence curves of ACO. For simple and geometry MSTSPs, ACO converges to an optimal solution rapidly. Without a niching method, ACO can find only one optimum in a single run. For some composite MSTSPs, such as the one in Figure 5(c), ACO cannot find any optimum within 3000 iterations, which indicates that composite MSTSPs are genuinely hard problems.
Next, we improve the original ACO by adding the "distancing" strategy. This time, we find multiple distinct solutions across all three categories of MSTSPs: the simple MSTSPs, the geometry MSTSPs, and the composite MSTSPs, as shown in Figure 6.
We compare the Fβ and DI performance on the MSTSPs of four similarity metrics: PCC, DC, CES and JD.
As shown in Figure 7 and Table 2, all four similarity metrics work well on the simple MSTSPs. For geometry MSTSPs and composite MSTSPs, JD considerably outperforms the other three metrics.
NAME | PCC | DC | CES | JD | NAME | PCC | DC | CES | JD |
MSTSP1 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP14 | 0.813 | 0.905 | 0.766 | 0.905 |
MSTSP2 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP15 | 0.684 | 0.804 | 0.679 | 0.795 |
MSTSP3 | 0.835 | 0.835 | 0.993 | 0.835 | MSTSP16 | 0.629 | 0.748 | 0.586 | 0.760 |
MSTSP4 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP17 | 0.325 | 0.650 | 0.382 | 0.650 |
MSTSP5 | 1.000 | 1.000 | 0.907 | 1.000 | MSTSP18 | 0.000 | 0.000 | 0.224 | 0.000 |
MSTSP6 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.700 | 0.237 | 0.700 |
MSTSP7 | 0.751 | 0.690 | 0.879 | 0.690 | MSTSP20 | 0.160 | 0.130 | 0.172 | 0.520 |
MSTSP8 | 0.619 | 0.520 | 0.801 | 0.520 | MSTSP21 | 0.813 | 0.813 | 0.120 | 0.813 |
MSTSP9 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.000 | 0.000 |
MSTSP10 | 0.929 | 1.000 | 1.000 | 0.591 | MSTSP23 | 0.000 | 0.333 | 0.000 | 0.000 |
MSTSP11 | 0.941 | 0.982 | 0.967 | 0.983 | MSTSP24 | 0.188 | 0.207 | 0.118 | 0.313 |
MSTSP12 | 0.291 | 0.292 | 0.671 | 0.315 | MSTSP25 | 0.000 | 0.069 | 0.000 | 0.000 |
MSTSP13 | 0.743 | 0.775 | 0.743 | 0.950 | BRPS | 7 | 14 | 9 | 15 |
For MSTSPs with more optimal solutions or more cities, such as MSTSP18 and MSTSP22, the Fβ values drop to almost 0 for all similarity metrics, so we compare their DIs in Figure 8.
From Figure 8, one can observe that JD considerably outperforms PCC and DC, and is slightly better than CES. We therefore believe that JD is a good similarity metric for solving MSTSPs. Next, we compare the performance of CJD with that of JD. Table 3 gives the Fβ values of the algorithms with JD and CJD.
NAME | JD | CJD | NAME | JD | CJD |
MSTSP1 | 1.000 | 1.000 | MSTSP14 | 0.905 | 0.942 |
MSTSP2 | 1.000 | 1.000 | MSTSP15 | 0.795 | 0.823 |
MSTSP3 | 0.835 | 0.875 | MSTSP16 | 0.760 | 0.760 |
MSTSP4 | 1.000 | 1.000 | MSTSP17 | 0.650 | 0.710 |
MSTSP5 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.000 |
MSTSP6 | 1.000 | 1.000 | MSTSP19 | 0.700 | 0.642 |
MSTSP7 | 0.690 | 0.782 | MSTSP20 | 0.520 | 0.349 |
MSTSP8 | 0.520 | 0.752 | MSTSP21 | 0.813 | 0.813 |
MSTSP9 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 |
MSTSP10 | 0.591 | 1.000 | MSTSP23 | 0.000 | 0.175 |
MSTSP11 | 0.983 | 0.988 | MSTSP24 | 0.313 | 0.299 |
MSTSP12 | 0.315 | 0.547 | MSTSP25 | 0.000 | 0.062 |
MSTSP13 | 0.950 | 0.940 | BRPS | 12 | 19 |
From Figure 9 and Table 3, we can observe that CJD significantly outperforms JD, indicating that CJD facilitates maintaining the diversity of the population.
Finally, we perform experiments to compare CJD-MSO with NGA [21], NMA [22] and NACS [23]. Table 4 compares the Fβ values of these algorithms. The best Fβ for each MSTSP instance is marked in bold, and BRPS denotes the total number of bold items.
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.684 | 0.973 | 1.000 | 1.000 | MSTSP14 | 0.087 | 0.172 | 0.883 | 0.942 |
MSTSP2 | 0.804 | 0.959 | 1.000 | 1.000 | MSTSP15 | 0.004 | 0.416 | 0.732 | 0.823 |
MSTSP3 | 0.497 | 0.936 | 1.000 | 0.875 | MSTSP16 | 0.000 | 0.054 | 0.554 | 0.760 |
MSTSP4 | 0.724 | 0.932 | 1.000 | 1.000 | MSTSP17 | 0.000 | 0.044 | 0.605 | 0.710 |
MSTSP5 | 0.989 | 0.846 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.031 | 0.571 | 0.000 |
MSTSP6 | 0.643 | 0.877 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.007 | 0.168 | 0.642 |
MSTSP7 | 0.125 | 0.769 | 0.923 | 0.782 | MSTSP20 | 0.000 | 0.000 | 0.165 | 0.349 |
MSTSP8 | 0.137 | 0.578 | 0.772 | 0.752 | MSTSP21 | 0.012 | 0.000 | 0.023 | 0.813 |
MSTSP9 | 0.768 | 0.974 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.013 | 0.000 |
MSTSP10 | 0.813 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.000 | 0.000 | 0.016 | 0.175 |
MSTSP11 | 0.459 | 0.949 | 1.000 | 0.988 | MSTSP24 | 0.000 | 0.000 | 0.010 | 0.299 |
MSTSP12 | 0.090 | 0.331 | 0.535 | 0.547 | MSTSP25 | 0.000 | 0.000 | 0.002 | 0.062 |
MSTSP13 | 0.025 | 0.096 | 0.611 | 0.940 | BRPS | 0 | 0 | 13 | 19 |
We use the pseudo-color plot (Figure 10) to exhibit the comparison results. Only eight "+" marks exist, which indicates that CJD-MSO performs very well compared with the other multi-solution optimization algorithms.
We calculate the DI of CJD-MSO and compare it with the other algorithms in Table 5 and Figure 11. For simple MSTSPs and geometry MSTSPs, the DIs of CJD-MSO are stably good: seven MSTSPs have a DI value of 1, and the remaining ones are close to 1. From MSTSP19 to MSTSP25, CJD-MSO significantly outperforms the other algorithms as well.
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.788 | 0.980 | 1.000 | 1.000 | MSTSP14 | 0.876 | 0.844 | 0.981 | 0.988 |
MSTSP2 | 0.894 | 0.972 | 1.000 | 1.000 | MSTSP15 | 0.744 | 0.847 | 0.932 | 0.902 |
MSTSP3 | 0.757 | 0.957 | 1.000 | 0.916 | MSTSP16 | 0.680 | 0.783 | 0.926 | 0.918 |
MSTSP4 | 0.809 | 0.947 | 1.000 | 1.000 | MSTSP17 | 0.765 | 0.803 | 0.944 | 0.881 |
MSTSP5 | 0.983 | 0.916 | 1.000 | 1.000 | MSTSP18 | 0.671 | 0.704 | 0.932 | 0.737 |
MSTSP6 | 0.843 | 0.943 | 1.000 | 1.000 | MSTSP19 | 0.675 | 0.699 | 0.865 | 0.862 |
MSTSP7 | 0.566 | 0.869 | 0.945 | 0.865 | MSTSP20 | 0.745 | 0.671 | 0.865 | 0.826 |
MSTSP8 | 0.652 | 0.838 | 0.898 | 0.838 | MSTSP21 | 0.773 | 0.628 | 0.834 | 0.870 |
MSTSP9 | 0.820 | 0.975 | 1.000 | 1.000 | MSTSP22 | 0.713 | 0.409 | 0.816 | 0.874 |
MSTSP10 | 0.850 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.671 | 0.344 | 0.829 | 0.839 |
MSTSP11 | 0.758 | 0.963 | 1.000 | 0.990 | MSTSP24 | 0.724 | 0.319 | 0.811 | 0.897 |
MSTSP12 | 0.732 | 0.809 | 0.871 | 0.857 | MSTSP25 | 0.725 | 0.270 | 0.747 | 0.859 |
MSTSP13 | 0.752 | 0.792 | 0.922 | 0.983 |
The results of the proposed method are based on repeated experiments. Figure 12 gives the box plots of Fβ, DI, and the number of found solutions on the 25 MSTSPs.
In this article, we proposed a new multi-solution optimization algorithm named "CJD-MSO" to address MSTSPs. In the algorithm, we proposed a novel niching technique named the "distancing" method, together with a novel similarity metric named the "circular Jaccard distance", to maintain population diversity while searching for multiple optimal solutions. The "distancing" strategy, different from all niching methods that have emerged so far, is easy to implement, and the "circular Jaccard distance" plays an essential role in improving the performance of multi-solution optimization.
We also employed basic ACO as the base metaheuristic and the 2-opt strategy as the local search method in CJD-MSO. We placed greater emphasis on the niching strategy and the metric definition than on comparing many classic metaheuristics or their improved versions. Experimental results on 25 MSTSP instances showed that CJD-MSO significantly outperforms other multi-solution optimization algorithms in terms of Fβ and DI.
We believe that the "circular Jaccard distance" metric and the "distancing" strategy can be applied to other base metaheuristics for addressing other combinatorial optimization problems.
This work was supported by the Ministry of Education in China (MOE) under a project grant of Humanities and Social Sciences (No. 21YJAZH040).
The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.
Algorithm 1 Solving TSP using the ACO algorithm
Input: Number of ants m, pheromone factor α, heuristic function factor β, pheromone volatilization coefficient ρ, and maximum number of iterations
Output: An optimal solution
1: Initialize the ant colony parameters and the pheromone matrix
2: while the stopping criteria are not met do
3:  Randomly place each ant at a different starting point
4:  For each ant k (k=1,2,⋯,m), calculate the next vertex to be visited using the transition probability formula, until all ants have visited all the vertices
5:  Calculate the path length L traveled by each ant, and save the optimal solution (shortest path)
6:  Update the pheromone concentration on all paths
7: end while
Algorithm 2 Crowding
Input: Population NP
Output: A set of optimal solutions
1: Randomly generate the NP individuals and a set of CF values
2: while the stopping criteria are not met do
3:  Calculate the fitness of each individual
4:  Generate candidate individuals through the basic operations of the algorithm
5:  Randomly select CF individuals from the parents
6:  Compare each candidate individual with the closest of the CF individuals, and replace that closest individual if the candidate has better fitness
7: end while
Algorithm 3 Speciation
Input: Population and niching radius σ
Output: A set of optimal solutions
1: Initialize the population
2: Calculate the fitness of the population and sort the individuals by fitness value
3: while the population is not empty do
4:  Find the best individual (the seed) in the population
5:  Find all individuals whose distance to the seed is less than σ, and form a subpopulation with the seed
6:  Remove the subpopulation from the population
7: end while
Algorithm 4 K-means algorithm for niching
Input: Population NP and cluster number K
Output: K species
1: Initialize the population
2: Randomly select K individuals from the population as the centers
3: while the stopping criteria are not met do
4:  Calculate the distance between the other individuals in the population and these K centers
5:  Assign each of the other individuals to the nearest cluster according to distance
6:  Recalculate and update the centers
7: end while
Algorithm 5 The multi-solution optimization algorithm with distancing
Input: The number of ants, the iteration bound, α, β, and ρ
Output: A set of optimal solutions
1: Initialize the population and the pheromone matrix
2: Construct a path through pseudo-random selection rules to obtain the first-generation parent population
3: while the stopping criteria are not met do
4:  Discard new individuals that are too similar to the previous one, and form the offspring population from the remaining individuals
5:  Update the pheromone matrix
6:  Calculate the fitness of the parent population together with the offspring population, and sort them
7:  Select a set of good individuals from the head of the queue to form the new parent population
8: end while
Algorithm 6 The CJD-MSO algorithm |
Input: The number of ants m, iteration bound, and the parameters of CJD-MSO (α, β and ρ) Output: A set of optimal solutions 1: Initialize population and pheromone matrix 2: Randomly place each ant at a different starting point 3: For each ant k(k=1,2,…,m), calculate the next city to be visited according to the transition formula obtained by the roulette method, until all ants have visited all the cities 4: Obtain the first generation parent population through the 2-opt strategy 5: while not meet the stopping criteria do 6: Randomly place each ant at a different starting point 7: For each ant k(k=1,2,…,m), calculate the next city to be visited according to the transition probability formula obtained by the roulette method, until all ants have visited all the cities 8: For each new population, 2-opt algorithm is used to improve the quality of solutions 9: Calculating the circular Jaccard distance between the new individual path and latest individual path 10: if the similarity is higher than a preset value then 11: Save the new individual path 12: else 13: Discard the new individual path 14: end if 15: Obtain the offspring population, and update the pheromone matrix 16: Mix the parent population with the offspring population 17: Sort them and select some good individuals from the queue to form the next parent population 18: Save the optimal paths obtained so far 19: end while |
Name | City nums | Optimal solutions nums | Shortest path length | Name | City nums | Optimal solutions nums | Shortest path length |
MSTSP1 | 9 | 3 | 680 | MSTSP14 | 34 | 16 | 3575 |
MSTSP2 | 10 | 4 | 1265 | MSTSP15 | 22 | 72 | 9455 |
MSTSP3 | 10 | 13 | 832 | MSTSP16 | 33 | 64 | 8761 |
MSTSP4 | 11 | 4 | 803 | MSTSP17 | 35 | 10 | 9061 |
MSTSP5 | 12 | 2 | 754 | MSTSP18 | 39 | 20 | 23763 |
MSTSP6 | 12 | 4 | 845 | MSTSP19 | 42 | 20 | 14408 |
MSTSP7 | 10 | 56 | 130 | MSTSP20 | 45 | 20 | 10973 |
MSTSP8 | 12 | 110 | 1344 | MSTSP21 | 48 | 4 | 6767 |
MSTSP9 | 10 | 4 | 72 | MSTSP22 | 55 | 9 | 10442 |
MSTSP10 | 10 | 4 | 72 | MSTSP23 | 59 | 10 | 24451 |
MSTSP11 | 10 | 14 | 78 | MSTSP24 | 60 | 36 | 9614 |
MSTSP12 | 15 | 196 | 130 | MSTSP25 | 66 | 26 | 9521 |
MSTSP13 | 28 | 70 | 3055 |
NAME | PCC | DC | CES | JD | NAME | PCC | DC | CES | JD |
MSTSP1 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP14 | 0.813 | 0.905 | 0.766 | 0.905 |
MSTSP2 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP15 | 0.684 | 0.804 | 0.679 | 0.795 |
MSTSP3 | 0.835 | 0.835 | 0.993 | 0.835 | MSTSP16 | 0.629 | 0.748 | 0.586 | 0.760 |
MSTSP4 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP17 | 0.325 | 0.650 | 0.382 | 0.650 |
MSTSP5 | 1.000 | 1.000 | 0.907 | 1.000 | MSTSP18 | 0.000 | 0.000 | 0.224 | 0.000 |
MSTSP6 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.700 | 0.237 | 0.700 |
MSTSP7 | 0.751 | 0.690 | 0.879 | 0.690 | MSTSP20 | 0.160 | 0.130 | 0.172 | 0.520 |
MSTSP8 | 0.619 | 0.520 | 0.801 | 0.520 | MSTSP21 | 0.813 | 0.813 | 0.120 | 0.813 |
MSTSP9 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.000 | 0.000 |
MSTSP10 | 0.929 | 1.000 | 1.000 | 0.591 | MSTSP23 | 0.000 | 0.333 | 0.000 | 0.000 |
MSTSP11 | 0.941 | 0.982 | 0.967 | 0.983 | MSTSP24 | 0.188 | 0.207 | 0.118 | 0.313 |
MSTSP12 | 0.291 | 0.292 | 0.671 | 0.315 | MSTSP25 | 0.000 | 0.069 | 0.000 | 0.000 |
MSTSP13 | 0.743 | 0.775 | 0.743 | 0.950 | BRPS | 7 | 14 | 9 | 15 |
NAME | JD | CJD | NAME | JD | CJD |
MSTSP1 | 1.000 | 1.000 | MSTSP14 | 0.905 | 0.942 |
MSTSP2 | 1.000 | 1.000 | MSTSP15 | 0.795 | 0.823 |
MSTSP3 | 0.835 | 0.875 | MSTSP16 | 0.760 | 0.760 |
MSTSP4 | 1.000 | 1.000 | MSTSP17 | 0.650 | 0.710 |
MSTSP5 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.000 |
MSTSP6 | 1.000 | 1.000 | MSTSP19 | 0.700 | 0.642 |
MSTSP7 | 0.690 | 0.782 | MSTSP20 | 0.520 | 0.349 |
MSTSP8 | 0.520 | 0.752 | MSTSP21 | 0.813 | 0.813 |
MSTSP9 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 |
MSTSP10 | 0.591 | 1.000 | MSTSP23 | 0.000 | 0.175 |
MSTSP11 | 0.983 | 0.988 | MSTSP24 | 0.313 | 0.299 |
MSTSP12 | 0.315 | 0.547 | MSTSP25 | 0.000 | 0.062 |
MSTSP13 | 0.950 | 0.940 | BRPS | 12 | 19 |
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.684 | 0.973 | 1.000 | 1.000 | MSTSP14 | 0.087 | 0.172 | 0.883 | 0.942 |
MSTSP2 | 0.804 | 0.959 | 1.000 | 1.000 | MSTSP15 | 0.004 | 0.416 | 0.732 | 0.823 |
MSTSP3 | 0.497 | 0.936 | 1.000 | 0.875 | MSTSP16 | 0.000 | 0.054 | 0.554 | 0.760 |
MSTSP4 | 0.724 | 0.932 | 1.000 | 1.000 | MSTSP17 | 0.000 | 0.044 | 0.605 | 0.710 |
MSTSP5 | 0.989 | 0.846 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.031 | 0.571 | 0.000 |
MSTSP6 | 0.643 | 0.877 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.007 | 0.168 | 0.642 |
MSTSP7 | 0.125 | 0.769 | 0.923 | 0.782 | MSTSP20 | 0.000 | 0.000 | 0.165 | 0.349 |
MSTSP8 | 0.137 | 0.578 | 0.772 | 0.752 | MSTSP21 | 0.012 | 0.000 | 0.023 | 0.813 |
MSTSP9 | 0.768 | 0.974 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.013 | 0.000 |
MSTSP10 | 0.813 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.000 | 0.000 | 0.016 | 0.175 |
MSTSP11 | 0.459 | 0.949 | 1.000 | 0.988 | MSTSP24 | 0.000 | 0.000 | 0.010 | 0.299 |
MSTSP12 | 0.090 | 0.331 | 0.535 | 0.547 | MSTSP25 | 0.000 | 0.000 | 0.002 | 0.062 |
MSTSP13 | 0.025 | 0.096 | 0.611 | 0.940 | BRPS | 0 | 0 | 13 | 19 |
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.788 | 0.980 | 1.000 | 1.000 | MSTSP14 | 0.876 | 0.844 | 0.981 | 0.988 |
MSTSP2 | 0.894 | 0.972 | 1.000 | 1.000 | MSTSP15 | 0.744 | 0.847 | 0.932 | 0.902 |
MSTSP3 | 0.757 | 0.957 | 1.000 | 0.916 | MSTSP16 | 0.680 | 0.783 | 0.926 | 0.918 |
MSTSP4 | 0.809 | 0.947 | 1.000 | 1.000 | MSTSP17 | 0.765 | 0.803 | 0.944 | 0.881 |
MSTSP5 | 0.983 | 0.916 | 1.000 | 1.000 | MSTSP18 | 0.671 | 0.704 | 0.932 | 0.737 |
MSTSP6 | 0.843 | 0.943 | 1.000 | 1.000 | MSTSP19 | 0.675 | 0.699 | 0.865 | 0.862 |
MSTSP7 | 0.566 | 0.869 | 0.945 | 0.865 | MSTSP20 | 0.745 | 0.671 | 0.865 | 0.826 |
MSTSP8 | 0.652 | 0.838 | 0.898 | 0.838 | MSTSP21 | 0.773 | 0.628 | 0.834 | 0.870 |
MSTSP9 | 0.820 | 0.975 | 1.000 | 1.000 | MSTSP22 | 0.713 | 0.409 | 0.816 | 0.874 |
MSTSP10 | 0.850 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.671 | 0.344 | 0.829 | 0.839 |
MSTSP11 | 0.758 | 0.963 | 1.000 | 0.990 | MSTSP24 | 0.724 | 0.319 | 0.811 | 0.897 |
MSTSP12 | 0.732 | 0.809 | 0.871 | 0.857 | MSTSP25 | 0.725 | 0.270 | 0.747 | 0.859 |
MSTSP13 | 0.752 | 0.792 | 0.922 | 0.983 |
Algorithm 1 Solving TSP using the ACO algorithm |
Input: Number of ants m, pheromone factor α, heuristic function factor β, pheromone volatilization coefficient ρ, and maximum number of iterations Output: An optimal solution 1: Initialize ant colony parameters and pheromone matrix 2: while not meet the stopping criteria do 3: Randomly place each ant at a different starting point 4: For each ant k (k=1,2,⋯,m), calculate the next vertex to be visited using the transition probability formula, until all ants have visited all the vertexes 5: Calculate the path length L passed by each ant, and save the optimal solution (shortest path) 6: Update the pheromone concentration on all paths 7: end while |
Algorithm 2 Crowding |
Input: Population NP Output: A set of optimal solutions 1: Randomly generate individuals of NP and a set of CF values 2: while not meet the stopping criteria do 3: Calculate the fitness of each individual 4: Generate candidate individuals through the basic operation of the algorithm 5: Randomly select CF individuals from the parents 6: Compare the candidate individual with the closest individual among CF individuals, and replace the closest individual if the candidate individual has better fitness 7: end while |
Algorithm 3 Speciation |
Input: Population and niching radius σ Output: A set of optimal solutions 1: Initialize population 2: Calculate the fitness of the population and sort them according to the fitness value 3: while population is not empty do 4: Find the best individual seed in the population 5: Find all individuals whose distance to seed is less than σ, and form a subpopulation with seed 6: Remove the subpopulation from the population 7: end while |
Algorithm 4 K-means algorithm for niching |
Input: Population NP and clustering number K Output: K species 1: Initialize population 2: Randomly select K individuals from the population as the centers 3: while not meet the stopping criteria do 4: Calculate the distance between other individuals in the population and these K centers 5: Assign other individuals to the nearest cluster according to distance 6: Recalculate and update the centers in the population 7: end while |
Algorithm 5 The multi-solution optimization algorithm with distancing |
Input: The number of ants, iteration bound, α, β, and ρ Output: A set of optimal solutions 1: Initialize population and pheromone matrix 2: Construct a path through pseudo-random selection rules to obtain the first generation parent population 3: while not meet the stopping criteria do 4: Discard new individual that have a high similarity to the previous one, and form a new offspring population from the remaining individuals 5: Update the pheromone matrix 6: Calculate the fitness of the parent population together with the offspring population, and sort them 7: Select a set of good individuals from the queue head to form a new parent population 8: end while |
Algorithm 6 The CJD-MSO algorithm |
Input: The number of ants m, the iteration bound, and the CJD-MSO parameters α, β and ρ
Output: A set of optimal solutions
1: Initialize the population and the pheromone matrix
2: Randomly place each ant at a different starting point
3: For each ant k (k = 1, 2, …, m), select the next city to visit according to the transition probability formula and the roulette-wheel method, until all ants have visited all cities
4: Obtain the first-generation parent population through the 2-opt strategy
5: while the stopping criteria are not met do
6:   Randomly place each ant at a different starting point
7:   For each ant k (k = 1, 2, …, m), select the next city to visit according to the transition probability formula and the roulette-wheel method, until all ants have visited all cities
8:   Improve the quality of each new solution with the 2-opt algorithm
9:   Calculate the circular Jaccard distance between each new individual path and the latest individual path
10:   if the distance is higher than a preset value (the paths are sufficiently different) then
11:     Save the new individual path
12:   else
13:     Discard the new individual path
14:   end if
15:   Obtain the offspring population and update the pheromone matrix
16:   Mix the parent population with the offspring population
17:   Sort them and select good individuals from the head of the queue to form the next parent population
18:   Save the optimal paths obtained so far
19: end while
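Two building blocks of Algorithm 6 can be sketched in Python: the 2-opt local improvement and the circular Jaccard distance between tours. Reading the circular Jaccard distance as a Jaccard distance over the tours' undirected edge sets (which makes it invariant to rotation and traversal direction) is one plausible interpretation, not necessarily the paper's exact definition:

```python
def two_opt(tour, dist):
    """Repeatedly reverse a segment whenever that shortens the tour,
    until no improving 2-opt move remains (dist: symmetric city distance)."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Reversing tour[i+1..j] swaps edges (a,b),(c,d) for (a,c),(b,d)
                if dist(a, b) + dist(c, d) > dist(a, c) + dist(b, d) + 1e-12:
                    tour[i + 1:j + 1] = tour[i + 1:j + 1][::-1]
                    improved = True
    return tour

def tour_edges(tour):
    """Undirected edge set of a closed tour (rotation/direction invariant)."""
    n = len(tour)
    return {frozenset((tour[i], tour[(i + 1) % n])) for i in range(n)}

def circular_jaccard_distance(tour_a, tour_b):
    """1 - |Ea ∩ Eb| / |Ea ∪ Eb| over the two tours' undirected edge sets."""
    ea, eb = tour_edges(tour_a), tour_edges(tour_b)
    return 1.0 - len(ea & eb) / len(ea | eb)
```

Under this reading, `[0, 1, 2, 3]` and its rotation `[2, 3, 0, 1]` have distance 0, so the distancing step in Algorithm 6 would correctly treat them as the same solution.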
Name | Cities | Optimal solutions | Shortest path length | Name | Cities | Optimal solutions | Shortest path length |
MSTSP1 | 9 | 3 | 680 | MSTSP14 | 34 | 16 | 3575 |
MSTSP2 | 10 | 4 | 1265 | MSTSP15 | 22 | 72 | 9455 |
MSTSP3 | 10 | 13 | 832 | MSTSP16 | 33 | 64 | 8761 |
MSTSP4 | 11 | 4 | 803 | MSTSP17 | 35 | 10 | 9061 |
MSTSP5 | 12 | 2 | 754 | MSTSP18 | 39 | 20 | 23763 |
MSTSP6 | 12 | 4 | 845 | MSTSP19 | 42 | 20 | 14408 |
MSTSP7 | 10 | 56 | 130 | MSTSP20 | 45 | 20 | 10973 |
MSTSP8 | 12 | 110 | 1344 | MSTSP21 | 48 | 4 | 6767 |
MSTSP9 | 10 | 4 | 72 | MSTSP22 | 55 | 9 | 10442 |
MSTSP10 | 10 | 4 | 72 | MSTSP23 | 59 | 10 | 24451 |
MSTSP11 | 10 | 14 | 78 | MSTSP24 | 60 | 36 | 9614 |
MSTSP12 | 15 | 196 | 130 | MSTSP25 | 66 | 26 | 9521 |
MSTSP13 | 28 | 70 | 3055 |
NAME | PCC | DC | CES | JD | NAME | PCC | DC | CES | JD |
MSTSP1 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP14 | 0.813 | 0.905 | 0.766 | 0.905 |
MSTSP2 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP15 | 0.684 | 0.804 | 0.679 | 0.795 |
MSTSP3 | 0.835 | 0.835 | 0.993 | 0.835 | MSTSP16 | 0.629 | 0.748 | 0.586 | 0.760 |
MSTSP4 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP17 | 0.325 | 0.650 | 0.382 | 0.650 |
MSTSP5 | 1.000 | 1.000 | 0.907 | 1.000 | MSTSP18 | 0.000 | 0.000 | 0.224 | 0.000 |
MSTSP6 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.700 | 0.237 | 0.700 |
MSTSP7 | 0.751 | 0.690 | 0.879 | 0.690 | MSTSP20 | 0.160 | 0.130 | 0.172 | 0.520 |
MSTSP8 | 0.619 | 0.520 | 0.801 | 0.520 | MSTSP21 | 0.813 | 0.813 | 0.120 | 0.813 |
MSTSP9 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.000 | 0.000 |
MSTSP10 | 0.929 | 1.000 | 1.000 | 0.591 | MSTSP23 | 0.000 | 0.333 | 0.000 | 0.000 |
MSTSP11 | 0.941 | 0.982 | 0.967 | 0.983 | MSTSP24 | 0.188 | 0.207 | 0.118 | 0.313 |
MSTSP12 | 0.291 | 0.292 | 0.671 | 0.315 | MSTSP25 | 0.000 | 0.069 | 0.000 | 0.000 |
MSTSP13 | 0.743 | 0.775 | 0.743 | 0.950 | BRPS | 7 | 14 | 9 | 15 |
NAME | JD | CJD | NAME | JD | CJD |
MSTSP1 | 1.000 | 1.000 | MSTSP14 | 0.905 | 0.942 |
MSTSP2 | 1.000 | 1.000 | MSTSP15 | 0.795 | 0.823 |
MSTSP3 | 0.835 | 0.875 | MSTSP16 | 0.760 | 0.760 |
MSTSP4 | 1.000 | 1.000 | MSTSP17 | 0.650 | 0.710 |
MSTSP5 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.000 |
MSTSP6 | 1.000 | 1.000 | MSTSP19 | 0.700 | 0.642 |
MSTSP7 | 0.690 | 0.782 | MSTSP20 | 0.520 | 0.349 |
MSTSP8 | 0.520 | 0.752 | MSTSP21 | 0.813 | 0.813 |
MSTSP9 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 |
MSTSP10 | 0.591 | 1.000 | MSTSP23 | 0.000 | 0.175 |
MSTSP11 | 0.983 | 0.988 | MSTSP24 | 0.313 | 0.299 |
MSTSP12 | 0.315 | 0.547 | MSTSP25 | 0.000 | 0.062 |
MSTSP13 | 0.950 | 0.940 | BRPS | 12 | 19 |
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.684 | 0.973 | 1.000 | 1.000 | MSTSP14 | 0.087 | 0.172 | 0.883 | 0.942 |
MSTSP2 | 0.804 | 0.959 | 1.000 | 1.000 | MSTSP15 | 0.004 | 0.416 | 0.732 | 0.823 |
MSTSP3 | 0.497 | 0.936 | 1.000 | 0.875 | MSTSP16 | 0.000 | 0.054 | 0.554 | 0.760 |
MSTSP4 | 0.724 | 0.932 | 1.000 | 1.000 | MSTSP17 | 0.000 | 0.044 | 0.605 | 0.710 |
MSTSP5 | 0.989 | 0.846 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.031 | 0.571 | 0.000 |
MSTSP6 | 0.643 | 0.877 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.007 | 0.168 | 0.642 |
MSTSP7 | 0.125 | 0.769 | 0.923 | 0.782 | MSTSP20 | 0.000 | 0.000 | 0.165 | 0.349 |
MSTSP8 | 0.137 | 0.578 | 0.772 | 0.752 | MSTSP21 | 0.012 | 0.000 | 0.023 | 0.813 |
MSTSP9 | 0.768 | 0.974 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.013 | 0.000 |
MSTSP10 | 0.813 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.000 | 0.000 | 0.016 | 0.175 |
MSTSP11 | 0.459 | 0.949 | 1.000 | 0.988 | MSTSP24 | 0.000 | 0.000 | 0.010 | 0.299 |
MSTSP12 | 0.090 | 0.331 | 0.535 | 0.547 | MSTSP25 | 0.000 | 0.000 | 0.002 | 0.062 |
MSTSP13 | 0.025 | 0.096 | 0.611 | 0.940 | BRPS | 0 | 0 | 13 | 19 |
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.788 | 0.980 | 1.000 | 1.000 | MSTSP14 | 0.876 | 0.844 | 0.981 | 0.988 |
MSTSP2 | 0.894 | 0.972 | 1.000 | 1.000 | MSTSP15 | 0.744 | 0.847 | 0.932 | 0.902 |
MSTSP3 | 0.757 | 0.957 | 1.000 | 0.916 | MSTSP16 | 0.680 | 0.783 | 0.926 | 0.918 |
MSTSP4 | 0.809 | 0.947 | 1.000 | 1.000 | MSTSP17 | 0.765 | 0.803 | 0.944 | 0.881 |
MSTSP5 | 0.983 | 0.916 | 1.000 | 1.000 | MSTSP18 | 0.671 | 0.704 | 0.932 | 0.737 |
MSTSP6 | 0.843 | 0.943 | 1.000 | 1.000 | MSTSP19 | 0.675 | 0.699 | 0.865 | 0.862 |
MSTSP7 | 0.566 | 0.869 | 0.945 | 0.865 | MSTSP20 | 0.745 | 0.671 | 0.865 | 0.826 |
MSTSP8 | 0.652 | 0.838 | 0.898 | 0.838 | MSTSP21 | 0.773 | 0.628 | 0.834 | 0.870 |
MSTSP9 | 0.820 | 0.975 | 1.000 | 1.000 | MSTSP22 | 0.713 | 0.409 | 0.816 | 0.874 |
MSTSP10 | 0.850 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.671 | 0.344 | 0.829 | 0.839 |
MSTSP11 | 0.758 | 0.963 | 1.000 | 0.990 | MSTSP24 | 0.724 | 0.319 | 0.811 | 0.897 |
MSTSP12 | 0.732 | 0.809 | 0.871 | 0.857 | MSTSP25 | 0.725 | 0.270 | 0.747 | 0.859 |
MSTSP13 | 0.752 | 0.792 | 0.922 | 0.983 |