
The traveling salesman problem (TSP) is one of the most intensively studied combinatorial optimization problems in graph theory and operations research. The TSP is believed to have been first studied by Hamilton and Kirkman [1]. Since then, a wide variety of exact and approximation algorithms have been proposed.
Given a set of nodes (cities) and the pairwise costs (or travel distances), the TSP asks for the cheapest cycle that visits all the cities and returns to the starting point. Since the TSP is NP-hard, every known exact algorithm has exponential worst-case complexity, and no efficient (polynomial-time) algorithm is known to date.
The TSP naturally finds many applications in transportation and logistics, such as arranging school bus routes to pick up children [2], scheduling stacker cranes in warehouses [3], and delivering ordered meals to customers [4]. Slightly modified, the TSP also applies to problems that do not look like path-finding problems at first sight. For example, in DNA sequencing [5], nodes represent DNA fragments, and the distance cost is defined as a similarity measure between fragments.
Various methods have been proposed to solve the TSP, such as brute-force enumeration, dynamic programming (DP) [6], and the Hopfield neural network (HNN) [7]. Brute-force enumeration, the simplest method, has time complexity $ O(n!) $. DP decomposes the problem into overlapping subproblems, which yields a much smaller time complexity of $ O(n^22^n) $ [6]. As an intelligent method, the HNN [7] operates on a permutation matrix that is mapped to a legal solution of the TSP, and minimizes an energy function. The drawback of this method is that it may suffer from slow convergence and low accuracy when used to solve the TSP [8].
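The complexity gap between brute-force enumeration and DP can be illustrated with a small sketch. The function names and the 4-city cost matrix below are ours, for illustration only; the DP follows the classic Held-Karp subset recursion that underlies the $ O(n^22^n) $ bound.

```python
from itertools import permutations

def tsp_brute_force(cost):
    """Brute-force enumeration: try all (n-1)! tours from city 0; O(n!)."""
    n = len(cost)
    best = float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        best = min(best, sum(cost[tour[i]][tour[(i + 1) % n]] for i in range(n)))
    return best

def tsp_held_karp(cost):
    """Dynamic programming over city subsets (Held-Karp); O(n^2 * 2^n)."""
    n = len(cost)
    # dp[(S, j)]: cheapest path from city 0 through subset S of {1..n-1}, ending at j
    dp = {(1 << (j - 1), j): cost[0][j] for j in range(1, n)}
    for S in range(1, 1 << (n - 1)):           # subsets in increasing order
        for j in range(1, n):
            bit = 1 << (j - 1)
            if not (S & bit) or S == bit:
                continue
            prev = S ^ bit                     # S without the final city j
            dp[(S, j)] = min(dp[(prev, k)] + cost[k][j]
                             for k in range(1, n) if prev & (1 << (k - 1)))
    full = (1 << (n - 1)) - 1
    return min(dp[(full, j)] + cost[j][0] for j in range(1, n))

cost = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
```

On this instance both methods agree on the optimal tour length of 80, but for $ n $ beyond roughly 20 cities only the DP remains feasible, and beyond roughly 25-30 cities neither does.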
Using swarm intelligence [9,10] to address the TSP has become a new trend in the past decades. Many metaheuristics have been reported to excel at solving the TSP, such as the genetic algorithm (GA) [11,12], particle swarm optimization (PSO) [13,14], and ant colony optimization (ACO) [15]. ACO was proposed by Marco Dorigo in 1992 and is inspired by the food-searching behavior of ant colonies.
ACO is one of the most frequently studied algorithms for solving the TSP. Fevrier Valdez et al. [16] implemented the elitist ant system (EAS) and the rank-based ant system (ASrank); Gao Shupeng et al. [17] proposed a pheromone initialization strategy; Zhang Yuru et al. [18] developed an improved 2-opt and hybrid algorithm based on ACO; Dewantoro et al. [19] used the tabu search algorithm as the local search in ACO.
In recent years, as progress on better methods for the TSP slowed, some researchers shifted their interest to generalized problems, such as the vehicle routing problem (VRP) and the multiple traveling salesman problem (MTSP). In the MTSP, multiple salesmen each visit a certain number of cities once and only once and then return to their original cities, minimizing the total cost. The MTSP is closely related to optimization problems such as task assignment [20].
Optimization problems with multiple solutions widely exist in the real world, and the TSP is no exception. In the TSP, the salesman prefers to have several optimal solutions at hand in case some of them become unavailable for some reason. Ting Huang et al. [21] proposed a benchmark to study the performance of algorithms for the multi-solution TSP (MSTSP), though few algorithms had been proposed at that time. NMA [22] integrates niching strategies with the memetic algorithm to address the MSTSP. NGA [21] groups related individuals and performs the GA operations, including crossover and mutation, within each group. NACS [23] is based on ACO, in which ants are guided to search different areas through multiple pheromone matrices.
Since ACO is among the most successful metaheuristics reported for the TSP, we also use ACO in this paper as the base optimization method for the MSTSP. We use the 2-opt strategy to improve the local search ability: it provides an effective search operation while requiring less computation than 3-opt or higher. We propose a new distance metric named "circular Jaccard distance" to replace other frequently used distance metrics, which brings a considerable improvement in optimization performance. The circular Jaccard distance is defined as the minimal distance between two paths over various transformations such as flips and circular shifts. We test the proposed method on the benchmark of 25 MSTSP instances, with two evaluation criteria designed for the MSTSP. Experimental results indicate that the proposed method substantially improves the performance of multi-solution optimization.
The rest of the article is organized as follows. Section 2 introduces the TSP, ACO, and how to solve the TSP using ACO. In Section 3, we propose a novel niching strategy and add the 2-opt algorithm and the circular Jaccard distance to it, yielding the CJD-MSO algorithm. Section 4 describes the MSTSP benchmarks and evaluates CJD-MSO experimentally. Section 5 concludes the paper.
In this section, we first introduce TSP and the ACO algorithm. Then we introduce how to solve the TSP using ACO algorithm. Finally, we introduce the basic multi-solution optimization algorithm for MSTSP.
Let $ G = (V, E) $ be a graph with vertex set $ V $ and edge set $ E = \{(x, y)|x, y\in V\} $. Each edge $ e\in E $ is assigned a cost $ c_{e} $. If the graph is undirected, then the cost of $ (x, y) $ is equal to the cost of $ (y, x) $ for all $ (x, y)\, {\in}\, E $; the equality need not hold for a directed graph.
Without loss of generality, we suppose graph $ G $ is a complete graph. Let the set of vertices be $ V = \{v_{1}, v_{2}, ..., v_{n}\} $. The cost matrix is given by $ C = (c_{ij})_{n\times n} $, where the cost of the edge from $ v_{i} $ to $ v_{j} $ is denoted $ c_{ij} $.
In the context of the traveling salesman problem, the vertices correspond to cities and the edges correspond to the paths between those cities. Let $ {H} $ be the set of all Hamiltonian cycles in $ G $, i.e., cycles that visit each vertex exactly once except for the starting point. The traveling salesman problem (TSP) is to find a Hamiltonian cycle $ h\, {\in}\, {H} $ such that the total cost of the cycle is minimized.
The TSP may be formulated as an integer linear programming (ILP) problem [24]. Let
$ x_{ij} = \begin{cases} 1 & \text{the path goes from city } i \text{ to city } j \\ 0 & \text{otherwise.} \end{cases} $
For $ i = 0, 1, \cdots, n $, define $ u_i $ as an artificial variable. Then TSP can be rewritten as the following ILP problem:
$ \begin{array}{ll} \min & \sum_{i=0}^{n}\sum_{j=0,\,j\neq i}^{n} c_{ij}x_{ij} \\ \text{s.t.} & 0\le x_{ij}\le 1, \quad i,j=0,\cdots,n \\ & u_i\in\mathbb{Z}, \quad i=0,\cdots,n \\ & \sum_{i=0,\,i\neq j}^{n} x_{ij}=1, \quad j=0,\cdots,n \\ & \sum_{j=0,\,j\neq i}^{n} x_{ij}=1, \quad i=0,\cdots,n \\ & u_i-u_j+nx_{ij}\le n-1, \quad 1\le i\neq j\le n \end{array} $

(2.1)
The first set of equalities requires that each city is entered from exactly one other city. The second set of equalities requires that each city is departed toward exactly one other city. The last constraints, with the artificial variables $ u_i $, ensure that the solution is a single Hamiltonian cycle. The ILP can be relaxed and solved as a linear program (LP) using the simplex method. Other exact solution methods include the cutting plane method [25] and branch-and-cut [26].
The exact solution methods suffer from high computational complexity. Therefore, some metaheuristics such as ACO and GA were suggested for addressing the TSP. Ant colony optimization algorithm (ACO) is a swarm intelligence optimization algorithm, which is inspired by the foraging behavior of ants [15].
When ants search for food, they wander around the nest randomly. If an ant finds food, it carries the food back to the nest and lays down pheromone along the road. Other ants nearby can perceive the pheromone and follow the path, thus increasing the pheromone density. At the same time, the pheromone constantly evaporates: the more time an ant needs to travel along a path, the more the pheromone on it evaporates. Hence the pheromone density becomes higher on the shorter path than on the longer one. Other ants tend to take the short path and lay down new pheromone. The pheromone attracts more and more ants to the shorter paths, and eventually the colony finds the shortest path.
Figure 1 illustrates two scenarios of how ants choose paths through pheromones. Figure 1(a) contains two paths of the same length; it can be observed that most ants chose the upper path. The main reason is that once the upper path is first taken by some ant, that ant deposits pheromone on it, which attracts other ants to choose the upper path. As the number of ants choosing the upper path increases, the pheromone density on it becomes higher than on the lower path, so even more ants tend to choose the upper path.
Figure 1(b) shows that most ants prefer the shorter path. Since the accumulation rate of pheromone on the short path is relatively high, the higher density of pheromone on the short path encourages more ants to choose the short path between the nest and the food.
ACO is a popular metaheuristic for addressing the TSP [27]. Next, we show how to use ACO to solve the TSP. At the beginning, $ m $ ants are randomly placed in $ n $ cities, and the initial pheromone values on all edges between cities are equal. Each ant $ k $ ($ k = 1, 2, \cdots, m $) calculates the transition probability according to Eq (2.2) before choosing the next city.
$ P_k(i,j) = \begin{cases} \dfrac{[\tau(i,j)]^{\alpha}\,[\eta(i,j)]^{\beta}}{\sum_{u\in J_k(i)}[\tau(i,u)]^{\alpha}\,[\eta(i,u)]^{\beta}} & j\in J_k(i) \\ 0 & \text{otherwise,} \end{cases} $

(2.2)
where $ P_k(i, j) $ is the probability that ant $ k $ moves from city $ i $ to city $ j $; $ \alpha $ controls the influence of the pheromone; $ \beta $ controls the influence of the heuristic information; $ \tau(i, j) $ is the pheromone on edge ($ i $, $ j $); $ \eta(i, j) $ is the heuristic factor for moving from city $ i $ to city $ j $; and $ J_{k}(i) $ is the set of candidate cities that ant $ k $ is allowed to visit next.
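Eq (2.2) can be transcribed almost directly into code. The sketch below is ours; the function name, the candidate-set argument, and the default parameter values are illustrative assumptions (a common choice for the heuristic is $ \eta(i,j) = 1/c_{ij} $).

```python
def transition_probabilities(i, allowed, tau, eta, alpha=1.0, beta=2.0):
    """Eq (2.2): probability that an ant at city i moves to each candidate j.

    tau, eta -- pheromone and heuristic matrices (e.g. eta[i][j] = 1/c_ij);
    allowed  -- the candidate set J_k(i) of cities not yet visited."""
    weights = {j: (tau[i][j] ** alpha) * (eta[i][j] ** beta) for j in allowed}
    total = sum(weights.values())
    return {j: w / total for j, w in weights.items()}
```

In a full ACO implementation the returned distribution is sampled (e.g. by roulette-wheel selection) to pick the ant's next city.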
The set $ R^k $ records the vertices that ant $ k $ has visited. When all $ m $ ants have completed the tour and returned to their departure vertices, the total cost of the $ n $ edges traveled by each ant is calculated. Then, the minimum cost is saved, and the pheromone on each edge is updated. During the iteration, the amount of pheromone remaining on a path after evaporation is calculated as follows:
$ \tau(i,j) = (1-\rho)\cdot\tau(i,j) + \sum\limits_{k=1}^{m}\Delta\tau_k(i,j), $

(2.3)
where $ \rho $, between 0 and 1, is the pheromone evaporation (volatilization) coefficient, and $ \Delta\tau_k(i, j) $ is the pheromone released by ant $ k $ during this tour, defined as:
$ \Delta\tau_k(i,j) = \begin{cases} (C_k)^{-1} & (i,j)\in R^k \\ 0 & \text{otherwise,} \end{cases} $

(2.4)
where $ C_k $ represents the tour length of ant $ k $.
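Eqs (2.3) and (2.4) together give the following update step. This is a sketch with names of our choosing; it assumes a symmetric TSP, so each deposit is mirrored across the matrix diagonal.

```python
def update_pheromone(tau, tours, lengths, rho=0.5):
    """Eqs (2.3)-(2.4): evaporate all pheromone by factor (1 - rho), then let
    each ant k deposit 1/C_k on every edge of its tour R^k."""
    n = len(tau)
    # evaporation on every edge
    for i in range(n):
        for j in range(n):
            tau[i][j] *= (1.0 - rho)
    # deposit along each ant's tour (including the closing edge)
    for tour, C_k in zip(tours, lengths):
        for a in range(len(tour)):
            i, j = tour[a], tour[(a + 1) % len(tour)]
            tau[i][j] += 1.0 / C_k
            tau[j][i] += 1.0 / C_k  # symmetric TSP: keep the matrix symmetric
    return tau
```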
Algorithm 1 presents the pseudocode.
Algorithm 1 Solving TSP using the ACO algorithm
Input: Number of ants $ m $, pheromone factor $ \alpha $, heuristic function factor $ \beta $, pheromone volatilization coefficient $ \rho $, and maximum number of iterations
Output: An optimal solution
1: Initialize ant colony parameters and pheromone matrix
2: while not meet the stopping criteria do
3:   Randomly place each ant at a different starting point
4:   For each ant $ k $ ($ k = 1, 2, \cdots, m $), calculate the next vertex to be visited using the transition probability formula, until all ants have visited all the vertices
5:   Calculate the path length $ L $ passed by each ant, and save the optimal solution (shortest path)
6:   Update the pheromone concentration on all paths
7: end while
Most swarm intelligence optimization algorithms are developed to solve the single-solution optimization problems. It is necessary to introduce some strategies to make these algorithms suitable for solving multi-solution optimization problems.
The niching method is one of the most widely used strategies. Niching methods are a category of techniques that prevent the population from converging to a single optimum by maintaining multiple niches. It should be mentioned that some improved ACO versions and various other metaheuristics may have good mechanisms for exploration. Although good exploration ability promotes diversity of the solutions, it is preferable to use a niching method for maintaining multiple niches.
The classic niching methods were developed in the early 70s such as crowding method [28]. After that, various niching methods were developed including fitness sharing [29], clearing [30], RTS [31], speciation [32] and clustering [33,34].
Fitness sharing [29] is inspired by the sharing phenomenon in nature. The resources in a niche are limited, so individuals have to share them with other individuals occupying the same niche. When there are too many individuals in a niche, the fitness of each individual is reduced, thus reducing the number of individuals. Conversely, a small number of individuals in a niche may increase in number. Fitness is adjusted by the population size in a niche, which facilitates finding multiple optimal solutions.
Crowding [28] is a technique that maintains diversity during the course of iteration through careful population replacement. First, a number of individuals, determined by a crowding factor (CF), are selected from the parent generation. Then, each new individual replaces the most similar of these parent individuals. Since the population is initialized with an even distribution, this distribution can be maintained during the course of iteration. Algorithm 2 depicts the procedure of crowding.
Algorithm 2 Crowding
Input: Population $ NP $
Output: A set of optimal solutions
1: Randomly generate individuals of $ NP $ and a set of $ CF $ values
2: while not meet the stopping criteria do
3:   Calculate the fitness of each individual
4:   Generate candidate individuals through the basic operation of the algorithm
5:   Randomly select $ CF $ individuals from the parents
6:   Compare the candidate individual with the closest individual among the $ CF $ individuals, and replace the closest individual if the candidate has better fitness
7: end while
Clearing [30] is a strategy that is easy to implement. In each iteration, the current optimal individual in the population is selected as a center. Then, all individuals within the specified clearing radius are removed from the population. The procedure repeats until the population is empty, finally yielding a set of optimal solutions.
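The clearing loop just described can be sketched in a few lines. The function and its parameters are our illustration; `fitness` is assumed to be minimized, and `distance` is any metric on individuals (for tours, one of the metrics discussed below).

```python
def clearing(population, fitness, distance, radius):
    """Clearing niching: repeatedly take the best remaining individual as a
    niche center, then drop everything within `radius` of it, until the
    population is empty; the centers are the niche representatives."""
    remaining = sorted(population, key=fitness)  # minimization: best first
    centers = []
    while remaining:
        seed = remaining[0]
        centers.append(seed)
        remaining = [x for x in remaining if distance(seed, x) > radius]
    return centers
```

For instance, on the 1-D points `[0.0, 0.1, 1.0, 1.05, 2.0]` with fitness equal to the value itself and radius 0.2, clearing keeps one representative per cluster: `[0.0, 1.0, 2.0]`.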
The principle of speciation is to divide the population into several niches, and the subpopulation in each niche evolves in parallel. Algorithm 3 presents the pseudocode of speciation.
Algorithm 3 Speciation
Input: Population and niching radius $ \sigma $
Output: A set of optimal solutions
1: Initialize population
2: Calculate the fitness of the population and sort them according to the fitness value
3: while population is not empty do
4:   Find the best individual $ seed $ in the population
5:   Find all individuals whose distance to $ seed $ is less than $ \sigma $, and form a subpopulation with $ seed $
6:   Remove the subpopulation from the population
7: end while
The goal of clustering [33] is to group individuals into clusters such that individuals within a cluster are more similar to each other than to individuals in other clusters. The $ K $-means algorithm is a typical clustering algorithm used in niching methods, where $ K $ represents the number of clusters [34]. Algorithm 4 is the pseudocode of the $ K $-means algorithm for niching.
Algorithm 4 $ K $-means algorithm for niching
Input: Population $ NP $ and clustering number $ K $
Output: $ K $ species
1: Initialize population
2: Randomly select $ K $ individuals from the population as the centers
3: while not meet the stopping criteria do
4:   Calculate the distance between other individuals in the population and these $ K $ centers
5:   Assign other individuals to the nearest cluster according to distance
6:   Recalculate and update the centers in the population
7: end while
The similarity metric plays an essential role in multi-solution optimization. As shown in Algorithm 5, whether a new individual is discarded or not depends on the similarity between the new path and the old one. Next, we introduce several similarity metrics that may be used in multi-solution optimization.
Algorithm 5 The multi-solution optimization algorithm with distancing
Input: The number of ants, iteration bound, $ \alpha $, $ \beta $, and $ \rho $
Output: A set of optimal solutions
1: Initialize population and pheromone matrix
2: Construct a path through pseudo-random selection rules to obtain the first generation parent population
3: while not meet the stopping criteria do
4:   Discard new individuals that have a high similarity to the previous one, and form a new offspring population from the remaining individuals
5:   Update the pheromone matrix
6:   Calculate the fitness of the parent population together with the offspring population, and sort them
7:   Select a set of good individuals from the queue head to form a new parent population
8: end while
Pearson correlation coefficient (PCC) [35], or the bivariate correlation, is a widely used measure of the similarity between vectors. If there are two variables: $ X $ and $ Y $, PCC can be formulated as follows:
$ \rho(X,Y) = \frac{\sum_{i=1}^{n}(X_i-\mu_X)(Y_i-\mu_Y)}{\sqrt{\sum_{i=1}^{n}(X_i-\mu_X)^2}\,\sqrt{\sum_{i=1}^{n}(Y_i-\mu_Y)^2}}, $

(2.5)
where $ \mu_X $ is the mean of the $ n $-dimensional vector $ X $ and $ \mu_Y $ is the mean of $ Y $.
The greater the absolute value of the correlation coefficient, the stronger the correlation; a PCC of 0 indicates the weakest correlation.
Distance correlation coefficient (DCC) [36], or distance covariance, is a measure of dependence between two arbitrary vectors, not necessarily equal in dimension. The DCC is zero if and only if the vectors are independent. The formula for DCC is as follows:
$ \rho_{XY} = \frac{E[(X-E(X))(Y-E(Y))]}{\sqrt{D(X)}\,\sqrt{D(Y)}}, $

(2.6)
where $ E $ is the mathematical expectation and $ D $ is the variance. The similarity between $ X $ and $ Y $ is stronger if the DCC is greater.
Distance correlation (DC), defined in [36], is the complement of the DCC. It can be formulated as follows:
$ D_{XY} = 1 - \rho_{XY} $

(2.7)
Therefore, the similarity between $ X $ and $ Y $ is stronger if their DC is smaller.
Two paths with more common edges should have stronger similarity. Suppose $ P_1 $ and $ P_2 $ represent two paths; the common edge similarity (CES) is defined as follows:
$ \mathrm{similarity}(P_1,P_2) = \frac{\text{number of shared edges}}{\text{number of cities}} $

(2.8)
The CES lies between 0 and 1, and equals 1 only when the two paths are identical.
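Eq (2.8) can be sketched as follows. The helper names are ours; the sketch assumes edges are undirected and that the closing edge of the cycle is counted, so a tour on $ n $ cities has exactly $ n $ edges.

```python
def tour_edges(tour):
    """Undirected edge set of a cyclic tour, including the closing edge."""
    n = len(tour)
    return {frozenset((tour[i], tour[(i + 1) % n])) for i in range(n)}

def common_edge_similarity(p1, p2):
    """Eq (2.8): number of shared edges divided by the number of cities."""
    return len(tour_edges(p1) & tour_edges(p2)) / len(p1)
```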
The Jaccard index [37], or Jaccard similarity coefficient, is widely used for comparing the similarity or difference of finite sample sets. Given two sets $ A $ and $ B $, the Jaccard coefficient is defined as the ratio of the size of the intersection to the size of the union of $ A $ and $ B $, namely,
$ J(A,B) = \frac{|A\cap B|}{|A\cup B|} = \frac{|A\cap B|}{|A|+|B|-|A\cap B|}. $

(2.9)
The Jaccard index has value between 0 and 1. Specifically, when both $ A $ and $ B $ are empty sets, their Jaccard index is defined as 1.
The Jaccard distance [38] is also a metric used to measure the difference between two sets. It is the complement of the Jaccard similarity coefficient. The formula of the Jaccard distance is as follows:
$ d_J(A,B) = 1 - J(A,B) = \frac{|A\,\Delta\,B|}{|A\cup B|}, $

(2.10)
where the symmetric difference satisfies $ |A \Delta B| = |A \cup B| - |A \cap B| $.
When the Jaccard distance is used to measure similarity, completely identical sets give a distance of 0, and completely disjoint sets give a distance of 1.
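One natural way to apply Eq (2.10) to tours is to compare their edge sets; the sketch below is our illustration of this idea, assuming undirected edges including the closing edge of the cycle.

```python
def tour_edges(tour):
    """Undirected edge set of a cyclic tour, including the closing edge."""
    n = len(tour)
    return {frozenset((tour[i], tour[(i + 1) % n])) for i in range(n)}

def jaccard_distance_edges(p1, p2):
    """Eq (2.10) on the undirected edge sets of two tours:
    1 - |A intersect B| / |A union B|."""
    a, b = tour_edges(p1), tour_edges(p2)
    return 1.0 - len(a & b) / len(a | b)
```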
In this section, we first propose a novel niching strategy for multi-solution optimization. Next, we propose a circular Jaccard distance as the similarity metric which plays an important role in our algorithm. Then, we introduce the 2-opt strategy which is used as the local search method. Finally, we summarize the whole multi-solution optimization algorithm.
We develop a novel niching method based on the "radius" idea used in the clearing strategy. After randomly initializing a population of paths, we select a set of paths from the population with a given ratio, which is regarded as the parent population. In each iteration, a batch of new individuals is generated following the rules of the base metaheuristic. Each new individual that has a high similarity to the previous one is discarded; otherwise, it is saved in the population pool as an offspring candidate. We calculate the fitness of the parent population together with the offspring population and sort them in ascending order. Then, a number of good individuals are chosen from the head of the queue to form a new parent population. The strategy requires that each individual in the next generation keeps a certain distance from the current generation. As shown in Figure 2, let 1, 2 and 3 (or 3' or 3") be the possible positions of an individual in the 1st, 2nd and 3rd generations. Position 2 is one step away from position 1, and position 3 (or 3' or 3") is one step away from position 2, and so on. With high probability, the distance from 3 (or 3' or 3") to 1 is greater than the distance from 2 to 1. Hence the strategy maintains diversity during the course of the search.
The procedure repeats until the stopping criteria are met, finally yielding the optimal set of individuals. The proposed niching method differs from previously proposed niching methods such as crowding, fitness sharing, and clearing, and it is easier to implement than clearing. We name it "distancing".
Algorithm 5 gives the pseudocode of the proposed distancing method, with ACO as the base metaheuristic.
The similarity metrics mentioned previously, such as the Jaccard distance, are used to promote diversity of the population during the course of iteration, and diversity is also the key measure of the quality of multiple solutions. A valid path in the TSP is always a Hamiltonian cycle. If we apply vector similarity metrics directly to the MSTSP, this cyclic property is not fully explored and utilized.
We take an example to reveal the limitation of the Jaccard distance. Suppose there are three paths: "0-6-5-3-7-8-2-4-1" (path 1), "0-1-4-2-8-7-3-5-6" (path 2) and "2-8-7-3-5-6-0-1-4" (path 3). The Jaccard distance between path 1 and path 2 equals 1, the Jaccard distance between path 2 and path 3 equals 1, and the Jaccard distance between path 1 and path 3 equals 0.889. However, paths 1, 2 and 3 all represent the same tour in the TSP. Therefore, we propose a novel metric named the circular Jaccard distance (CJD) to overcome this limitation.
The CJD respects the invariance of the path: paths transformed by circular shifting or flipping are regarded as the same as the original path. When calculating the CJD, first, one of the two paths is transformed by all circular shifts and flips. Then the Jaccard distance between each resulting pair is calculated. Finally, the CJD is the minimum of these Jaccard distances.
For example, suppose the original path is "0-1-4-2-8-7-3-5-6" and the newly generated path is "6-5-3-8-2-4-1-0-7". By transforming the second path, one can obtain "6-5-3-8-2-4-1-0-7", "5-3-8-2-4-1-0-7-6", "3-8-2-4-1-0-7-6-5", "8-2-4-1-0-7-6-5-3", "2-4-1-0-7-6-5-3-8", "4-1-0-7-6-5-3-8-2", "1-0-7-6-5-3-8-2-4", "0-7-6-5-3-8-2-4-1", "7-6-5-3-8-2-4-1-0", "6-7-0-1-4-2-8-3-5", "5-6-7-0-1-4-2-8-3", "3-5-6-7-0-1-4-2-8", "8-3-5-6-7-0-1-4-2", "2-8-3-5-6-7-0-1-4", "4-2-8-3-5-6-7-0-1", "1-4-2-8-3-5-6-7-0", "0-1-4-2-8-3-5-6-7" and "7-0-1-4-2-8-3-5-6". As shown in Figure 3, Figure 3(a) is the original path, Figure 3(b) is a path obtained by clockwise or counter-clockwise rotation, and Figure 3(c) is a path obtained by flipping. The Jaccard distances between the converted paths and the original path are 1, 0.78, 1, 0.67, 1, 0.78, 1, 1, 0.89, 1, 1, 1, 1, 0.89, 1, 1, 0.5 and 0.67, respectively, so the CJD is the minimum value, 0.5.
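The min-over-transformations idea can be sketched as follows. Since the text leaves the exact set representation behind the underlying Jaccard distance implicit, this sketch makes an assumption: each path is represented by its directed adjacency pairs without the closing edge, so that rotations and flips do change the set (with an undirected, closed edge set they would not), and the minimum over all transformations restores the cyclic invariance. It will therefore not reproduce the exact numbers above, but it does satisfy the defining property that a rotation or flip of a path is at distance 0 from it.

```python
def directed_edges(path):
    """Ordered adjacency pairs of an open path, without the closing edge.
    (This representation is our assumption; the paper leaves it implicit.)"""
    return {(path[i], path[i + 1]) for i in range(len(path) - 1)}

def jaccard_distance(p1, p2):
    """Eq (2.10) on the directed adjacency-pair sets of two paths."""
    a, b = directed_edges(p1), directed_edges(p2)
    return 1.0 - len(a & b) / len(a | b)

def circular_jaccard_distance(p1, p2):
    """CJD: minimum Jaccard distance between p1 and every rotation
    and flip of p2."""
    best = 1.0
    for q in (p2, p2[::-1]):              # original orientation and flip
        for shift in range(len(q)):       # all circular shifts
            rotated = q[shift:] + q[:shift]
            best = min(best, jaccard_distance(p1, rotated))
    return best
```

With this definition, any rotation or reversal of a path is at CJD 0 from the original, which is exactly the invariance the CJD is designed to capture.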
In optimization, $ n $-opt is a simple local search method for solving combinatorial optimization problems [39]. The $ n $-opt treatment involves deleting $ n $ edges in a tour, reconstructing the path in some other possible ways.
Take 2-opt as an example. As shown in Figure 4, Figure 4(a) is the original path. In each 2-opt move, two edges of the solution are deleted and two new edges are added to obtain a new path; if the new path is better, it replaces the original one. Specifically, since $ d(V_1, V_3) + d(V_2, V_4) > d(V_1, V_2) + d(V_3, V_4) $, the edges $ {(V_1, V_3)} $ and $ {(V_2, V_4)} $ are removed, and the edges $ {(V_1, V_2)} $ and $ {(V_3, V_4)} $ are added instead, forming the new path in Figure 4(b). Here $ d $ represents the distance. Again, since $ d(V_1, V_7) + d(V_4, V_5) > d (V_1, V_5) + d (V_4, V_7) $, the edges $ {(V_1, V_7)} $ and $ {(V_4, V_5)} $ are removed, and the edges $ {(V_1, V_5)} $ and $ {(V_4, V_7)} $ are added instead, forming the path in Figure 4(c).
The 3-opt strategy can create more candidate paths; however, its computational complexity is $ O(n^3) $, which is not affordable for most applications. Therefore, we use 2-opt as the main local search method in this paper.
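The 2-opt move described above amounts to reversing the segment between the two removed edges. The following is a minimal sketch of the classic first-improvement 2-opt loop; the function names and the stopping convention are ours.

```python
def tour_length(tour, cost):
    """Total length of a closed tour, including the edge back to the start."""
    n = len(tour)
    return sum(cost[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_opt(tour, cost):
    """2-opt local search: whenever replacing edges (a,b) and (c,d) by (a,c)
    and (b,d) shortens the tour, reverse the segment between them; repeat
    until no improving move exists (a 2-opt local optimum)."""
    tour = list(tour)
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # these two edges share the city tour[0]
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # delete (a,b) and (c,d); add (a,c) and (b,d)
                if cost[a][c] + cost[b][d] < cost[a][b] + cost[c][d]:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour
```

Each full scan costs $ O(n^2) $, which is why 2-opt is preferred here over the $ O(n^3) $ scan of 3-opt.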
In this paper, we propose a circular Jaccard distance based multi-solution optimization (CJD-MSO) algorithm to improve the performance for solving the traveling salesman problem.
First, we initialize the pheromone matrix, randomly place the ants in their initial cities, and choose each next city according to the transition probabilities, thus obtaining the first-generation parent population. In the iterative process, the 2-opt algorithm is used to improve the quality of the solutions, and the distancing strategy is used to maintain diversity. The circular Jaccard distance between each new individual path and the latest individual path is calculated. If this distance is larger than a preset threshold, which means that the two paths are significantly different, the new individual is retained; otherwise, it is discarded. The parent population and the offspring population are mixed together, and a set of good individuals with higher fitness is chosen as the next generation of the parent population. Algorithm 6 shows the entire process of CJD-MSO.
Algorithm 6 The CJD-MSO algorithm
Input: The number of ants $ m $, iteration bound, and the parameters of CJD-MSO ($ \alpha $, $ \beta $ and $ \rho $)
Output: A set of optimal solutions
1: Initialize population and pheromone matrix
2: Randomly place each ant at a different starting point
3: For each ant $ k (k = 1, 2, …, m) $, calculate the next city to be visited according to the transition probability formula and the roulette method, until all ants have visited all the cities
4: Obtain the first generation parent population through the 2-opt strategy
5: while not meet the stopping criteria do
6:   Randomly place each ant at a different starting point
7:   For each ant $ k (k = 1, 2, …, m) $, calculate the next city to be visited according to the transition probability formula and the roulette method, until all ants have visited all the cities
8:   For each new individual, apply the 2-opt algorithm to improve the quality of the solution
9:   Calculate the circular Jaccard distance between the new individual path and the latest individual path
10:  if the circular Jaccard distance is larger than a preset threshold then
11:    Save the new individual path
12:  else
13:    Discard the new individual path
14:  end if
15:  Obtain the offspring population, and update the pheromone matrix
16:  Mix the parent population with the offspring population
17:  Sort them and select some good individuals from the head of the queue to form the next parent population
18:  Save the optimal paths obtained so far
19: end while
In this section, first we describe the benchmark TSPs on which we shall test our proposed algorithm. Next, we introduce the evaluation criteria we shall use. Then, we exhibit our experimental results including the performance of CJD-MSO and some comparison details.
Ting Huang et al. [21] proposed a benchmark set for MSTSPs in 2018, which contains 25 MSTSP instances, as shown in Table 1. In the benchmark set, MSTSPs are classified into three categories: simple MSTSPs (MSTSP1 to MSTSP6), geometry MSTSPs (MSTSP7 to MSTSP12), and composite MSTSPs (MSTSP13 to MSTSP25). The number of cities ranges from 9 to 66, and the number of optimal solutions ranges from 2 to 196. We test our algorithm on this benchmark set.
Name | City nums | Optimal solutions nums | Shortest path length | Name | City nums | Optimal solutions nums | Shortest path length |
MSTSP1 | 9 | 3 | 680 | MSTSP14 | 34 | 16 | 3575 |
MSTSP2 | 10 | 4 | 1265 | MSTSP15 | 22 | 72 | 9455 |
MSTSP3 | 10 | 13 | 832 | MSTSP16 | 33 | 64 | 8761 |
MSTSP4 | 11 | 4 | 803 | MSTSP17 | 35 | 10 | 9061 |
MSTSP5 | 12 | 2 | 754 | MSTSP18 | 39 | 20 | 23763 |
MSTSP6 | 12 | 4 | 845 | MSTSP19 | 42 | 20 | 14408 |
MSTSP7 | 10 | 56 | 130 | MSTSP20 | 45 | 20 | 10973 |
MSTSP8 | 12 | 110 | 1344 | MSTSP21 | 48 | 4 | 6767 |
MSTSP9 | 10 | 4 | 72 | MSTSP22 | 55 | 9 | 10442 |
MSTSP10 | 10 | 4 | 72 | MSTSP23 | 59 | 10 | 24451 |
MSTSP11 | 10 | 14 | 78 | MSTSP24 | 60 | 36 | 9614 |
MSTSP12 | 15 | 196 | 130 | MSTSP25 | 66 | 26 | 9521 |
MSTSP13 | 28 | 70 | 3055 |
We use two evaluation criteria: $ F_\beta $ and the diversity indicator (DI) [21].
$ F_\beta $ is an indicator calculated from the precision value $ P $ and the recall value $ R $. The formula is as follows:
$$ F_\beta = \frac{(1+\beta^2)\cdot P\cdot R}{\beta^2\cdot P + R}, \tag{4.1} $$
where $ P $ (precision) is the proportion of solutions obtained by the algorithm that are optimal, and $ R $ (recall) is the proportion of known optimal solutions that the algorithm finds. When $ \beta = 1 $, $ P $ and $ R $ are equally important; we set $ \beta = 0.3 $ to attach more importance to precision.
The formulas of $ P $ and $ R $ are as follows:
$$ P = \frac{TP}{TP+FP}, \tag{4.2} $$

$$ R = \frac{TP}{TP+FN}. \tag{4.3} $$
The true positive count, $ TP $, is the number of optimal solutions obtained by the algorithm; the false positive count, $ FP $, is the number of non-optimal solutions obtained by the algorithm; the false negative count, $ FN $, is the number of known optimal solutions that the algorithm did not find.
$ P $, $ R $, and $ F_\beta $ are real values between 0 and 1. If all solutions obtained by the algorithm are optimal, then $ P $ equals 1; if the algorithm finds all known optimal solutions, then $ R $ equals 1; and $ F_\beta $ equals 1 when both $ P $ and $ R $ are 1. A larger $ F_\beta $ means better performance.
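Putting Eqs. (4.1)-(4.3) together, the indicator can be computed directly from the counts $ TP $, $ FP $, and $ FN $; a minimal sketch (the function name is ours):

```python
def f_beta(tp: int, fp: int, fn: int, beta: float = 0.3) -> float:
    """F_beta indicator from true positive, false positive and false negative counts."""
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)   # Eq. (4.2)
    recall = tp / (tp + fn)      # Eq. (4.3)
    # Eq. (4.1): beta < 1 weights precision more heavily than recall.
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

# Example: 3 of 4 reported solutions are optimal, out of 5 known optima.
print(round(f_beta(tp=3, fp=1, fn=2), 3))  # → 0.735
```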
The $ DI $ assesses the diversity of the multiple solutions. It is defined as the average, over the ground-truth solutions, of the maximum similarity to any obtained solution. The formula of $ DI $ is as follows:
$$ DI(P, S) = \frac{\sum_{i=1}^{|P|} \max_{j=1,\ldots,|S|} S(P_i, S_j)}{|P|}, \tag{4.4} $$
where $ P $ is the known optimal solution set; $ S $ is the solution set obtained by the algorithm; $ P_i $ is the $ i $-th solution in $ P $; $ S_j $ is the $ j $-th solution in $ S $; and $ S(P_i, S_j) $ is the similarity between them. A larger $ DI $ means that more of the known optimal solutions have been found.
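Eq. (4.4) can be computed as follows; the set-based similarity used in the example is a toy stand-in for the paper's metric:

```python
def diversity_indicator(known, found, similarity):
    """Eq. (4.4): average, over known optima, of the best similarity achieved."""
    return sum(max(similarity(p, s) for s in found) for p in known) / len(known)

# Toy similarity on sets (illustrative only, not the paper's metric).
jacc = lambda a, b: len(set(a) & set(b)) / len(set(a) | set(b))
known = [(1, 2, 3), (4, 5, 6)]
found = [(1, 2, 3)]
print(diversity_indicator(known, found, jacc))  # → 0.5
```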
In this section, we conduct a series of numerical experiments to evaluate the proposed algorithm. First, we test the performance of ACO in finding a single optimal solution for the MSTSP. Second, we compare the algorithmic performance under various similarity metrics, including PCC, DC, CES, and JD. Third, we compare the performance with the circular Jaccard distance (CJD) to that with JD. Finally, we compare CJD-MSO with NACS [23], NGA [21], and NMA [22], and report the values of $ F_\beta $ and $ DI $.
The parameter settings are as follows: $ \alpha = 1.0 $, $ \beta = 2.0 $, $ \rho = 0.5 $. The threshold value of the CJD is set to 0.4. All algorithms are implemented in Python and run on macOS Catalina.
We use the MSTSP instances to test the ACO algorithm. Although the MSTSP instances are designed for testing multi-solution optimization algorithms, they are also suitable for testing single-solution optimization algorithms.
Figure 5 presents the convergence curves of ACO. For simple and geometry MSTSPs, ACO converges to an optimal solution rapidly, although without a niching method it can find only one optimum in a single run. For some composite MSTSPs, such as the one in Figure 5(c), ACO cannot find any optimum within 3000 iterations, which indicates that composite MSTSPs are genuinely hard problems.
Next, we improve the original ACO by adding the "distancing" strategy. With this change, we find several distinct solutions for MSTSPs in all three categories (simple, geometry, and composite), as shown in Figure 6.
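The "distancing" step can be read as an archive filter: a candidate survives only if its distance to every retained path exceeds a threshold (Section 4 sets the CJD threshold to 0.4). A minimal sketch, with the function name and any concrete distance function being ours rather than the paper's:

```python
def distancing_filter(candidates, archive, distance, min_dist=0.4):
    """Keep only candidates far enough (in some dissimilarity metric) from every
    archived path; accepted candidates join the archive immediately.

    'distance' is any dissimilarity in [0, 1] (the paper uses the circular
    Jaccard distance); min_dist mirrors the 0.4 threshold used in Section 4.
    """
    kept = []
    for cand in candidates:
        if all(distance(cand, a) > min_dist for a in archive):
            kept.append(cand)
            archive.append(cand)
    return kept
```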
We compare the performance, in terms of $ F_\beta $ and $ DI $, of various similarity metrics (PCC, DC, CES, and JD) on the MSTSP benchmark.
As shown in Figure 7 and Table 2, all four similarity metrics work well on the simple MSTSPs. For geometry and composite MSTSPs, JD considerably outperforms the other three metrics.
NAME | PCC | DC | CES | JD | NAME | PCC | DC | CES | JD |
MSTSP1 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP14 | 0.813 | 0.905 | 0.766 | 0.905 |
MSTSP2 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP15 | 0.684 | 0.804 | 0.679 | 0.795 |
MSTSP3 | 0.835 | 0.835 | 0.993 | 0.835 | MSTSP16 | 0.629 | 0.748 | 0.586 | 0.760 |
MSTSP4 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP17 | 0.325 | 0.650 | 0.382 | 0.650 |
MSTSP5 | 1.000 | 1.000 | 0.907 | 1.000 | MSTSP18 | 0.000 | 0.000 | 0.224 | 0.000 |
MSTSP6 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.700 | 0.237 | 0.700 |
MSTSP7 | 0.751 | 0.690 | 0.879 | 0.690 | MSTSP20 | 0.160 | 0.130 | 0.172 | 0.520 |
MSTSP8 | 0.619 | 0.520 | 0.801 | 0.520 | MSTSP21 | 0.813 | 0.813 | 0.120 | 0.813 |
MSTSP9 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.000 | 0.000 |
MSTSP10 | 0.929 | 1.000 | 1.000 | 0.591 | MSTSP23 | 0.000 | 0.333 | 0.000 | 0.000 |
MSTSP11 | 0.941 | 0.982 | 0.967 | 0.983 | MSTSP24 | 0.188 | 0.207 | 0.118 | 0.313 |
MSTSP12 | 0.291 | 0.292 | 0.671 | 0.315 | MSTSP25 | 0.000 | 0.069 | 0.000 | 0.000 |
MSTSP13 | 0.743 | 0.775 | 0.743 | 0.950 | BRPS | 7 | 14 | 9 | 15 |
For MSTSPs with more optimal solutions or more cities, such as MSTSP18 and MSTSP22, the $ F_\beta $ values drop almost to 0 for all similarity metrics, so we compare their $ DI $s in Figure 8 instead.
From Figure 8, one can observe that JD considerably outperforms PCC and DC, and is slightly better than CES. We therefore believe that JD is a good similarity metric for solving MSTSPs. Next, we compare the performance of CJD with that of JD. Table 3 gives the $ F_\beta $ values of the algorithm with JD and with CJD.
NAME | JD | CJD | NAME | JD | CJD |
MSTSP1 | 1.000 | 1.000 | MSTSP14 | 0.905 | 0.942 |
MSTSP2 | 1.000 | 1.000 | MSTSP15 | 0.795 | 0.823 |
MSTSP3 | 0.835 | 0.875 | MSTSP16 | 0.760 | 0.760 |
MSTSP4 | 1.000 | 1.000 | MSTSP17 | 0.650 | 0.710 |
MSTSP5 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.000 |
MSTSP6 | 1.000 | 1.000 | MSTSP19 | 0.700 | 0.642 |
MSTSP7 | 0.690 | 0.782 | MSTSP20 | 0.520 | 0.349 |
MSTSP8 | 0.520 | 0.752 | MSTSP21 | 0.813 | 0.813 |
MSTSP9 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 |
MSTSP10 | 0.591 | 1.000 | MSTSP23 | 0.000 | 0.175 |
MSTSP11 | 0.983 | 0.988 | MSTSP24 | 0.313 | 0.299 |
MSTSP12 | 0.315 | 0.547 | MSTSP25 | 0.000 | 0.062 |
MSTSP13 | 0.950 | 0.940 | BRPS | 12 | 19 |
From Figure 9 and Table 3, we observe that CJD significantly outperforms JD, indicating that CJD helps maintain the diversity of the population.
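The intuition behind the circular variant can be illustrated on city sequences. The sketch below is our simplified reading, not the paper's exact definitions: JD here compares two tours position by position, while CJD takes the minimum JD over all rotations and reversals of one tour, so different sequence representations of the same cycle are recognized as identical:

```python
def jaccard_distance(a, b):
    """Jaccard distance on the sets of (position, city) pairs of two equal-length tours."""
    same = sum(x == y for x, y in zip(a, b))       # |intersection|
    return 1 - same / (2 * len(a) - same)          # |union| = 2n - same

def circular_jaccard_distance(a, b):
    """Minimum Jaccard distance over all rotations and reversals of tour b."""
    variants = [b[i:] + b[:i] for i in range(len(b))]
    variants += [v[::-1] for v in variants]
    return min(jaccard_distance(a, v) for v in variants)
```

With this reading, the tour (1, 2, 3, 0) is maximally far from (0, 1, 2, 3) under plain JD, yet CJD correctly reports distance 0 because both describe the same cycle.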
Finally, we perform experiments to compare CJD-MSO with NGA [21], NMA [22], and NACS [23]. Table 4 compares the $ F_\beta $ values of these algorithms. The best $ F_\beta $ for each MSTSP instance is marked in bold, and BRPS denotes the total number of bold items per algorithm.
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.684 | 0.973 | 1.000 | 1.000 | MSTSP14 | 0.087 | 0.172 | 0.883 | 0.942 |
MSTSP2 | 0.804 | 0.959 | 1.000 | 1.000 | MSTSP15 | 0.004 | 0.416 | 0.732 | 0.823 |
MSTSP3 | 0.497 | 0.936 | 1.000 | 0.875 | MSTSP16 | 0.000 | 0.054 | 0.554 | 0.760 |
MSTSP4 | 0.724 | 0.932 | 1.000 | 1.000 | MSTSP17 | 0.000 | 0.044 | 0.605 | 0.710 |
MSTSP5 | 0.989 | 0.846 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.031 | 0.571 | 0.000 |
MSTSP6 | 0.643 | 0.877 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.007 | 0.168 | 0.642 |
MSTSP7 | 0.125 | 0.769 | 0.923 | 0.782 | MSTSP20 | 0.000 | 0.000 | 0.165 | 0.349 |
MSTSP8 | 0.137 | 0.578 | 0.772 | 0.752 | MSTSP21 | 0.012 | 0.000 | 0.023 | 0.813 |
MSTSP9 | 0.768 | 0.974 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.013 | 0.000 |
MSTSP10 | 0.813 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.000 | 0.000 | 0.016 | 0.175 |
MSTSP11 | 0.459 | 0.949 | 1.000 | 0.988 | MSTSP24 | 0.000 | 0.000 | 0.010 | 0.299 |
MSTSP12 | 0.090 | 0.331 | 0.535 | 0.547 | MSTSP25 | 0.000 | 0.000 | 0.002 | 0.062 |
MSTSP13 | 0.025 | 0.096 | 0.611 | 0.940 | BRPS | 0 | 0 | 13 | 19 |
We use the pseudo-color plot in Figure 10 to exhibit the comparison results. Only eight "+" marks appear, which indicates that CJD-MSO performs very well compared with the other multi-solution optimization algorithms.
We calculate the $ DI $ of CJD-MSO and compare it with the other algorithms in Table 5 and Figure 11. For simple and geometry MSTSPs, the $ DI $s of CJD-MSO are consistently good: seven MSTSPs reach a $ DI $ of 1, and the remaining ones are close to 1. From MSTSP19 to MSTSP25, CJD-MSO significantly outperforms the other algorithms as well.
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.788 | 0.980 | 1.000 | 1.000 | MSTSP14 | 0.876 | 0.844 | 0.981 | 0.988 |
MSTSP2 | 0.894 | 0.972 | 1.000 | 1.000 | MSTSP15 | 0.744 | 0.847 | 0.932 | 0.902 |
MSTSP3 | 0.757 | 0.957 | 1.000 | 0.916 | MSTSP16 | 0.680 | 0.783 | 0.926 | 0.918 |
MSTSP4 | 0.809 | 0.947 | 1.000 | 1.000 | MSTSP17 | 0.765 | 0.803 | 0.944 | 0.881 |
MSTSP5 | 0.983 | 0.916 | 1.000 | 1.000 | MSTSP18 | 0.671 | 0.704 | 0.932 | 0.737 |
MSTSP6 | 0.843 | 0.943 | 1.000 | 1.000 | MSTSP19 | 0.675 | 0.699 | 0.865 | 0.862 |
MSTSP7 | 0.566 | 0.869 | 0.945 | 0.865 | MSTSP20 | 0.745 | 0.671 | 0.865 | 0.826 |
MSTSP8 | 0.652 | 0.838 | 0.898 | 0.838 | MSTSP21 | 0.773 | 0.628 | 0.834 | 0.870 |
MSTSP9 | 0.820 | 0.975 | 1.000 | 1.000 | MSTSP22 | 0.713 | 0.409 | 0.816 | 0.874 |
MSTSP10 | 0.850 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.671 | 0.344 | 0.829 | 0.839 |
MSTSP11 | 0.758 | 0.963 | 1.000 | 0.990 | MSTSP24 | 0.724 | 0.319 | 0.811 | 0.897 |
MSTSP12 | 0.732 | 0.809 | 0.871 | 0.857 | MSTSP25 | 0.725 | 0.270 | 0.747 | 0.859 |
MSTSP13 | 0.752 | 0.792 | 0.922 | 0.983 |
The results of the proposed method are based on repeated experiments. Figure 12 gives the box plots of $ F_\beta $, $ DI $, and the number of found solutions on the 25 MSTSPs.
In this article, we proposed a new multi-solution optimization algorithm named CJD-MSO to address MSTSPs. In the algorithm, we proposed a novel niching technique, the "distancing" method, together with a novel similarity metric, the "circular Jaccard distance", to maintain population diversity while searching for multiple optimal solutions. The "distancing" strategy, unlike the niching methods proposed so far, is easy to implement, and the circular Jaccard distance plays an essential role in improving the performance of multi-solution optimization.
We employed basic ACO as the base metaheuristic and the 2-opt strategy as the local search method in CJD-MSO, laying greater emphasis on the niching strategy and the metric definition than on comparing many classic metaheuristics or their improved versions. Experimental results on the 25 MSTSP instances showed that CJD-MSO significantly outperforms the other multi-solution optimization algorithms in terms of $ F_\beta $ and $ DI $.
We believe that the circular Jaccard distance metric and the "distancing" strategy can also be applied to other base metaheuristics for addressing other combinatorial optimization problems.
This work was supported by the Ministry of Education of China (MOE) under a Humanities and Social Sciences project grant (No. 21YJAZH040).
The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.
Algorithm 1 Solving TSP using the ACO algorithm
Input: number of ants $ m $, pheromone factor $ \alpha $, heuristic function factor $ \beta $, pheromone volatilization coefficient $ \rho $, and maximum number of iterations
Output: an optimal solution
1: Initialize ant colony parameters and the pheromone matrix
2: while the stopping criteria are not met do
3: Randomly place each ant at a different starting point
4: For each ant $ k $ ($ k = 1, 2, \cdots, m $), select the next vertex to visit using the transition probability formula, until all ants have visited all the vertexes
5: Calculate the path length $ L $ travelled by each ant, and save the optimal solution (shortest path)
6: Update the pheromone concentration on all paths
7: end while
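Step 4 of Algorithm 1 is the standard ACO transition rule: from the current vertex, an unvisited vertex $ j $ is chosen with probability proportional to $ \tau_{ij}^{\alpha}\,\eta_{ij}^{\beta} $ via roulette-wheel sampling. A minimal sketch (names are ours):

```python
import random

def next_city(current, unvisited, tau, eta, alpha=1.0, beta=2.0, rng=random):
    """Roulette-wheel selection: P(j) ∝ tau[current][j]**alpha * eta[current][j]**beta."""
    weights = [tau[current][j] ** alpha * eta[current][j] ** beta for j in unvisited]
    total = sum(weights)
    r = rng.uniform(0, total)
    acc = 0.0
    for j, w in zip(unvisited, weights):
        acc += w
        if acc >= r:
            return j
    return unvisited[-1]  # numerical guard against rounding drift
```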
Algorithm 2 Crowding
Input: population $ NP $
Output: a set of optimal solutions
1: Randomly generate the individuals of $ NP $ and a set of $ CF $ values
2: while the stopping criteria are not met do
3: Calculate the fitness of each individual
4: Generate candidate individuals through the basic operations of the algorithm
5: Randomly select $ CF $ individuals from the parents
6: Compare each candidate with the closest of the $ CF $ individuals, and replace the closest individual if the candidate has better fitness
7: end while
Algorithm 3 Speciation |
Input: Population and niching radius $ \sigma $
Output: A set of optimal solutions
1: Initialize the population
2: Calculate the fitness of the population and sort the individuals by fitness value
3: while the population is not empty do
4:     Find the best individual $ seed $ in the population
5:     Find all individuals whose distance to $ seed $ is less than $ \sigma $, and form a subpopulation with $ seed $
6:     Remove the subpopulation from the population
7: end while
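Algorithm 3 amounts to repeatedly peeling off a ball of radius $ \sigma $ around the best remaining individual. A compact sketch, with an illustrative 1-D population, distance, and fitness function (all invented for the example, not from the paper):

```python
def speciate(pop, fitness, sigma, dist):
    """Partition a population into species: repeatedly take the best remaining
    individual as a seed and group everything within radius sigma around it."""
    remaining = sorted(pop, key=fitness, reverse=True)   # best individual first
    species = []
    while remaining:
        seed = remaining[0]                              # best remaining = seed
        species.append([x for x in remaining if dist(x, seed) < sigma])
        remaining = [x for x in remaining if dist(x, seed) >= sigma]
    return species

# Demo on a 1-D bimodal landscape with peaks at x = -1 and x = +1
f = lambda x: -min((x - 1.0) ** 2, (x + 1.0) ** 2)
pop = [0.9, 1.1, -1.05, -0.95, 0.0]
species = speciate(pop, f, sigma=0.5, dist=lambda a, b: abs(a - b))
```

On this population, the individuals around each peak end up in separate species, and the poor individual at the origin forms a singleton species of its own.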
Algorithm 4 $ K $-means algorithm for niching |
Input: Population size $ NP $ and number of clusters $ K $
Output: $ K $ species
1: Initialize the population
2: Randomly select $ K $ individuals from the population as the centers
3: while the stopping criteria are not met do
4:     Calculate the distance between every other individual in the population and these $ K $ centers
5:     Assign each individual to the nearest cluster according to the distance
6:     Recalculate and update the centers of the clusters
7: end while
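Algorithm 4 is plain k-means applied to the population. A minimal 1-D sketch (the genotype encoding and sample data are illustrative assumptions):

```python
import random

def kmeans_niches(pop, k, iters=20, seed=0):
    """Cluster a 1-D population into K species with plain k-means."""
    rng = random.Random(seed)
    centers = rng.sample(pop, k)                 # step 2: random initial centers
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in pop:                            # steps 4-5: assign to nearest center
            idx = min(range(k), key=lambda c: abs(x - centers[c]))
            clusters[idx].append(x)
        # step 6: recompute each center (keep the old one if a cluster is empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters, centers

# Demo: two natural niches around -1 and +1
clusters, centers = kmeans_niches([0.9, 1.1, 1.0, -1.0, -0.9, -1.1], k=2)
```

On this toy population the two recovered centers sit at the niche means $ -1 $ and $ +1 $, regardless of which individuals are drawn as the initial centers.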
Algorithm 5 The multi-solution optimization algorithm with distancing |
Input: The number of ants, the iteration bound, $ \alpha $, $ \beta $, and $ \rho $
Output: A set of optimal solutions
1: Initialize the population and the pheromone matrix
2: Construct paths through the pseudo-random selection rule to obtain the first-generation parent population
3: while the stopping criteria are not met do
4:     Discard new individuals that have a high similarity to previous ones, and form a new offspring population from the remaining individuals
5:     Update the pheromone matrix
6:     Calculate the fitness of the parent population together with the offspring population, and sort them
7:     Select a set of good individuals from the head of the queue to form a new parent population
8: end while
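The key operation in Algorithm 5 (line 4) is a similarity-based filter on new individuals. One plausible sketch uses the Jaccard distance between the undirected edge sets of two tours as the dissimilarity measure; the measure, the helper names, and the threshold are design choices made for illustration, not taken from the paper.

```python
def jaccard_distance(tour_a, tour_b):
    """Jaccard distance between the undirected edge sets of two TSP tours
    (0.0 for identical cycles, 1.0 for cycles sharing no edge)."""
    def edges(tour):
        n = len(tour)
        return {frozenset((tour[i], tour[(i + 1) % n])) for i in range(n)}
    ea, eb = edges(tour_a), edges(tour_b)
    return 1.0 - len(ea & eb) / len(ea | eb)

def distancing_filter(offspring, retained, min_dist):
    """Keep only offspring sufficiently dissimilar from every retained tour."""
    return [child for child in offspring
            if all(jaccard_distance(child, t) >= min_dist for t in retained)]
```

Edge sets are invariant to the starting city, so a mere rotation of an already-retained tour is recognized as a duplicate and discarded, while structurally different cycles pass the filter.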
Algorithm 6 The CJD-MSO algorithm |
Input: The number of ants $ m $, the iteration bound, and the CJD-MSO parameters ($ \alpha $, $ \beta $, and $ \rho $)
Output: A set of optimal solutions
1: Initialize the population and the pheromone matrix
2: Randomly place each ant at a different starting point
3: For each ant $ k $ ($ k = 1, 2, \cdots, m $), select the next city to visit according to the transition probability formula via roulette-wheel selection, until all ants have visited all the cities
4: Obtain the first-generation parent population through the 2-opt strategy
5: while the stopping criteria are not met do
6:     Randomly place each ant at a different starting point
7:     For each ant $ k $ ($ k = 1, 2, \cdots, m $), select the next city to visit according to the transition probability formula via roulette-wheel selection, until all ants have visited all the cities
8:     For each new path, apply the 2-opt algorithm to improve the quality of the solution
9:     Calculate the circular Jaccard distance between each new individual path and the latest individual path
10:     if the distance exceeds the preset threshold (i.e., the similarity is low) then
11:         Save the new individual path
12:     else
13:         Discard the new individual path
14:     end if
15:     Obtain the offspring population, and update the pheromone matrix
16:     Merge the parent population with the offspring population
17:     Sort the merged population and select good individuals from the head of the queue to form the next parent population
18:     Save the optimal paths obtained so far
19: end while
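The circular Jaccard distance used in lines 9-14 can be illustrated as follows. The precise definition below is an assumption made for illustration: the Jaccard distance between the directed edge sets of two tours, minimized over the two traversal directions of one tour. Using edge sets makes the measure invariant to the starting city, and taking the minimum over directions makes it invariant to the direction of travel.

```python
def circular_jaccard_distance(tour_a, tour_b):
    """Sketch of a circular Jaccard distance between two tours: Jaccard
    distance on directed edge sets, minimized over the two traversal
    directions of tour_b (edge sets give rotation invariance for free)."""
    def edges(tour):
        n = len(tour)
        return {(tour[i], tour[(i + 1) % n]) for i in range(n)}
    def jd(ea, eb):
        return 1.0 - len(ea & eb) / len(ea | eb)
    ea = edges(tour_a)
    return min(jd(ea, edges(tour_b)),
               jd(ea, edges(list(reversed(tour_b)))))
```

Under this measure, both a rotation such as $ [2, 3, 0, 1] $ and the reversal $ [3, 2, 1, 0] $ of the tour $ [0, 1, 2, 3] $ have distance 0, so only genuinely different cycles pass the dissimilarity test in step 10.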
Name | Cities | Optimal solutions | Shortest path length | Name | Cities | Optimal solutions | Shortest path length
MSTSP1 | 9 | 3 | 680 | MSTSP14 | 34 | 16 | 3575 |
MSTSP2 | 10 | 4 | 1265 | MSTSP15 | 22 | 72 | 9455 |
MSTSP3 | 10 | 13 | 832 | MSTSP16 | 33 | 64 | 8761 |
MSTSP4 | 11 | 4 | 803 | MSTSP17 | 35 | 10 | 9061 |
MSTSP5 | 12 | 2 | 754 | MSTSP18 | 39 | 20 | 23763 |
MSTSP6 | 12 | 4 | 845 | MSTSP19 | 42 | 20 | 14408 |
MSTSP7 | 10 | 56 | 130 | MSTSP20 | 45 | 20 | 10973 |
MSTSP8 | 12 | 110 | 1344 | MSTSP21 | 48 | 4 | 6767 |
MSTSP9 | 10 | 4 | 72 | MSTSP22 | 55 | 9 | 10442 |
MSTSP10 | 10 | 4 | 72 | MSTSP23 | 59 | 10 | 24451 |
MSTSP11 | 10 | 14 | 78 | MSTSP24 | 60 | 36 | 9614 |
MSTSP12 | 15 | 196 | 130 | MSTSP25 | 66 | 26 | 9521 |
MSTSP13 | 28 | 70 | 3055 |
NAME | PCC | DC | CES | JD | NAME | PCC | DC | CES | JD |
MSTSP1 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP14 | 0.813 | 0.905 | 0.766 | 0.905 |
MSTSP2 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP15 | 0.684 | 0.804 | 0.679 | 0.795 |
MSTSP3 | 0.835 | 0.835 | 0.993 | 0.835 | MSTSP16 | 0.629 | 0.748 | 0.586 | 0.760 |
MSTSP4 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP17 | 0.325 | 0.650 | 0.382 | 0.650 |
MSTSP5 | 1.000 | 1.000 | 0.907 | 1.000 | MSTSP18 | 0.000 | 0.000 | 0.224 | 0.000 |
MSTSP6 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.700 | 0.237 | 0.700 |
MSTSP7 | 0.751 | 0.690 | 0.879 | 0.690 | MSTSP20 | 0.160 | 0.130 | 0.172 | 0.520 |
MSTSP8 | 0.619 | 0.520 | 0.801 | 0.520 | MSTSP21 | 0.813 | 0.813 | 0.120 | 0.813 |
MSTSP9 | 1.000 | 1.000 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.000 | 0.000 |
MSTSP10 | 0.929 | 1.000 | 1.000 | 0.591 | MSTSP23 | 0.000 | 0.333 | 0.000 | 0.000 |
MSTSP11 | 0.941 | 0.982 | 0.967 | 0.983 | MSTSP24 | 0.188 | 0.207 | 0.118 | 0.313 |
MSTSP12 | 0.291 | 0.292 | 0.671 | 0.315 | MSTSP25 | 0.000 | 0.069 | 0.000 | 0.000 |
MSTSP13 | 0.743 | 0.775 | 0.743 | 0.950 | BRPS | 7 | 14 | 9 | 15 |
NAME | JD | CJD | NAME | JD | CJD |
MSTSP1 | 1.000 | 1.000 | MSTSP14 | 0.905 | 0.942 |
MSTSP2 | 1.000 | 1.000 | MSTSP15 | 0.795 | 0.823 |
MSTSP3 | 0.835 | 0.875 | MSTSP16 | 0.760 | 0.760 |
MSTSP4 | 1.000 | 1.000 | MSTSP17 | 0.650 | 0.710 |
MSTSP5 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.000 |
MSTSP6 | 1.000 | 1.000 | MSTSP19 | 0.700 | 0.642 |
MSTSP7 | 0.690 | 0.782 | MSTSP20 | 0.520 | 0.349 |
MSTSP8 | 0.520 | 0.752 | MSTSP21 | 0.813 | 0.813 |
MSTSP9 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 |
MSTSP10 | 0.591 | 1.000 | MSTSP23 | 0.000 | 0.175 |
MSTSP11 | 0.983 | 0.988 | MSTSP24 | 0.313 | 0.299 |
MSTSP12 | 0.315 | 0.547 | MSTSP25 | 0.000 | 0.062 |
MSTSP13 | 0.950 | 0.940 | BRPS | 12 | 19 |
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.684 | 0.973 | 1.000 | 1.000 | MSTSP14 | 0.087 | 0.172 | 0.883 | 0.942 |
MSTSP2 | 0.804 | 0.959 | 1.000 | 1.000 | MSTSP15 | 0.004 | 0.416 | 0.732 | 0.823 |
MSTSP3 | 0.497 | 0.936 | 1.000 | 0.875 | MSTSP16 | 0.000 | 0.054 | 0.554 | 0.760 |
MSTSP4 | 0.724 | 0.932 | 1.000 | 1.000 | MSTSP17 | 0.000 | 0.044 | 0.605 | 0.710 |
MSTSP5 | 0.989 | 0.846 | 1.000 | 1.000 | MSTSP18 | 0.000 | 0.031 | 0.571 | 0.000 |
MSTSP6 | 0.643 | 0.877 | 1.000 | 1.000 | MSTSP19 | 0.000 | 0.007 | 0.168 | 0.642 |
MSTSP7 | 0.125 | 0.769 | 0.923 | 0.782 | MSTSP20 | 0.000 | 0.000 | 0.165 | 0.349 |
MSTSP8 | 0.137 | 0.578 | 0.772 | 0.752 | MSTSP21 | 0.012 | 0.000 | 0.023 | 0.813 |
MSTSP9 | 0.768 | 0.974 | 1.000 | 1.000 | MSTSP22 | 0.000 | 0.000 | 0.013 | 0.000 |
MSTSP10 | 0.813 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.000 | 0.000 | 0.016 | 0.175 |
MSTSP11 | 0.459 | 0.949 | 1.000 | 0.988 | MSTSP24 | 0.000 | 0.000 | 0.010 | 0.299 |
MSTSP12 | 0.090 | 0.331 | 0.535 | 0.547 | MSTSP25 | 0.000 | 0.000 | 0.002 | 0.062 |
MSTSP13 | 0.025 | 0.096 | 0.611 | 0.940 | BRPS | 0 | 0 | 13 | 19 |
NAME | NACS | NGA | NMA | CJD-MSO | NAME | NACS | NGA | NMA | CJD-MSO |
MSTSP1 | 0.788 | 0.980 | 1.000 | 1.000 | MSTSP14 | 0.876 | 0.844 | 0.981 | 0.988 |
MSTSP2 | 0.894 | 0.972 | 1.000 | 1.000 | MSTSP15 | 0.744 | 0.847 | 0.932 | 0.902 |
MSTSP3 | 0.757 | 0.957 | 1.000 | 0.916 | MSTSP16 | 0.680 | 0.783 | 0.926 | 0.918 |
MSTSP4 | 0.809 | 0.947 | 1.000 | 1.000 | MSTSP17 | 0.765 | 0.803 | 0.944 | 0.881 |
MSTSP5 | 0.983 | 0.916 | 1.000 | 1.000 | MSTSP18 | 0.671 | 0.704 | 0.932 | 0.737 |
MSTSP6 | 0.843 | 0.943 | 1.000 | 1.000 | MSTSP19 | 0.675 | 0.699 | 0.865 | 0.862 |
MSTSP7 | 0.566 | 0.869 | 0.945 | 0.865 | MSTSP20 | 0.745 | 0.671 | 0.865 | 0.826 |
MSTSP8 | 0.652 | 0.838 | 0.898 | 0.838 | MSTSP21 | 0.773 | 0.628 | 0.834 | 0.870 |
MSTSP9 | 0.820 | 0.975 | 1.000 | 1.000 | MSTSP22 | 0.713 | 0.409 | 0.816 | 0.874 |
MSTSP10 | 0.850 | 0.969 | 1.000 | 1.000 | MSTSP23 | 0.671 | 0.344 | 0.829 | 0.839 |
MSTSP11 | 0.758 | 0.963 | 1.000 | 0.990 | MSTSP24 | 0.724 | 0.319 | 0.811 | 0.897 |
MSTSP12 | 0.732 | 0.809 | 0.871 | 0.857 | MSTSP25 | 0.725 | 0.270 | 0.747 | 0.859 |
MSTSP13 | 0.752 | 0.792 | 0.922 | 0.983 |