Abstract: In this paper, we implement two evolutionary computation (EC) algorithms: a particle swarm optimizer (PSO) and a Society of Hill-climbers (SoHC). Each EC is implemented to evolve a solution whose fitness exceeds a threshold for the F7 function. We use the same parameter sets to compare the performance of the two ECs in terms of success rate, average number of function evaluations, and average best fitness.
Key words: Evolutionary Computing; PSO Full-model; Function Optimization
doi:10.3969/j.issn.1673-0194.2009.15.021
CLC number: TP3    Article character: A    Article ID: 1673-0194(2009)15-0068-04
1 Introduction
In this paper, we implement two evolutionary computation (EC) algorithms: a particle swarm optimizer (PSO) and a Society of Hill-climbers (SoHC). Each EC is implemented to evolve a solution whose fitness exceeds a threshold for the F7 function.
Hill-climbers are local search techniques, so it is important to define the neighborhood of the local search. A Society of Hill-climbers (SoHC), as proposed by Sebag and Schoenauer (1997), can be seen as composed of a (μ+λ)-EC and a repoussoir, an adaptive external memory structure that is used to 'push' the search away from known suboptimal solutions[1].
The aim of this paper is to compare the performance of the two types of algorithms in terms of success rate, average number of function evaluations, and average best fitness.
Section 2 describes the problem to be solved with the two types of evolutionary computation. Section 3 introduces these algorithms and their implementation. Section 4 describes the experimental parameter setup. Section 5 presents the experimental results and analyzes them.
2 Problem Statement
In this paper, a particle swarm optimizer (PSO) Full-model and a Society of Hill-climbers (SoHC) are implemented to evolve a solution whose fitness exceeds a threshold (23.30923) for the F7 function:
F7(x, y) = (x^2 + y^2)^0.25 × [sin^2(50 × (x^2 + y^2)^0.1) + 1.0]
The problem allows a maximum of 4000 function evaluations, and x and y take values in the range [-100, 100].
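As a concrete reference, the objective above can be implemented in a few lines; this is a minimal sketch, and the function name `f7` and constant `THRESHOLD` are our own labels for the quantities in the problem statement:

```python
import math

def f7(x, y):
    """F7(x, y) = (x^2 + y^2)^0.25 * (sin^2(50 * (x^2 + y^2)^0.1) + 1.0)."""
    r2 = x * x + y * y
    return (r2 ** 0.25) * (math.sin(50.0 * r2 ** 0.1) ** 2 + 1.0)

# Success means finding x, y in [-100, 100] with fitness above this value.
THRESHOLD = 23.30923
```

Since sin^2 lies in [0, 1], the value at any point is bounded by 2 × (x^2 + y^2)^0.25, so the threshold is only reachable near the corners of the search box, which is what makes the problem hard for gradient methods.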
It is obvious that this objective function is complex, so we could not simply use gradient descent or analytical techniques to find the optimal solution.
We will apply the PSO Full-model and the SoHC to solve the problem.
3 Evolutionary Computation (EC) Implementation
3.1 The PSO Full-model (using the off-the-shelf parameter setting)
We set φ1 > 0 and φ2 > 0 to obtain the Full-model swarm type when using particle swarm optimization for F7 function optimization. We use the off-the-shelf parameter setting as the base algorithm.
The PSO Full-model algorithm is implemented with the following steps:
Step 1 Initialization: each individual has x and y values, created randomly within the range [-100, 100]. Each x has a velocity value Vx and each y has a velocity value Vy. Like x and y, the Vx and Vy values are initialized to random values within the range [-100, 100]. The μ generated individuals are evaluated by the fitness function F7.
We initialize each individual's best value P to the individual's initial position.
We obtain the global neighborhood best Pg by selecting the best individual in the population according to fitness value.
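Step 1 can be sketched as follows; this is a minimal illustration, and the function name, tuple layout, and `fitness` parameter are our own, not from the paper:

```python
import random

def initialize_swarm(mu, fitness, lo=-100.0, hi=100.0):
    """Create mu particles with random (x, y) positions and (Vx, Vy)
    velocities in [lo, hi], set each personal best P to the initial
    position, and pick the global neighborhood best Pg by fitness
    (the F7 task maximizes fitness)."""
    X = [(random.uniform(lo, hi), random.uniform(lo, hi)) for _ in range(mu)]
    V = [(random.uniform(lo, hi), random.uniform(lo, hi)) for _ in range(mu)]
    P = list(X)  # personal bests start at the initial positions
    Pg = max(P, key=lambda p: fitness(*p))  # best individual of the population
    return X, V, P, Pg
```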
Step 2 Swarm moving: at each iteration, the μ individuals first update their velocities and then move. The formulas are listed below:
When Xmin < Xi < Xmax:
Vi = K × (Vi + φ1 × rand() × (Pi - Xi) + φ2 × rand() × (Pg - Xi));
otherwise, Vi = 0.
Xi+1 = Xi + Vi. If Xi > Xmax, the X value is set to Xmax; if Xi < Xmin, it is set to Xmin.
Here K = 2/|2 - φ - √(φ^2 - 4φ)|; φ = 4.1; φ1 + φ2 = φ.
The individuals are then evaluated, and the P and Pg values are updated. The Pg value is updated asynchronously, which means that after each individual is generated, it is used to update the Pg value.
Step 2 is repeated until the best fitness exceeds the threshold (23.30923) or the total number of evaluations reaches 4000. The whole process is run 100 times for each parameter set.
3.2 A Society of Hill-climbers
A Society of Hill-climbers is composed of a (μ+λ)-EC and a repoussoir, an adaptive external memory structure that is used to 'push' the search away from known suboptimal solutions.
This algorithm has 6 parameters in total: μ, λ, M, R, α, and T. μ is the number of hill-climbers; λ is the number of offspring created; M is the number of bits mutated to create an offspring; R is the percentage of the worst-fit candidate solutions (CSs) used to create AvgWorst in order to update the repeller; α is the decay rate for the components of the repeller; and T is the tournament size for selecting a variable (gene) of the parent to mutate. Among these parameters, the only sensitive one for historical societies is M.
The basic steps of this algorithm are:
Step 1 Initialization: the population size is μ. The first generation is produced randomly. Values are represented in binary code, and all values lie in the range [-100, 100]. Each component of the repoussoir is initialized to 0.5.
Step 2 Create offspring: creating offspring is more involved. A total of λ children are created, so every parent creates λ/μ children. A tournament method is used to select each of the M bits of the parent to mutate. The tournament method is described as follows: the tournament function has 3 parameters: the parent, the repoussoir, and T.
Randomly select T bits i1, i2, …, iT from the parent's binary code list; return ik such that |parent(ik) - Repoussoir(ik)| is minimal. The selected bit is then flipped from 1 to 0 or from 0 to 1; M bits are mutated in total.
Step 3 Parent selection: there are now μ + λ individuals in total. The μ best of these μ + λ individuals are selected as the parents of the next generation.
Step 4 Update repoussoir: hill-climbers are provided with some memory of past generations, the repeller, for the next generation. The repoussoir must be updated whenever a new generation is obtained. The R% worst individuals are selected from the μ + λ individuals, and the average worst bits (AvgWorst) are calculated; this is an array that stores each averaged worst bit as a component. The formula Repeller = (1 - α) × Repeller + α × AvgWorst is used to update the repoussoir. For the results, see Table 3.
4 Experimental Setup
To compare the two EC algorithms, we use the same parameter values and finally identify the parameter set that gives each EC algorithm its best result.
(1) μ (population size): {2, 12, 30, 50, 100}.
(2) λ (offspring size): {4, 24, 60, 100, 200}.
(3) M (the number of bits mutated to create an offspring): {1, 2, 5}.
(4) R (the percentage of the worst-fit CSs used to create AvgWorst in order to update the repeller): {0.5}.
(5) α (the decay rate for the components of the repeller): {0.01, 0.1, 1.0}.
(6) T (the tournament size for selecting a variable (gene) of the parent to mutate): {2, 4, 6}.
The maximum number of function evaluations for both algorithms is 4000. For each parameter set, the algorithm is run 100 times, and the success rate, average number of function evaluations, standard deviation of function evaluations, average best fitness found, and standard deviation of the best fitness found are calculated.
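Before turning to the results, the SoHC offspring-creation tournament and repoussoir update described in Section 3.2 can be sketched as follows; this is a minimal sketch over 0/1 bit lists, and the function names are our own, not from the paper:

```python
import random

def tournament_bit(parent, repoussoir, T):
    """Randomly pick T bit positions and return the one where the parent
    is closest to the repoussoir; mutating it pushes the child away from
    the memorized bad region."""
    candidates = random.sample(range(len(parent)), T)
    return min(candidates, key=lambda i: abs(parent[i] - repoussoir[i]))

def mutate(parent, repoussoir, M, T):
    """Create one child by flipping M tournament-selected bits."""
    child = list(parent)
    for _ in range(M):
        i = tournament_bit(child, repoussoir, T)
        child[i] = 1 - child[i]  # flip 0 -> 1 or 1 -> 0
    return child

def update_repoussoir(repoussoir, worst_individuals, alpha):
    """Repeller = (1 - alpha) * Repeller + alpha * AvgWorst, component-wise,
    where AvgWorst averages each bit over the R% worst individuals."""
    n = len(worst_individuals)
    avg_worst = [sum(ind[i] for ind in worst_individuals) / n
                 for i in range(len(repoussoir))]
    return [(1 - alpha) * r + alpha * a for r, a in zip(repoussoir, avg_worst)]
```

The exponential-decay form of the update means that with α = 1 the repoussoir simply tracks the latest AvgWorst, while small α gives it a long memory of past bad solutions.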
5 Results
The results tables use the following notation:
μ: population size
φ1: cognition component coefficient
φ2: social component coefficient
λ: offspring size
M: the number of mutated bits
R: the percentage of worst-fit CSs
α: the decay rate for the components of the repeller
T: the tournament size for selecting a variable of the parent to mutate
S.R: success rate (%)
A.F.E: average number of function evaluations
S.F.E: standard deviation of function evaluations
A.B.F: average best fitness found
S.B.F: standard deviation of the best fitness found
#: don't care
5.1 The PSO Full-model (using the off-the-shelf parameter setting)
We use the off-the-shelf parameter setting as our base algorithm. It uses a global neighborhood and updates the Pg value asynchronously. To test for better results, we try swarm sizes other than 30 and φ1 and φ2 values other than the 2.8 and 1.3 of the off-the-shelf setting:
μ = {2, 12, 30, 50, 100}; φ = 4.1; φ1 = {1.00, 1.30, 2.05, 2.80}; φ2 = φ - φ1.
Table 1 lists the results of using the X value to determine the V value, as in the off-the-shelf setting:
When Xmin < Xi < Xmax: Vi = K × (Vi + φ1 × rand() × (Pi - Xi) + φ2 × rand() × (Pg - Xi)); otherwise, Vi = 0.
Table 2 lists the results of NOT using the X value to determine the V value:
Vi = K × (Vi + φ1 × rand() × (Pi - Xi) + φ2 × rand() × (Pg - Xi)), regardless of whether Xi lies between Xmin and Xmax.
We can see from the results that not using the X value to determine the V value gives better results for F7 function optimization (Table 2). From Table 1, we can see that when the X value is used to determine the V value, the best success rate is 77%, obtained with the off-the-shelf parameter values. From the particle numbers in Table 1, we can see that a swarm of 50 or 30 particles gives a better success rate than 12 or 100, and 2 particles gives the worst result. Also from Table 1, when the X value is used to determine the V value, φ1 = 2.8 generally gives better results than the other φ1 values we tried.
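The two update rules behind Tables 1 and 2 differ only in the boundary test on X. A minimal sketch of the per-coordinate update (the variable names and the `gated` flag are ours, not from the paper):

```python
import math
import random

PHI = 4.1
# Constriction factor K = 2 / |2 - φ - sqrt(φ^2 - 4φ)| with φ = 4.1 (≈ 0.7298)
K = 2.0 / abs(2.0 - PHI - math.sqrt(PHI * PHI - 4.0 * PHI))

def new_velocity(v, x, p, pg, phi1, phi2,
                 xmin=-100.0, xmax=100.0, gated=True):
    """Constricted velocity update for one coordinate.
    gated=True (Table 1): V is reset to 0 whenever x lies outside (xmin, xmax).
    gated=False (Table 2): the update is applied regardless of x."""
    if gated and not (xmin < x < xmax):
        return 0.0
    return K * (v + phi1 * random.random() * (p - x)
                  + phi2 * random.random() * (pg - x))
```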
From Table 2, we can see that when the X value is not used to determine the V value, the best success rate is 100% with 442.1 function evaluations on average, obtained with μ = 12, φ1 = 2.05, and φ2 = 2.05. From the particle numbers in Table 2, we can see that the results become worse as the particle number changes from 12 to 30, 50, 100, and finally 2. As in Table 1, 2 particles gives the worst result. Also from Table 2, as in Table 1, φ1 = 2.8 generally gives better results than the other φ1 values we tried, except that with 12 particles, φ1 = 2.05 gives somewhat better results than 2.8.
5.2 A Society of Hill-climbers
A Society of Hill-climbers (SoHC) can be seen as composed of a (μ+λ)-EC and a repoussoir, an adaptive external memory structure that is used to 'push' the search away from known suboptimal solutions. In this algorithm, we vary 6 parameters in total to examine the algorithm's performance: <μ, λ, M, R, α, T>. The meanings of these parameters were introduced in Section 3.2. The six parameter value sets are:
μ = {2, 12, 30, 50, 100}; λ = {4, 24, 60, 100, 200}; M = {1, 2, 5}; R = {0.5}; α = {0.01, 0.1, 1.0}; T = {2, 4, 6}.
Table 1 PSO Full-model (V value determined by X value)
Table 2 PSO Full-model (V value not determined by X value)
A larger population size increases the diversity of each generation. We set the population size μ = {2, 12, 30, 50, 100}. In Table 3, when μ is 50 or 100, the success rate reaches as high as 58%. The parameter M plays an important role (see Table 3): M = 1 or M = 2 performs better than M = 5. Comparing only M = 1 and M = 2, we can see that for population sizes of 2, 12, and 30, M = 2 performs slightly better than M = 1.
When the population size is above 30, we can see in Table 3 that M = 1 performs slightly better than M = 2. The parameter α does not have a large effect on the success rate in this algorithm; we think its effect would become visible with a larger bit number. The parameter T is the tournament size used when selecting bits for mutation. A larger T gives a better chance that the two offspring created from one parent differ from each other, but in this algorithm T has no large effect on the success rate. Over all parameter values, the success rate reaches its highest value, 59%, when the population size μ is 30, λ is 60, M is 1, R is 0.5, α is 1, and T is 2. The parameter M plays a more important role than the population size μ in finding a solution. The tests above use a chromosome of 10 bits. After testing other values, we found that setting the bit number to 5 gives much better results: almost all parameter sets reach a 100% success rate. Table 4 shows these results.
6 Discussions and Summary
Table 5 compares the performance of the two ECs. We can see that the PSO Full-model is much more effective in terms of success rate, while the two ECs find the same average best fitness. Hence, PSO performs better on this problem.
Table 3 Results of the SoHC with a 10-bit chromosome
Table 5 Comparison of the two ECs
Types of ECs | μ | φ1 | φ2 | λ | M | R | α | T | S.R | A.F.E | S.F.E | A.B.F | S.B.F
PSO Full-model | 12 | 2.05 | 2.05 | # | # | # | # | # | 100 | 442.1 | 1 326.5 | 23.3092 | 0.000 002 7
SoHC | 30 | # | # | 60 | 1 | 0.5 | 1 | 2 | 59 | 2 628 | 1 348.1 | 23.3092 | 0.000 098
References
[1] M. Sebag, M. Schoenauer. A Society of Hill-climbers[C]// Proceedings of the Fourth IEEE International Conference on Evolutionary Computation, 1997: 319-324.