
An Exploration of Two Evolutionary Computing Techniques for Function Optimization

2009-04-29 | LUAN Shao-jun, GAO Xue-dong
China Management Informationization, 2009, Issue 15

Abstract: In this paper, we implement two evolutionary computations (ECs): a particle swarm optimizer (PSO) and a Society of Hill-climbers (SoHC). Each EC is implemented to evolve a solution whose fitness exceeds a threshold for the F7 function. We use the same parameter sets to compare the performance of the two ECs in terms of success rate, average number of function evaluations, and average best fitness.

Key words: Evolutionary Computing; PSO Full-model; Function Optimization

doi:10.3969/j.issn.1673-0194.2009.15.021

CLC number: TP3; Article character: A; Article ID: 1673-0194(2009)15-0068-04

1 Introduction

In this paper, we implement two ECs: a particle swarm optimizer (PSO) and a Society of Hill-climbers (SoHC). Each EC is implemented to evolve a solution whose fitness exceeds a threshold for the F7 function.

Hill-climbers are local search techniques, so it is important to define the neighborhood of the local search. A Society of Hill-climbers (SoHC), as proposed by Sebag and Schoenauer (1997), can be seen as a (μ+λ)-EC combined with a repoussoir, an adaptive external memory structure that is used to 'push' the search away from known suboptimal solutions[1].

The aim of this paper is to compare the performance of the two types of algorithms according to their success rates, average numbers of function evaluations, and average best fitness.

Section 2 describes the problem to be solved with the two types of Evolutionary Computation. Section 3 introduces the algorithms and their implementation. Section 4 describes the experimental parameter setup. Section 5 presents the experimental results and analyzes them.

2 Problem Statement

In this paper, a particle swarm optimizer (PSO) Full-model and a Society of Hill-climbers (SoHC) are implemented to evolve a solution with fitness above a threshold (23.30923) for the F7 function:

F7(x,y)=(x2+y2)0.25sin(50(x2+y2)0.1)2+1.0」

The problem allows a maximum of 4000 function evaluations, and x and y take values in the range [-100, 100].
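For concreteness, the objective above can be written as a short function (an illustrative Python sketch; the name `f7` is ours):

```python
import math

def f7(x, y):
    """F7 objective, to be maximized over x, y in [-100, 100]."""
    r2 = x * x + y * y
    return r2 ** 0.25 * (math.sin(50.0 * r2 ** 0.1) ** 2 + 1.0)
```

The function is radially symmetric and equals 0 at the origin, while the sin² term produces many concentric local optima, which is what makes the problem hard for gradient-based search.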

It is obvious that this objective function is complex, so we cannot simply use gradient descent or analytical techniques to find the optimal solution.

We will apply the PSO Full-model and the SoHC to solve the problem.

3 Evolutionary Computation (ECs) Implementation

3.1 The PSO Full-model (using the off-the-shelf parameter setting)

We set φ1>0 and φ2>0 to get Full Model swam type for using Particle Swarm Optimization for F7 function optimization here. And we use the off-the-shelf parameter setting as the base algorithm.

The PSO Full-model algorithm is implemented with the following steps:

Step 1 Initialization: each individual has x and y values, created randomly within the range [-100, 100]. Each x has a velocity Vx and each y has a velocity Vy; like x and y, Vx and Vy are initialized to random values within [-100, 100]. The μ generated individuals are evaluated by the fitness function F7.

We initialize each individual's best value P to the individual's initial position.

We obtain the global neighborhood best Pg by selecting the best individual in the population according to fitness.

Step 2 Swarm Moving: at each iteration, the μ individuals first update their velocities and then move. The update rules are:

When Xmin ≤ Xi ≤ Xmax,

Vi = K × (Vi + φ1 × rand() × (Pi − Xi) + φ2 × rand() × (Pg − Xi));

otherwise,

Vi = 0.

Xi+1 = Xi + Vi.

If Xi > Xmax, set Xi to Xmax; if Xi < Xmin, set Xi to Xmin.

Here K = 2 / |2 − φ − √(φ² − 4φ)|, with φ = φ1 + φ2 = 4.1.

Then the individuals are evaluated, and the P and Pg values are updated. Pg is updated asynchronously: after each individual is generated, it is immediately used to update Pg.

Step 2 repeats until the best fitness exceeds the threshold (23.30923) or the total number of evaluations reaches 4000.

The whole process runs 100 times for each parameter set.
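The moving step above can be sketched per coordinate as follows (an illustrative Python sketch; the function name and the even φ1/φ2 split are ours, and the X-range gate on the velocity update follows the off-the-shelf setting described above):

```python
import math
import random

PHI = 4.1
PHI1 = PHI2 = PHI / 2.0                                      # one of the tried splits
K = 2.0 / abs(2.0 - PHI - math.sqrt(PHI * PHI - 4.0 * PHI))  # constriction factor, ~0.73
XMIN, XMAX = -100.0, 100.0

def move(x, v, p, g):
    """One coordinate's Step-2 update: rebuild the velocity only while x is
    inside the search range, move, then clamp the position back into range.
    p is this individual's best value, g the global neighborhood best."""
    if XMIN <= x <= XMAX:
        v = K * (v + PHI1 * random.random() * (p - x)
                   + PHI2 * random.random() * (g - x))
    else:
        v = 0.0
    x = x + v
    return max(XMIN, min(XMAX, x)), v
```

With φ = 4.1 the constriction factor evaluates to about 0.7298, the standard value for this setting.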

3.2 A Society of Hill-climbers

The Society of Hill-climbers is composed of a (μ+λ)-EC and a repoussoir, an adaptive external memory structure used to 'push' the search away from known suboptimal solutions. This algorithm has six parameters in total: μ, λ, M, R, α, T.

μ represents the number of Hill-climbers.

λ represents the number of offspring created.

M represents the number of bits to be mutated to create an offspring.

R represents the percentage of the worst-fit candidate solutions (CSs) that are used to create the AvgWorst in order to update the repeller.

α represents the decay rate for the components of R.

T is the tournament size for selecting a variable (gene) of the parent to mutate.

Among these parameters, the only sensitive one for historical societies is M.

The basic steps of this algorithm are:

Step 1 Initialization: the population size is μ. The first generation is produced randomly. Values are represented in binary code and must lie in the range [-100, 100]. Each component of the repoussoir is initialized to 0.5.
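The binary representation can be decoded with a linear mapping onto [-100, 100] (a sketch; the per-variable bit length of 10 follows Section 5.2, and the helper name is ours):

```python
def decode(bits, lo=-100.0, hi=100.0):
    """Map a binary gene (list of 0/1 bits) linearly onto [lo, hi]."""
    n = int("".join(map(str, bits)), 2)
    return lo + (hi - lo) * n / (2 ** len(bits) - 1)
```

All-zero bits decode to the lower bound and all-one bits to the upper bound, so every gene is guaranteed to stay inside the search range.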

Step 2 Create offspring: offspring creation is more complex. We create λ children in total, so every parent creates λ/μ children. We use a tournament method to select the M bits of the parent to mutate. The tournament method is described as follows:

The tournament function has three parameters: the parent, the repoussoir, and T. Randomly select T bit positions i1, i2, …, iT from the parent's binary code, and return the ik for which |parent(ik) − Repoussoir(ik)| is minimal. The selected bit is then flipped from 1 to 0 or from 0 to 1. M bits are mutated in this way.
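The tournament-based mutation can be sketched as follows (an illustrative Python sketch; function names are ours, and we pass the parent bit string directly):

```python
import random

def tournament_bit(parent, repoussoir, T):
    """Pick T random gene positions; return the one where the parent is
    closest to the repoussoir, so that flipping it pushes the search away."""
    candidates = random.sample(range(len(parent)), T)
    return min(candidates, key=lambda i: abs(parent[i] - repoussoir[i]))

def mutate(parent, repoussoir, M, T):
    """Create one offspring by flipping M tournament-selected bits."""
    child = list(parent)
    for _ in range(M):
        i = tournament_bit(child, repoussoir, T)
        child[i] = 1 - child[i]
    return child
```

Because the flipped bit is the one most similar to the repoussoir, offspring are biased away from the remembered region of poor solutions.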

Step 3 Parent Selection: we now have μ + λ individuals in total. We select the μ best of them as the parents of the next generation.

Step 4 Update Repoussoir: the hill-climbers are provided with some memory of past generations, the repeller, for the next generation, so the repoussoir must be updated with each new generation. We select the R% worst individuals from the μ + λ individuals and compute the average worst bits (AvgWorst), an array whose components are the bitwise averages of those worst individuals. The formula Repeller = (1 − α) × Repeller + α × AvgWorst is used to update the repoussoir.
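Step 4 can be sketched as follows (illustrative Python; names are ours, and we assume fitness is being maximized, so the worst-fit individuals are those with the lowest fitness):

```python
def update_repoussoir(repoussoir, population, fitnesses, R, alpha):
    """Move the repeller toward the bitwise average of the worst R fraction:
    Repeller = (1 - alpha) * Repeller + alpha * AvgWorst, component-wise."""
    order = sorted(range(len(population)), key=lambda i: fitnesses[i])
    k = max(1, int(R * len(population)))          # number of worst individuals kept
    worst = [population[i] for i in order[:k]]
    avg_worst = [sum(bits) / k for bits in zip(*worst)]
    return [(1 - alpha) * r + alpha * a for r, a in zip(repoussoir, avg_worst)]
```

With α = 1 the repeller is replaced outright by AvgWorst each generation; smaller α makes the memory decay gradually.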

The results are shown in Table 3.

4 Experimental Setup

To compare the two EC algorithms, we use the same parameter values and then identify, for each algorithm, the parameter set that gives the best result.

(1) μ (Population size):{2,12,30,50,100}.

(2) λ (Offspring size ):{4, 24,60,100,200}.

(3) M (the number of bits to be mutated to create an offspring): {1,2, 5}.

(4) R (the percentage of the worst-fit CSs used to create the AvgWorst in order to update the repeller): {0.5}.

(5) α (the decay rate for the components of R): {0.01, 0.1, 1.0}.

(6) T (the tournament size for selecting a variable (gene) of the parent to mutate):{2,4,6}

The maximum number of function evaluations for both algorithms is 4000. For each parameter set, the algorithm runs 100 times, and we calculate the success rate, average number of function evaluations, standard deviation of function evaluations, average best fitness found, and standard deviation of the best fitness found.
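The per-parameter-set statistics can be collected as follows (a sketch with names of ours; whether failed runs count their full 4000 evaluations toward the average is our assumption):

```python
import statistics

def summarize(runs):
    """Each run is a tuple (succeeded, n_evaluations, best_fitness);
    returns the five statistics reported in the results tables."""
    n = len(runs)
    evals = [e for _, e, _ in runs]
    best = [b for _, _, b in runs]
    return {
        "S.R": 100.0 * sum(1 for ok, _, _ in runs if ok) / n,
        "A.F.E": statistics.mean(evals),
        "S.F.E": statistics.pstdev(evals),
        "A.B.F": statistics.mean(best),
        "S.B.F": statistics.pstdev(best),
    }
```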

5 Results

The results tables use the following notation:

μ: population size

φ1: cognition component coefficient

φ2: social component coefficient

λ : offspring size

M : the number of mutated bits

R : the percentage of the worst-fit CSs used to update the repeller

α : the decay rate for the components of R

T : the tournament size for selecting a variable of the parent to mutate

S.R : Success Rate (%)

A.F.E : Average number of function evaluations

S.F.E : Standard deviation of function evaluations

A.B.F : Average best fitness found

S.B.F : Standard deviation of the best fitness found

# : don't care (parameter not used by the algorithm)

5.1 The PSO Full-model (using the off-the-shelf parameter setting)

We use the off-the-shelf parameter setting as our base algorithm. It uses a global neighborhood and updates the Pg value asynchronously. To search for better results, we try other swarm sizes besides 30, and other φ1 and φ2 values besides the 2.8 and 1.3 of the off-the-shelf setting:

μ = {2, 12, 30, 50, 100}

φ= 4.1

φ1 = {1.00, 1.30, 2.05, 2.80}

φ2 = φ - φ1

Table 1 lists the results of using the X value to gate the V update, as in the off-the-shelf setting:

When Xmin ≤ Xi ≤ Xmax,

Vi = K × (Vi + φ1 × rand() × (Pi − Xi) + φ2 × rand() × (Pg − Xi));

otherwise, Vi = 0.

Table 2 lists the results of NOT using the X value to gate the V update:

Vi = K × (Vi + φ1 × rand() × (Pi − Xi) + φ2 × rand() × (Pg − Xi)), regardless of whether Xmin ≤ Xi ≤ Xmax.

The results show that NOT using the X value to gate the V update gives better results (Table 2) for F7 function optimization.

From Table 1, when the X value gates the V update, the best success rate is 77%, obtained with the off-the-shelf parameter values.

Regarding swarm size in Table 1, success rates with 50 or 30 particles are better than with 12 or 100, and 2 particles gives the worst result.

Also from Table 1, when the X value gates the V update, φ1 = 2.8 generally gives better results than the other φ1 values we tried.

From Table 2, when the X value does not gate the V update, the best success rate is 100% with an average of 442.1 function evaluations, obtained with μ = 12, φ1 = 2.05, and φ2 = 2.05.

Regarding swarm size in Table 2, the results worsen as the swarm size changes from 12 to 30 to 50 to 100, and finally to 2. As in Table 1, 2 particles gives the worst result.

Also from Table 2, as in Table 1, φ1 = 2.8 generally gives better results than the other values we tried, except that with 12 particles, φ1 = 2.05 gives somewhat better results than 2.8.

5.2 A Society of Hill-climbers

A Society of Hill-climbers (SoHC) can be seen as a (μ+λ)-EC combined with a repoussoir, an adaptive external memory structure that 'pushes' the search away from known suboptimal solutions. We set six parameters in total to check the algorithm's performance: <μ, λ, M, R, α, T>. Their meanings were introduced in Section 3.2.

The six parameter values we set are:

μ = {2,12,30,50,100};

λ= {4, 24,60,100,200};

M = {1, 2, 5};

R = {0.5};

α = {0.01,0.1,1.0};

T = {2, 4, 6};

Table 1 PSO Full-model (V value determined by X value)

Table 2 PSO Full-model (V value not determined by X value)

A bigger population size increases the diversity of each generation. We set the population size μ = {2, 12, 30, 50, 100}. In Table 3, when μ equals 50 or 100, the highest success rate reaches 58%.

Parameter M plays an important role (see Table 3). M = 1 or 2 performs better than M = 5. Comparing only M = 1 and M = 2: with population sizes of 2, 12, or 30, M = 2 performs slightly better than M = 1; above 30, Table 3 shows M = 1 performing slightly better than M = 2.

Parameter α does not have a big effect on the success rate in this algorithm. We believe it would have an effect if the bit number were larger.

Parameter T is the tournament size used when selecting bits for mutation. A larger T increases the chance that the two offspring created from one parent differ from each other, but in this algorithm T has no big effect on the success rate.

Over all parameter values, the success rate reaches its highest, 59%, when μ = 30, λ = 60, M = 1, R = 0.5, α = 1, and T = 2. Parameter M plays a more important role than the population size μ in finding a solution.

The tests above use a bit number (chromosome length) of 10. After testing other values, we found that a bit number of 5 gives much better results: almost all parameter sets reach a 100% success rate, as Table 4 shows.

6 Discussions and summary

Table 5 compares the performance of the two ECs. The PSO Full-model is much more effective in success rate, while the two ECs achieve the same average best fitness. Hence, PSO performs better on this problem.

Table 3 SoHC results with a bit number of 10

Table 5 Comparison of the two ECs

Type of EC      μ   φ1    φ2    λ   M  R    α  T  S.R  A.F.E  S.F.E   A.B.F    S.B.F
PSO Full-model  12  2.05  2.05  #   #  #    #  #  100  442.1  132.65  23.3092  0.0000027
SoHC            30  #     #     60  1  0.5  1  2  59   2628   1348.1  23.3092  0.000098

Reference

[1] M. Sebag, M. Schoenauer. A Society of Hill-climbers[C]//Proceedings of the Fourth IEEE International Conference on Evolutionary Computation, 1997: 319-324.
