
A Study to Improve the Method of Ascertaining Attribute Weight Based on Rough Sets Theory

2009-04-29    LIU Cheng, ZHANG Jian-bin, BAO Xin-zhong
China Management Informationization, 2009, No. 15

Abstract: Current studies on ascertaining attribute weights based on rough sets have some shortcomings. In this paper, the attribute significance expressed by rough sets is studied in depth. Aiming at the existing problems, the information presentation of rough sets is shown to be more comprehensive than its algebraic presentation, and a new method of ascertaining attribute weights is then put forward based on rough set conditional entropy. Finally, an example shows that the new method is more reasonable than the old one.

Key words: Weight; Rough Sets; Attribute Significance Degree; Decision Table

doi:10.3969/j.issn.1673-0194.2009.15.034

CLC number: TP224.0    Document code: A    Article ID: 1673-0194(2009)15-0112-03

1 PREFACE

Weight, which reflects the position and function of each element in the process of judgment and decision-making, is crucial, and its accuracy directly affects the final results. There are several common methods of ascertaining attribute weights, such as expert scoring, fuzzy statistics and pairwise comparison sorting. In these methods, however, the weights depend excessively on experts' experience and knowledge, so they sometimes fail to reflect the actual situation objectively. Rough set theory, by contrast, fully reflects the objectivity of the data and requires no prior information beyond the data set to be processed. Therefore, several researchers have studied methods of ascertaining attribute weights based on rough set theory.

Rough set theory [1] is a method for expressing, studying and generalizing incomplete and uncertain knowledge and data. It was first put forward by Professor Pawlak of the Warsaw University of Technology, Poland, in the early 1980s. Because it requires no prior information, rough set theory has been successfully applied in many areas, such as expert systems, machine learning and pattern recognition [2-6]. A method of ascertaining the weights of condition attributes in a decision table was introduced in reference [7] and has been cited in different areas by many scholars. This paper analyzes the shortcomings of the method in reference [7], gives a new method of ascertaining attribute weights based on rough sets, and shows that the new method is more reasonable.

2 ANALYSIS ON THE ORIGINAL METHOD OF ASCERTAINING ATTRIBUTE WEIGHT BASED ON ROUGH SETS THEORY

The importance of the various attributes (indicators) needs to be ascertained because they are not equally important. In rough set theory, we first remove an attribute and then consider how the classification changes without it. If the classification changes greatly after the attribute is removed, the attribute has a strong influence and is highly important; otherwise, its influence and importance are small [4-7]. Based on this characteristic, reference [7] defines the importance of the attributes (indicators) as follows.

Definition 1 [1]: In the decision table S = (U, C, D, V, f), the dependence g_B(D) of the decision attribute set D on the condition attribute (indicator) set B ⊆ C is defined as:

g_B(D) = |POS_B(D)| / |U|

Definition 2 [7] (algebraic definition of attribute significance): In the decision table S = (U, C, D, V, f), the significance degree of a condition attribute (indicator) c ∈ C is defined as:

Sig(c) = g_C(D) - g_{C-{c}}(D)

The weight W_0(c) of the condition attribute (indicator) c is defined as:

W_0(c) = Sig(c) / ∑_{a∈C} Sig(a)

Note: This definition shows that the greater Sig(c) is, the more important the condition attribute (indicator) c is, and hence the greater its weight.
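As an illustration only (not part of the original paper), the following Python sketch shows one way Definitions 1 and 2 could be computed for a decision table stored as a list of row dictionaries; the function names and the table format are assumptions of this sketch.

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices into indiscernibility classes by their values on attrs."""
    blocks = defaultdict(set)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].add(i)
    return list(blocks.values())

def dependency(rows, cond_attrs, dec_attrs):
    """Definition 1: g_B(D) = |POS_B(D)| / |U|."""
    dec_blocks = partition(rows, dec_attrs)
    pos = set()
    for block in partition(rows, cond_attrs):
        # A condition class belongs to the positive region if it lies
        # entirely inside a single decision class.
        if any(block <= d for d in dec_blocks):
            pos |= block
    return len(pos) / len(rows)

def algebraic_weights(rows, cond_attrs, dec_attrs):
    """Definition 2: Sig(c) = g_C(D) - g_{C-{c}}(D), W0(c) = Sig(c) / sum of Sig."""
    g_all = dependency(rows, cond_attrs, dec_attrs)
    sig = {c: g_all - dependency(rows, [a for a in cond_attrs if a != c], dec_attrs)
           for c in cond_attrs}
    total = sum(sig.values())
    # When every attribute is individually dispensable, total is 0 and the
    # normalization breaks down -- the very problem discussed in this section.
    return {c: (sig[c] / total if total else 0.0) for c in cond_attrs}
```

For example, algebraic_weights(rows, ['a', 'b', 'c'], ['D']) would return all-zero weights for the decision table discussed below.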

To better illustrate the shortcoming of this definition, consider the decision table shown in Table 1, whose universe is U = {x1, x2, ..., x11}, with condition attributes {a, b, c} and decision attribute D. Its partitions are:


U/{a,b,c}={{x1,x2},{x3},{x4},{x5},{x6,x7},{x8,x9}, {x10,x11}}.

U/D={{x1,x5,x6,x8,x11},{x2,x3,x4,x7,x9,x10}}.

Therefore:POS(a,b,c)(D)= {x3,x4,x5}; POS(a,b)(D)= {x3,x4,x5}; POS(b,c)(D)= {x3,x4,x5}; POS(a,c)(D)= {x3,x4,x5}. As a result:

ga,b,c(D)=|POS{a,b,c}(D)|/|U|=3/11.

Similar that:ga,b,c(D)=g{a,c}(D)=g{b,c}(D)=3/11.

Then:Sig(a)=g{a,b,c}(D)-g{b,c}(D)=0;Sig(b)=g{a,b,c}(D)-g{a,c}(D)=0; Sig(c)=g{a,b,c}(D)-g{a,b}(D)=0.
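As a quick illustrative check (not taken from the paper), the positive region and dependency above can be reproduced directly from the two quoted partitions, encoding the objects x1, ..., x11 as the integers 1 to 11:

```python
# Partitions quoted from the example above; x1..x11 are encoded as 1..11.
U = set(range(1, 12))
U_by_abc = [{1, 2}, {3}, {4}, {5}, {6, 7}, {8, 9}, {10, 11}]   # U/{a,b,c}
U_by_D = [{1, 5, 6, 8, 11}, {2, 3, 4, 7, 9, 10}]               # U/D

# POS_{a,b,c}(D): union of the condition classes contained in one decision class.
pos = set().union(*(blk for blk in U_by_abc if any(blk <= d for d in U_by_D)))

print(sorted(pos))        # [3, 4, 5]  i.e. {x3, x4, x5}
print(len(pos) / len(U))  # 0.2727...  =  3/11
```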

From the example above, we can see that the weights of the condition attributes cannot be calculated in this case. With respect to individual attributes, each of a, b and c is dispensable in decision Table 1. For this problem, reference [8] first reduces the decision table and then computes the attribute significances on the reduct, where at least one attribute has nonzero significance, and finally applies the weight formula of reference [7]. Although this guarantees that at least one weight is nonzero, it ignores the practical significance of the attributes that receive zero weight and of the attributes removed by the reduction. To solve this problem, we offer a new method of ascertaining weights.

3 A METHOD TO ASCERTAIN WEIGHT BASED ON CONDITIONAL INFORMATION ENTROPY

3.1 Conditional information entropy

The following definitions establish the relationship between knowledge in rough set theory and information entropy, so that the main concepts and operations of rough set theory can be expressed from the information point of view; this is usually called the information presentation of rough set theory.

Definition 3: In the decision table S = (U, C, D, V, f), any attribute set B ⊆ C ∪ D determines a partition U/B = {B_1, B_2, ..., B_t} of U and can therefore be regarded as a random variable on the algebra of subsets of U, with probability distribution:

[B : p] = ( B_1, B_2, ..., B_t ; p(B_1), p(B_2), ..., p(B_t) )

where p(B_j) = |B_j| / |U|, j = 1, 2, ..., t.

Definition 4: In the decision table S = (U, C, D, V, f), relative to the condition attribute set C (U/C = {C_1, C_2, ..., C_m}), the conditional entropy H(D|C) of the decision attribute set D (U/D = {D_1, D_2, ..., D_k}) is defined as:

H(D|C) = -∑_{i=1}^{m} p(C_i) ∑_{j=1}^{k} p(D_j|C_i) log p(D_j|C_i)

where p(D_j|C_i) = |D_j ∩ C_i| / |C_i|, i = 1, 2, ..., m; j = 1, 2, ..., k.

3.2 Attribute significance degree based on conditional information entropy

Definition 5 (information presentation of attribute significance): In the decision table S = (U, C, D, V, f), the significance degree of a condition attribute (indicator) c ∈ C is defined as:

Sig(c) = H(D | (C - {c})) - H(D | C)

The greater Sig(c) is, the more important c is to the decision D given the other condition attributes in C. Compared with the algebraic form of attribute significance, whose definition only considers the attribute's influence on the certain (positive-region) subsets of the universe, the information form also takes the uncertain classification subsets into account. This means that although the significance degree of an attribute may be 0 under the algebraic definition while being nonzero under the information definition, an attribute whose significance is 0 under the information definition is definitely 0 under the algebraic definition [9].

3.3 The method of ascertaining weights based on conditional information entropy

Definition 6: In the decision table S = (U, C, D, V, f), the significance degree and weight of a condition attribute (indicator) c ∈ C are defined as:

NewSig(c) = H(D | (C - {c})) - H(D | C)

w(c) = NewSig(c) / ∑_{a∈C} NewSig(a)
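For illustration, a minimal self-contained Python sketch of Definition 6 might look as follows; the row format, helper names and the base-2 logarithm are assumptions of this sketch rather than part of the paper.

```python
import math
from collections import defaultdict

def conditional_entropy(rows, cond_attrs, dec_attrs):
    """H(D|B) of Definition 4 for a condition attribute set B."""
    cond, dec = defaultdict(set), defaultdict(set)
    for i, row in enumerate(rows):
        cond[tuple(row[a] for a in cond_attrs)].add(i)
        dec[tuple(row[a] for a in dec_attrs)].add(i)
    n, h = len(rows), 0.0
    for ci in cond.values():
        for dj in dec.values():
            p = len(ci & dj) / len(ci)
            if p > 0:
                h -= (len(ci) / n) * p * math.log2(p)
    return h

def entropy_weights(rows, cond_attrs, dec_attrs):
    """Definition 6: NewSig(c) = H(D|C-{c}) - H(D|C), w(c) = NewSig(c) / sum of NewSig."""
    h_all = conditional_entropy(rows, cond_attrs, dec_attrs)
    sig = {c: conditional_entropy(rows, [a for a in cond_attrs if a != c], dec_attrs) - h_all
           for c in cond_attrs}
    total = sum(sig.values())
    return {c: (sig[c] / total if total else 0.0) for c in cond_attrs}
```

Unlike the algebraic weights, NewSig(c) here can be nonzero even when removing c leaves the positive region unchanged, because the conditional entropy also reflects changes in the uncertain classification subsets.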

4 AN EXAMPLE

Now the new method of ascertaining weights is used to calculate the weights of the condition attributes of decision Table 1.

5 CONCLUSION

We analyze the existing method of ascertaining weights based on rough set theory and, aiming at its shortcoming, work out a new method based on conditional information entropy, exploiting the fact that the information presentation of rough sets is more comprehensive than the algebraic presentation; the new method is then verified with an example. The new method of ascertaining weights is more comprehensive, rational and universal than the existing one.

References

[1] Pawlak Z. Rough Sets: Probability Versus Deterministic Approach[J]. International Journal of Man-Machine Studies, 1988, 29(1): 81-95.

[2] Hu X H, Cercone N. Learning in Relational Databases: A Rough Set Approach[J]. Computational Intelligence, 1995, 11(2): 323-338.

[3] Swiniarski R W, Hargis L. Rough Sets as a Front End of Neural-Networks Texture Classifiers[J]. Neurocomputing, 2001, 36(1): 85-102.

[4] Zhang W X, Wu W Z, Liang J Y. The Theory and Method of Rough Sets[M]. Beijing: Science Press, 2000.

[5] Zhang W X, Chou G F. Uncertain Decision Based on Rough Sets[M]. Beijing: Tsinghua University Press, 2005.

[6] Miao D Q, Fan S D. The Calculation of Knowledge Granulation and Its Application[J]. Systems Engineering - Theory & Practice, 2002(1): 48-56.

[7] Cao X Y, Liang J G. The Method of Ascertaining Weight Based on Rough Sets Theory[J]. Chinese Journal of Management Science, 2002,10(5): 98-100.

[8] Zhou A F, Chen Z Y. How to Choose the SC Partner Based on Rough Set[J]. Logistics Technology, 2007, 26(8): 178-181.

[9] Wang G Y, Yu H, Yang D C. Decision Table Reduction based on Conditional Information Entropy[J]. Chinese Journal of Computers, 2002, 25(7):759-766.
