bicriteria variable selection in a fuzzy regression equation




PERGAMON — An International Journal: Computers & Mathematics with Applications

Computers and Mathematics with Applications 40 (2000) 877-883, www.elsevier.nl/locate/camwa

Bicriteria Variable Selection in a Fuzzy Regression Equation

HSIAO-FAN WANG AND RUEY-CHYN TSAUR
Department of Industrial Engineering, National Tsing Hua University

Hsinchu, Taiwan, 30043, R.O.C.

(Received and accepted March 1999)

Abstract—By considering two criteria of minimum total sum of vagueness and minimum total sum of squares in estimation, this article proposes a variable selection method for a fuzzy regression equation with crisp-input and fuzzy-output. A branch-and-bound algorithm is designed and "the least resistance principle" is adopted to determine the set of compromised solutions. Numerical examples are provided for illustration. © 2000 Elsevier Science Ltd. All rights reserved.

Keywords—Fuzzy regression, Tanaka's model, Bicriteria, Branch-and-bound algorithm, Crisp-input variables, Fuzzy-output variable.

1. INTRODUCTION

Statistical regression has been developed for many years and has been applied to many fields such as business administration, economics, and engineering. For such applications, a large number of collected data or a valid distribution form for the collected data is required. However, once the collected data are not sufficient, the relation between variables is imprecise, or the collected data are fuzzy, statistical regression is difficult to apply to such problems [1,2].

Fuzzy linear regression [2] provides a means for treating problems which fail to satisfy one or more of the above-mentioned characteristics. In fuzzy linear regression, two kinds of models have been proposed by Tanaka [2]. One is for crisp data with fuzzy relations, and the other is crisp-input and fuzzy-output linear regression. Both have been discussed for many years, and have been applied to many areas such as forecasting [3,4] and engineering [5-7]. Although many issues regarding fuzzy regression have been resolved [2,8-10], the problem of variable selection in a fuzzy regression equation has been neglected. In a fuzzy regression equation, it is common practice to consider more than one independent input variable. Therefore, an analyst often faces the dilemma of

(1) requiring a fuzzy regression equation to contain as many independent variables as possible, so that the "information content" in these factors can provide a better prediction; and

(2) cutting down the number of variables, so that the costs of data collection, model maintenance, and calculation time can be reduced.

The authors acknowledge the financial support of the National Science Council, R.O.C., under Project NSC 86-2213-E007-020.

0898-1221/00/$ - see front matter © 2000 Elsevier Science Ltd. All rights reserved. Typeset by AMS-TeX.
PII: S0898-1221(00)00203-0


For a fuzzy regression equation with crisp data, Wang and Tsaur [8] derived the relation that the total sum of squares in the fuzzy regression interval equals the total sum of squares in vagueness plus the total sum of squares in estimation, and defined a partial value of IC as a criterion for variable selection. However, once the collected data contain fuzzy output, because each datum is a set, the equality relation among the total sum of squares in a fuzzy regression interval, the total sum of squares in vagueness, and the total sum of squares in estimation cannot be exactly defined. Nevertheless, the concept that "the more the total sum of squares in vagueness and the total sum of squares in estimation are reduced, the better the interpretation ability of the variable" can serve as a rule of thumb for selecting an input variable into a fuzzy regression equation. Therefore, in this study, we shall adopt the minimum total sum of vagueness and the minimum total sum of squares in estimation as two criteria for selecting input variables in a crisp-input and fuzzy-output problem. With the developed procedure, it is expected that both the vagueness of the fuzzy regression equation and the error of estimation will be reduced.

In order to derive these two criteria, Section 2 first introduces the method proposed by Wang and Tsaur [8]. Then, in Section 3, after the fuzzy regression model in [8] with crisp-input and fuzzy-output is introduced, a selection procedure based on the branch-and-bound algorithm is proposed. Based on the two selection criteria, a set of compromised solutions is derived. An example is illustrated in Section 4. Finally, conclusions are drawn in Section 5.

2. A VARIABLE SELECTION METHOD FOR DATA TYPE OF CRISP-INPUT AND CRISP-OUTPUT

For a fuzzy linear regression problem with crisp input and crisp output, the regression equation can be presented as

$$\tilde{Y} = \tilde{A}_0 + \tilde{A}_1 X_1 + \cdots + \tilde{A}_j X_j + \cdots + \tilde{A}_N X_N = \tilde{A}X, \tag{1}$$

where $X = [X_0, X_1, \ldots, X_j, \ldots, X_N]'$ is a vector of input variables with $X_0 = 1$; $\tilde{A} = [\tilde{A}_0, \tilde{A}_1, \ldots, \tilde{A}_j, \ldots, \tilde{A}_N]$ is a vector of fuzzy coefficients presented in the form of symmetric triangular fuzzy numbers; and the $M$ collected input-output data are defined as $[Y_i, X_i] = [Y_i, X_{i0}, \ldots, X_{ij}, \ldots, X_{iN}]$, $i = 1, 2, \ldots, M$, $j = 0, 1, \ldots, N$, for $N$ independent variables. By considering a symmetric triangular membership function for each fuzzy coefficient $\tilde{A}_j$, Tanaka [2] proposed the following model to solve the fuzzy regression equation in (1), so that the total degree of fuzziness embedded in the relations is minimized:

$$\begin{aligned}
\min\ & \sum_{i=1}^{M}\sum_{j=0}^{N} c_j \left|X_{ij}\right| \\
\text{s.t.}\ & \sum_{j=0}^{N} \alpha_j X_{ij} + (1-h)\sum_{j=0}^{N} c_j \left|X_{ij}\right| \ge Y_i, \qquad \forall i = 1, 2, \ldots, M, \\
& -\sum_{j=0}^{N} \alpha_j X_{ij} + (1-h)\sum_{j=0}^{N} c_j \left|X_{ij}\right| \ge -Y_i, \qquad \forall i = 1, 2, \ldots, M, \\
& c_j \ge 0,\quad \alpha_j \in \mathbb{R},\quad X_{i0} = 1,\quad 0 \le h \le 1, \qquad \forall i = 1, 2, \ldots, M,\ j = 0, 1, \ldots, N.
\end{aligned} \tag{2}$$
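Model (2) is an ordinary linear program, so it can be sketched with an off-the-shelf LP solver. The following snippet is not from the paper: the function name `tanaka_crisp` and the data layout are illustrative choices, using `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_crisp(X, Y, h=0.0):
    """Solve Tanaka's LP (2) for crisp-input/crisp-output data.

    X: (M, N+1) design matrix with X[:, 0] == 1; Y: (M,) crisp outputs.
    Returns the center coefficients alpha and spreads c (both length N+1).
    """
    M, K = X.shape
    absX = np.abs(X)
    # Decision vector: [alpha_0, ..., alpha_N, c_0, ..., c_N].
    # Objective: min sum_i sum_j c_j |X_ij| (alpha has zero cost).
    cost = np.concatenate([np.zeros(K), absX.sum(axis=0)])
    # alpha@X_i + (1-h) c@|X_i| >= Y_i  and  -alpha@X_i + (1-h) c@|X_i| >= -Y_i,
    # negated into the A_ub x <= b_ub form that linprog expects.
    A_ub = np.vstack([np.hstack([-X, -(1 - h) * absX]),
                      np.hstack([ X, -(1 - h) * absX])])
    b_ub = np.concatenate([-Y, Y])
    bounds = [(None, None)] * K + [(0, None)] * K   # alpha free, c >= 0
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:K], res.x[K:]
```

With the spreads nonnegative and the centers free, the fitted interval $\alpha^{\mathsf T}X_i \pm c^{\mathsf T}|X_i|$ covers every observation at $h = 0$.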

This model provides a fuzzy interval constructed by each estimated fuzzy output $\tilde{Y}_i = (Y_i^L, Y_i^{h=1}, Y_i^U)$, $i = 1, 2, \ldots, M$, where $Y_i^L = (\alpha - c)^{\mathsf{T}} X_i$ is the lower bound of $\tilde{Y}_i$, $Y_i^{h=1} = \alpha^{\mathsf{T}} X_i$ is the center value of $\tilde{Y}_i$, and $Y_i^U = (\alpha + c)^{\mathsf{T}} X_i$ is the upper bound of $\tilde{Y}_i$, with $c = [c_0, c_1, \ldots, c_N]$, $\alpha = [\alpha_0, \alpha_1, \ldots, \alpha_N]$. By considering

$$\sum_{i=1}^{M}\left(Y_i - Y_i^L\right)^2 + \sum_{i=1}^{M}\left(Y_i^U - Y_i\right)^2 = \sum_{i=1}^{M}\left(Y_i^{h=1} - Y_i^L\right)^2 + \sum_{i=1}^{M}\left(Y_i^U - Y_i^{h=1}\right)^2 + 2\sum_{i=1}^{M}\left(Y_i^{h=1} - Y_i\right)^2, \tag{3}$$


we can define $\sum_{i=1}^{M}(Y_i - Y_i^L)^2 + \sum_{i=1}^{M}(Y_i^U - Y_i)^2$ as a Total Sum of Squares (SST), $\sum_{i=1}^{M}(Y_i^{h=1} - Y_i^L)^2 + \sum_{i=1}^{M}(Y_i^U - Y_i^{h=1})^2$ as a Regression Sum of Squares (SSR), and $2\sum_{i=1}^{M}(Y_i^{h=1} - Y_i)^2$ as an Error Sum of Squares (SSE) [8]. Then, SST = SSE + SSR.

To measure the interpretation ability of the input variables to a system, Wang and Tsaur [8] have defined an index IC = SSR/SST = 1 - (SSE/SST). Therefore, the index IC is similar to $R^2$ in statistics.
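The three sums of squares and the IC index are direct to compute once the interval bounds are available. A minimal sketch (the function name is our own, not the paper's):

```python
import numpy as np

def ic_index(Y, y_low, y_center, y_up):
    """Compute SST, SSR, SSE and the IC index from a fitted fuzzy interval.

    Y: observed crisp outputs; y_low/y_center/y_up: lower bound, center, and
    upper bound of the estimated fuzzy outputs (all length-M arrays).
    """
    sst = np.sum((Y - y_low) ** 2) + np.sum((y_up - Y) ** 2)       # total
    ssr = np.sum((y_center - y_low) ** 2) + np.sum((y_up - y_center) ** 2)
    sse = 2.0 * np.sum((y_center - Y) ** 2)                        # error
    return sst, ssr, sse, ssr / sst
```

For symmetric intervals (lower and upper bounds equidistant from the center), the identity SST = SSE + SSR in (3) holds exactly, which this decomposition makes easy to check numerically.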

However, if there are already $N$ input variables in the system, then to measure the impact of a new variable, denoted by $X_{N+1}$, on the output $\tilde{Y}$, a fuzzy linear regression equation can be presented by $\tilde{Y} = \tilde{A}X + \tilde{A}_{N+1}X_{N+1}$. Because $\mathrm{SSE}(X)$ measures the variation in $\tilde{Y}$ when $X$ is included in the model, and $\mathrm{SSE}(X, X_{N+1})$ measures the variation in $\tilde{Y}$ when all of $X$ and $X_{N+1}$ are included, the relative marginal change of the variation in $\tilde{Y}$ associated with $X_{N+1}$ when $X$ already exists in the model is

$$\frac{\mathrm{SSE}(X) - \mathrm{SSE}(X, X_{N+1})}{\mathrm{SSE}(X)} \equiv \frac{\mathrm{SSE}(X_{N+1} \mid X)}{\mathrm{SSE}(X)}.$$

This was defined as the coefficient of partial IC between $\tilde{Y}$ and $X_{N+1}$, given that $X$ is already in the model, and denoted by $\mathrm{IC}_{N+1 \mid 1,2,\ldots,N}$.

Applying the value of partial IC, a forward selection procedure for independent variables was derived. It proceeds by choosing the new input variable with the smallest SSE value, given the variable set already in the regression equation. Therefore, at each stage the model has the smallest estimated error, and the entering variable yields the largest marginal reduction in variation. The procedure continues until the partial IC values of all remaining input variables are negative. In order to reduce computer storage and computation time, the proposed search process does not consider all possible subsets of variables, and so does not guarantee optimality. However, it provides an acceptable subset by trading off computer cost against the information content of the fuzzy regression equation with crisp data.
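The forward procedure can be sketched as a loop over candidate variables. Here `sse_of` is a hypothetical callback (not part of the paper) that fits the fuzzy regression on a given index subset and returns its SSE; it stands in for solving model (2) on those columns:

```python
def forward_select(n_vars, sse_of):
    """Forward selection using the partial IC rule (a sketch).

    sse_of(subset): assumed to fit the model on the tuple of variable
    indices and return its SSE (> 0 for the intercept-only model).
    """
    selected, remaining = [], list(range(n_vars))
    sse_cur = sse_of(tuple(selected))        # SSE of the intercept-only model
    while remaining:
        # Candidate with the smallest SSE when added to the current subset.
        best = min(remaining, key=lambda j: sse_of(tuple(selected + [j])))
        sse_new = sse_of(tuple(selected + [best]))
        partial_ic = (sse_cur - sse_new) / sse_cur   # IC_{best | selected}
        if partial_ic <= 0:   # no remaining variable reduces SSE: stop
            break
        selected.append(best)
        remaining.remove(best)
        sse_cur = sse_new
    return selected
```

Each stage adds the variable with the largest marginal SSE reduction, stopping once no candidate has a positive partial IC, exactly as the procedure above describes.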

However, if the collected data are crisp-input and fuzzy-output, then the equality relation in (3) no longer holds. Following the concept that "the smaller the values of SSE and SSR, the smaller the value of SST", if adding a variable reduces SST by more, then the selected variable has better interpretability in a fuzzy regression equation. That is, when considering a new variable, we can use the two criteria of SSE and SSR to measure SST.

3. VARIABLE SELECTION FOR DATA TYPE OF CRISP-INPUT AND FUZZY-OUTPUT

3.1. Tanaka's Model

Tanaka [2] was the first to study problems with crisp-input and fuzzy-output. In his model, each collected fuzzy output datum $\tilde{Y}_i$, $i = 1, 2, \ldots, M$, is assumed to be a fuzzy number with a symmetric triangular membership function, denoted by $\tilde{Y}_i = (Y_i, e_i)$, where $Y_i$ is its center and $e_i$ is its spread, as

$$\mu_{\tilde{Y}_i}(y_i) = \begin{cases} 1 - \dfrac{\left|Y_i - y_i\right|}{e_i}, & Y_i - e_i \le y_i \le Y_i + e_i, \\ 0, & \text{otherwise}, \end{cases} \qquad i = 1, 2, \ldots, M. \tag{4}$$

The basic model is assumed to be a fuzzy linear regression equation

$$\tilde{Y} = \tilde{A}_0 + \tilde{A}_1 X_1 + \cdots + \tilde{A}_j X_j + \cdots + \tilde{A}_N X_N = \tilde{A}X, \tag{5}$$

with $\tilde{A}_j = (\alpha_j, c_j)$, where $\alpha_j$ is its central value and $c_j$ its spread, for $j = 0, 1, \ldots, N$; the membership function is then defined as in (6) below:

$$\mu_{\tilde{A}_j}(a_j) = \begin{cases} 1 - \dfrac{\left|\alpha_j - a_j\right|}{c_j}, & \alpha_j - c_j \le a_j \le \alpha_j + c_j, \\ 0, & \text{otherwise}, \end{cases} \qquad j = 0, 1, \ldots, N. \tag{6}$$
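Evaluating such a symmetric triangular membership function is a one-liner; the helper below is purely illustrative, not from the paper:

```python
def tri_membership(y, center, spread):
    """Symmetric triangular membership as in (4)/(6): the grade falls
    linearly from 1 at the center to 0 at center +/- spread (spread > 0)."""
    if abs(center - y) <= spread:
        return 1.0 - abs(center - y) / spread
    return 0.0
```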


By applying the extension principle [11], we have the following linear program (7):

$$\begin{aligned}
\min\ & \sum_{i=1}^{M}\sum_{j=0}^{N} c_j \left|X_{ij}\right| \\
\text{s.t.}\ & \sum_{j=0}^{N} \alpha_j X_{ij} + (1-h)\sum_{j=0}^{N} c_j \left|X_{ij}\right| \ge Y_i + (1-h)e_i, \qquad \forall i = 1, 2, \ldots, M, \\
& -\sum_{j=0}^{N} \alpha_j X_{ij} + (1-h)\sum_{j=0}^{N} c_j \left|X_{ij}\right| \ge -Y_i + (1-h)e_i, \qquad \forall i = 1, 2, \ldots, M, \\
& c_j \ge 0,\quad \alpha_j \in \mathbb{R},\quad X_{i0} = 1,\quad 0 \le h \le 1, \qquad \forall i = 1, 2, \ldots, M,\ j = 0, 1, \ldots, N.
\end{aligned} \tag{7}$$

Defining criterion 1 as minimizing the total vagueness $\sum_{i=1}^{M}\left[(Y_i^U - Y_i^{h=1}) + (Y_i^{h=1} - Y_i^L)\right]$, and criterion 2 as minimizing the total squared estimation error $\sum_{i=1}^{M}(Y_i^{h=1} - Y_i)^2$, the impact of the selected variables on the fuzzy regression equation can be obtained after solving model (7). Thereafter, both criteria can be applied to determine the effect of any new variable on the system. For simplicity, we shall denote the two criteria as crl and cr2, respectively, in the following sections.
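Model (7) differs from (2) only in its right-hand sides, so the same LP setup applies. The sketch below (illustrative names, assuming the objective sums over all observations) also evaluates crl and cr2 from the solution:

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_fuzzy_output(X, Y, e, h=0.0):
    """Solve model (7) for crisp inputs X and fuzzy outputs (Y_i, e_i), then
    evaluate cr1 (total vagueness) and cr2 (total squared deviation of the
    estimated center line from the data centers)."""
    M, K = X.shape
    absX = np.abs(X)
    # Decision vector: [alpha_0, ..., alpha_N, c_0, ..., c_N].
    cost = np.concatenate([np.zeros(K), absX.sum(axis=0)])
    # The two families of >= constraints in (7), negated to A_ub x <= b_ub.
    A_ub = np.vstack([np.hstack([-X, -(1 - h) * absX]),
                      np.hstack([ X, -(1 - h) * absX])])
    b_ub = np.concatenate([-(Y + (1 - h) * e), Y - (1 - h) * e])
    bounds = [(None, None)] * K + [(0, None)] * K
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    alpha, c = res.x[:K], res.x[K:]
    cr1 = float(absX.sum(axis=0) @ c)           # sum_i c^T |X_i|
    cr2 = float(np.sum((X @ alpha - Y) ** 2))   # sum_i (Y_i^{h=1} - Y_i)^2
    return alpha, c, cr1, cr2
```

At $h = 0$, the fitted interval must cover the whole support $[Y_i - e_i,\ Y_i + e_i]$ of each fuzzy datum, which gives a simple feasibility check on the solution.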

3.2. Compromised Solutions

Because two criteria are being considered, some selected variables may be nondominated. When such a case occurs, "the least resistance principle" suggested by Zeleny [12] is adopted, so that the most satisfactory solution is the one closest to the ideal solution.

DEFINITION 1 [13]. An ideal solution is a solution that optimizes all individual criteria simultaneously.

The first entering variable is chosen by solving model (7) $N$ times for the $N$ variables, from which $cr_k(X_j)$ is obtained as the value of the $k$th criterion for the $j$th variable. Defining $cr_k(X_m)$ as the optimal value of criterion $k$, $cr_k(X_m) = \min_j cr_k(X_j)$, the degree of distance from $cr_k(X_j)$ to $cr_k(X_m)$ can be defined by $d(cr_k(X_j), cr_k(X_m)) \equiv d_j^k = cr_k(X_m)/cr_k(X_j)$. If $cr_k(X_j) = cr_k(X_m)$, then $d_j^k = 1$; otherwise, $0 < d_j^k < 1$, where $k = 1, 2$. Then, to consider the overall distance over the two criteria, we can use a family of distance membership functions defined by

$$L_p^j(\lambda) = \left[\sum_{k=1}^{2} \lambda_k^p \left(1 - d_j^k\right)^p\right]^{1/p},$$

where $\lambda$ is a vector of the importance levels of the criteria and $1 \le p \le \infty$. Then, $L_p^j(\lambda)$ evaluates the distance of every selected variable to the ideal solution with respect to both criteria. By setting different values of $p$, the corresponding membership functions of distance are defined as follows. If $p = 1$,

$$L_1^j(\lambda) = 1 - \sum_{k=1}^{2} \lambda_k d_j^k, \qquad \text{with } \sum_{k=1}^{2} \lambda_k = 1. \tag{8}$$

If $p = 2$,

$$L_2^j(\lambda) = \left[\sum_{k=1}^{2} \lambda_k^2 \left(1 - d_j^k\right)^2\right]^{1/2}. \tag{9}$$

If $p = \infty$,

$$L_\infty^j(\lambda) = \max_{k=1,2}\ \lambda_k \left(1 - d_j^k\right). \tag{10}$$

Therefore, when some nondominated variables exist, we can find the variable that is closest to the ideal solution based on such distance measures.
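The distances (8)-(10) can be computed for a set of candidates at once. The helper below is an illustrative sketch (our own naming, not the authors' code), vectorized over candidates:

```python
import numpy as np

def least_resistance(cr1, cr2, lam=(0.5, 0.5)):
    """Rank candidates by distance to the ideal solution.

    cr1, cr2: criterion values per candidate (smaller is better).
    Returns (d, L1, L2, Linf): the d_j^k ratios and the three distances.
    """
    cr = np.vstack([cr1, cr2]).astype(float)            # shape (2, n)
    d = cr.min(axis=1, keepdims=True) / cr              # d_j^k = cr_k(X_m)/cr_k(X_j)
    lam = np.asarray(lam, dtype=float).reshape(2, 1)
    L1 = 1.0 - (lam * d).sum(axis=0)                    # p = 1 (sum(lam) = 1)
    L2 = np.sqrt(((lam * (1.0 - d)) ** 2).sum(axis=0))  # p = 2
    Linf = (lam * (1.0 - d)).max(axis=0)                # p = infinity
    return d, L1, L2, Linf
```

Fed with the Stage I values of X1 and X5 from Table 2 and equal weights, all three distances rank X5 closest to the ideal solution, matching the selection reported in Table 3.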


3.3. Variable Selection Method

For a fuzzy regression equation with crisp-input variables X1, X2, ..., XN and fuzzy-output variable Y, the concept of breadth-first search in the branch-and-bound algorithm is adopted to branch on all of the possible variable combinations and to bound them by the nondominated variables whose values in criterion 1 and criterion 2 are better than the others'. The branch-and-bound algorithm is described as follows.

(1) Branch: Starting at the first iteration, branch on all individual variables and evaluate the values of crl(Xj) and cr2(Xj) for each variable j = 1, 2, ..., N.

(2) Bound: Bound the variables by the optimal values of the two criteria, crl(Xr) and cr2(Xr), respectively, where r ∈ {j | j = 1, ..., N}, and define the variable subset in iteration ω as I(ω), with I(1) = {Xr} when ω = 1.

(3) Fathom: Any variable whose crl(Xj) and cr2(Xj) are both larger than the values crl(Xr) and cr2(Xr), r ∈ {j | j = 1, ..., N}, is fathomed. This is denoted F(1). However, if there exist some variables that are nondominated, they should be fathomed by "the least resistance principle". This is denoted F(2).

(4) Optimality test: First, the two criteria crl(X1, X2, ..., XN) and cr2(X1, X2, ..., XN) of the total variable set are calculated. Second, if the DM considers that the value t, which represents the degree of difference in the two criteria between the existing variable subset and the total variable set, satisfies his required value s, as

$$t = 1 - \max\left(\frac{cr1(\text{total set})}{cr1(\text{existing set})},\ \frac{cr2(\text{total set})}{cr2(\text{existing set})}\right) \le s,$$

then the process stops. Otherwise, go to Step 5.

(5) Continue: Continue the branch-and-bound algorithm by entering new variables and setting ω = ω + 1. Then, I(ω) = I(ω − 1) ∪ {Xj}, where Xj ∈ {X1, ..., XN} \ I(ω − 1). Go to Step 1.
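Steps (1)-(5) can be sketched as a single loop. Here `criteria_of` is a hypothetical callback (not from the paper) that solves model (7) on a given variable subset and returns (cr1, cr2); both criteria are assumed positive and nonincreasing as variables are added:

```python
def bicriteria_select(n_vars, criteria_of, s=0.3, lam=(0.5, 0.5)):
    """Breadth-first bicriteria selection: a sketch of Steps (1)-(5).

    criteria_of(subset): assumed to solve model (7) on a frozenset of
    variable indices and return (cr1, cr2); s is the DM's stopping level.
    """
    full = frozenset(range(n_vars))
    cr1_full, cr2_full = criteria_of(full)
    selected = frozenset()
    while selected != full:
        # Branch: extend the current subset by each unused variable.
        cands = [selected | {j} for j in full - selected]
        vals = [criteria_of(c) for c in cands]
        # Fathom F(1): drop candidates dominated in both criteria.
        nondom = [(c, v) for c, v in zip(cands, vals)
                  if not any(u[0] <= v[0] and u[1] <= v[1] and u != v
                             for u in vals)]
        # Fathom F(2): break ties among nondominated candidates by the
        # L1 least-resistance distance to the ideal point (best1, best2).
        best1 = min(v[0] for v in vals)
        best2 = min(v[1] for v in vals)
        selected, (cr1, cr2) = min(
            nondom, key=lambda cv: 1.0 - lam[0] * best1 / cv[1][0]
                                       - lam[1] * best2 / cv[1][1])
        # Optimality test: stop once the subset is close enough to the full set.
        t = 1.0 - max(cr1_full / cr1, cr2_full / cr2)
        if t <= s:
            break
    return selected
```

Each pass grows the incumbent subset by one variable, so the loop runs at most N times, examining O(N^2) subsets instead of all 2^N combinations.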

4. AN ILLUSTRATED EXAMPLE

For illustration, an example adopted from [2] is presented in Table 1. In addition, we set the stopping value s = 0.3 and λ1 = λ2 = 0.5 for equal importance of the two criteria. Based

Table 1. Input-output data concerning house price.

No.   Yi = (Yi, ei)    X0   X1    X2      X3     X4   X5
 1    (6060, 500)      1    1    38.09   36.43   5    1
 2    (7100, 50)       1    1    61.10   26.50   6    1
 3    (8080, 400)      1    1    63.76   44.71   7    1
 4    (8260, 150)      1    1    74.52   38.09   8    1
 5    (8650, 750)      1    1    75.38   41.40   7    2
 6    (8520, 450)      1    2    52.99   26.49   4    2
 7    (9170, 700)      1    2    62.93   26.49   5    2
 8    (10310, 200)     1    2    72.04   33.12   6    3
 9    (10920, 600)     1    2    76.12   43.06   7    2
10    (12030, 100)     1    2    90.26   42.64   7    2
11    (13940, 350)     1    3    85.70   31.33   6    3
12    (14200, 250)     1    3    95.27   27.64   6    3
13    (16010, 300)     1    3   105.98   27.64   6    3
14    (16320, 500)     1    3    79.25   66.81   6    3
15    (16990, 650)     1    3   120.50   32.25   6    3


Table 2. The result of variable selection analysis.

Stage I
             X1               X2               X3             X4             X5
crl       57575            62039.7          90975          84475          41655
cr2       0.76243 x 10^8   1.2738 x 10^8    3.67 x 10^8    4.06 x 10^8    0.959 x 10^8

Stage II
             X5X1             X5X2             X5X3           X5X4
crl       37425            37942.5          41404.4        41112
cr2       0.60933 x 10^8   0.86914 x 10^8   0.9067 x 10^8  0.924 x 10^8

Stage III
             X5X1X2           X5X1X3           X5X1X4
crl       26664.98         32137.33         28620
cr2       0.21766 x 10^8   0.42641 x 10^8   0.32956 x 10^8

Stage IV
             X5X1X2X3         X5X1X2X4
crl       19361.4          25276.81
cr2       0.12533 x 10^8   0.27611 x 10^8

Stage V
             X5X1X2X3X4
crl       19361.4
cr2       0.12533 x 10^8

Table 3. Analysis of a compromised solution.

            X1               X5
crl      57575            41655
cr2      0.76243 x 10^8   0.959 x 10^8
d^1      0.7235           1
d^2      1                0.795
L1       0.13825          0.10325
L2       0.13827          0.1027
L_inf    0.13825          0.10325

on the above branch-and-bound algorithm, we obtain the results shown in Table 2. In Stage I, since X1 and X5 are nondominated variables, we have applied "the least resistance principle" described in Section 3.2 and shown the result in Table 3.

From Table 3, we found that, based on the distance measures, X5 should be selected as the entering variable. Continuing the branch-and-bound method, in Stage III we obtain the subset X5X1X2 as a stopping subset, because t = 0.274 < s = 0.3. If the DM would like to continue the process, then crl(X5X1X2X3) = crl(X5X1X2X3X4) and cr2(X5X1X2X3) = cr2(X5X1X2X3X4), and thus the optimal subset of variable selection is X1X2X3X5. The branching procedure can also be viewed in Figure 1.

5. CONCLUSION

In this study, we proposed a variable selection method based on two criteria. By applying the concept of breadth-first search from the branch-and-bound algorithm, an efficient subset of variables, balancing both cost and information, has been constructed, with a numerical illustration.


[Figure content not recoverable from the scan: a branch-and-bound tree annotated with crl/cr2 values and fathomed nodes F(1) and F(2).]

Figure 1. The proposed variable selection procedure.

Further studies on a backward selection method or a step-by-step selection method can be considered for comparison. Currently, we are applying the proposed method to real-world systems.

REFERENCES
1. K.J. Kim, H. Moskowitz and M. Koksalan, Fuzzy versus statistical linear regression, European Journal of Operational Research 92, 417-434, (1996).
2. H. Tanaka, S. Uejima and K. Asai, Linear regression analysis with fuzzy model, IEEE Trans. S.M.C. 12, 903-907, (1982).
3. B. Heshmaty and A. Kandel, Fuzzy linear regression and its applications to forecasting in uncertain environment, Fuzzy Sets and Systems 15, 159-191, (1985).
4. M. Sakawa and H. Yano, Fuzzy linear regression and its application to sales forecasting, Policy and Information 13 (2), 111-125, (1989).
5. A. Bardossy, I. Bogardi and L. Duckstein, Fuzzy regression in hydrology, Water Resources Res. 26 (7), 1497-1508, (1990).
6. P.T. Chang, S.A. Konz and E.S. Lee, Applying fuzzy linear regression to VDT legibility, Fuzzy Sets and Systems 80, 197-204, (1996).
7. Y.J. Lai and S.I. Chang, A fuzzy approach for multiresponse optimization: An off-line quality engineering problem, Fuzzy Sets and Systems 63, 117-129, (1994).
8. H.F. Wang and R.C. Tsaur, Insight of a fuzzy regression model, Fuzzy Sets and Systems 112, 355-369, (2000).
9. H.F. Wang and R.C. Tsaur, Forecasting in fuzzy systems, Soft Computing, (submitted).
10. H.F. Wang and R.C. Tsaur, Resolution of fuzzy regression model, EJOR (to appear).
11. L.A. Zadeh, The concept of a linguistic variable and its application to approximate reasoning I, Information Sci. 8, 199-249, (1975).
12. M. Zeleny, Multiple Criteria Decision Making, McGraw-Hill, (1982).
13. J.L. Cohon, Multiobjective Programming and Planning, Academic Press, New York, (1978).
14. H.F. Wang and R.C. Tsaur, Solving fuzzy regression by fuzzy goal programming, Computers O.R., (submitted).