

Knowledge-Based Systems 64 (2014) 1–12


Orthogonal simplified swarm optimization for the series–parallel redundancy allocation problem with a mix of components

http://dx.doi.org/10.1016/j.knosys.2014.03.011
0950-7051/© 2014 Elsevier B.V. All rights reserved.

⇑ Tel.: +886 3 574 2443; fax: +886 3 572 2204. E-mail address: [email protected]

Wei-Chang Yeh ⇑
Integration & Collaboration Laboratory, Department of Industrial Engineering and Engineering Management, National Tsing Hua University, Taiwan


Article history:
Received 16 May 2013
Received in revised form 25 February 2014
Accepted 16 March 2014
Available online 28 March 2014

Keywords:
Reliability
Series–parallel system
Redundancy allocation problem (RAP)
Simplified swarm optimization (SSO)
Orthogonal array test (OA)

This work presents a novel orthogonal simplified swarm optimization scheme (OSSO) that combines repetitive orthogonal array testing (ROA), re-initialize population (RIP), and SSO for solving intractable large-scale engineering problems. This scheme is applied to the series–parallel redundancy allocation problem (RAP) with a mix of components. RAP involves setting reliability objectives for components or subsystems in order to meet the resource consumption constraint, e.g., the total cost. RAP has been an active area of research for the past four decades. The difficulties confronted by RAP are to maintain feasibility with respect to three nonlinear constraints, namely, cost-, weight-, and volume-related constraints. As evidence of the utility of the proposed approach, we present extensive computational results on random test problems. The computational results compare favorably with previously developed algorithms in the literature. The results in this paper show that the proposed OSSO can perform excellently in a limited computation time.

© 2014 Elsevier B.V. All rights reserved.

1. Introduction

The RAP aims to determine a system structure that increases reliability at minimum manufacturing cost, either by exchanging components (subsystems) for more reliable ones and/or by using redundant components in parallel [2,3,4,6,7,10,11,12,13,14,15,16,17,18,19,23].

During the past thirty years, RAP has attracted considerable attention due to its wide and valuable applications in improving the reliability of various engineering systems during the design phase. RAP is perhaps the most common problem in design-for-reliability; thus, RAP is increasingly becoming an important tool in the initial stages of, or prior to, the planning, design, and control of real-world systems [12,13], spanning engineering, industrial, and scientific applications. In the field of engineering applications, RAP can be applied to electrical engineering, hydraulic engineering, structural engineering, aeronautical engineering, and robotics and control. For scientific applications, RAP has been studied in the context of chemistry, physics, medicine, and computer science.

A general series–parallel RAP (e.g., Fig. 1) can be formulated as the following integer nonlinear programming problem [7]:

Maximize R(X)    (1)

subject to C(X) ≤ CUB    (2)

W(X) ≤ WUB.    (3)

The objective function in Eq. (1) maximizes R(X) and indicates that the items for redundancy should be used in parallel in one combination to satisfy the function at the system level. Moreover, if an item is used, all its sibling items should be used or its function should be satisfied by the corresponding child items. Eqs. (2) and (3) indicate that the total cost, C(X), and weight, W(X), must be less than or equal to the predefined allowable amounts, CUB and WUB, respectively. RAP with a mix of components allows a subsystem to be duplicated with different sets of components. The above series–parallel RAP is explained further via the following example:

Example 1. X = (33300000, 11000000, 33300000, 33330000, 22200000, 22000000, 11100000, 11110000, 12000000, 12200000, 33000000, 11110000, 11000000, 34000000) is a solution representing 14 subsystems (i.e., Nsub = 14). Each subsystem includes at least one redundancy (i.e., Li = 1 for i = 1, 2, ..., 14) and at most eight redundancies (i.e., Ui = 8 for i = 1, 2, ..., 14), i.e., 112 variables in total (Nvar = 112), as shown in Fig. 2.

Table 1 yields the following:

R(X) = [1 − (1 − r1,3)^3] ⋯ [1 − (1 − r14,3)(1 − r14,4)] = 0.982853,    (4)
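Eq. (4) can be reproduced directly from the Table 1 data. The following is a reader's sketch (not the author's code) of the series–parallel reliability computation for the Example 1 solution:

```python
# r[i][j] = reliability of the (j+1)th component type in subsystem i+1,
# transcribed from Table 1.
r = [
    [.90, .93, .91, .95], [.95, .94, .93], [.85, .90, .87, .92],
    [.83, .87, .85], [.94, .93, .95], [.99, .98, .97, .96],
    [.91, .92, .94], [.81, .90, .91], [.97, .99, .96, .91],
    [.83, .85, .90], [.94, .95, .96], [.79, .82, .85, .90],
    [.98, .99, .97], [.90, .92, .95, .99],
]

# X from Example 1: each string lists the chosen component type per
# redundancy slot of a subsystem (0 = unused slot, up to Ui = 8 slots).
X = ["33300000", "11000000", "33300000", "33330000", "22200000",
     "22000000", "11100000", "11110000", "12000000", "12200000",
     "33000000", "11110000", "11000000", "34000000"]

def reliability(X, r):
    """R(X) = product over subsystems of [1 - prod over used slots (1 - r)]."""
    total = 1.0
    for i, subsystem in enumerate(X):
        fail = 1.0  # probability that every redundant component in subsystem i fails
        for slot in subsystem:
            if slot != "0":
                fail *= 1.0 - r[i][int(slot) - 1]
        total *= 1.0 - fail
    return total

print(round(reliability(X, r), 6))  # → 0.982853
```

The product structure mirrors Eq. (4): each bracketed factor is one subsystem's parallel reliability, and the factors multiply because the subsystems are in series.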


Nomenclature

Acronyms
ISC    the improved surrogate constraint method
OA     orthogonal array testing
PSO    particle swarm optimization
RAP    redundancy allocation problem
ROA    repetitive OA
SSO    simplified swarm optimization, previously called discrete PSO (DPSO)
OSSO   the proposed orthogonal SSO

Notations
Nvar   the number of variables in SSO/OSSO
Npop   the number of solutions in SSO/OSSO in each generation
Ngen   the total number of independent generations in SSO/OSSO
q      a random number uniformly distributed in [0,1]
pBest, gBest   pBest is the best solution a specified solution has achieved so far; gBest is the best of all solutions so far
cw, cp, cg, cr   the probabilities that the new variable value is generated from the current solution, pBest, gBest, and a random number in SSO, respectively, where cw + cp + cg + cr = 1
Cw, Cp, Cg   Cw = cw, Cp = Cw + cp, and Cg = Cp + cg. Note that Cg = 1 − cr
Nsub   the total number of subsystems
Nroa   the minimal number of times the proposed ROA is implemented for each solution in each generation
rij, cij, wij   the reliability, cost, and weight of the jth type of component in the ith subsystem, respectively. Note that rij ∈ (0,1) for all i, j
Li, Ui   the lower and upper bounds of the redundancy number of the ith subsystem, respectively. Note that Σ_{i=1..Nsub} Ui = Nvar and U0 = 0
R(X), C(X), W(X)   the total reliability R(X) = Π_{i=1..Nsub} [1 − Π_{j=1..Ui} (1 − ri,xk)]; the total cost C(X) = Σ_{k=1..Nvar} C(xk) = Σ_{k=1..Nvar} ci,xk; and the total weight W(X) = Σ_{k=1..Nvar} W(xk) = Σ_{k=1..Nvar} wi,xk, where Σ_{j=0..i−1} Uj < k ≤ Σ_{j=0..i} Uj
CUB, WUB   the upper bounds of cost and weight, respectively
xt,i,j = xtij   the value of the jth variable in the ith solution at the tth generation, where i = 1, 2, ..., Npop, j = 1, 2, ..., Nvar, and t = 1, 2, ..., Ngen; xtij denotes that the xtij-th type of component is used in the hth subsystem, where U0 = 0 and j is an integer such that Σ_{k=0..h−1} Uk < j ≤ Σ_{k=0..h} Uk
Xti    Xti = (xti1, xti2, ..., xti,Nvar) is the ith solution at the tth generation, where i = 1, 2, ..., Npop and t = 1, 2, ..., Ngen
Pi     Pi = (pi1, pi2, ..., pi,Nvar) is the current pBest w.r.t. the ith solution, where i = 1, 2, ..., Npop
G      G = (g1, g2, ..., gNvar) is the current gBest
F(·)   the fitness function value of solution ·
La(b^c)   the general symbol for the b-level standard OAs, where b is the number of levels for each factor (variable), a = b^⌈logb(c+1)⌉ is the number of rows (experimental runs), and c is the number of columns (factors, variables) [5]. Note that ⌈·⌉ is the smallest integer greater than or equal to its argument, e.g., ⌈5.2⌉ = 6
aij    the value in the ith row and the jth column of La(b^c)
Rmax, Ravg, Rmin   the obtained maximal, average, and minimal reliability, respectively
Tmax, Tavg, Tmin   the average times required to obtain Rmax, Ravg, and Rmin, respectively
tmax, tavg, tmin   the average fitness-evaluation numbers required to obtain Rmax, Ravg, and Rmin, respectively
Rstd, Tstd, tstd   the standard deviations of the reliability, CPU seconds, and generation numbers, respectively


C(X) = 3c1,3 + 2c2,1 + ⋯ + (c14,3 + c14,4) = 196,    (5)

W(X) = 3w1,3 + 2w2,1 + ⋯ + (w14,3 + w14,4) = 119.    (6)

RAP is a well-known NP-hard problem whose computational effort grows exponentially with the number of nodes and links in the system [2]. Onishi et al. proposed the improved surrogate constraint method (ISC) and reported that ISC can efficiently and effectively obtain exact optimal solutions to Fyffe's RAP [17]. ISC may be the best-known method for solving the RAP. However, ISC is a mathematically based algorithm, and there is no evidence that ISC can easily solve all larger RAPs without high computational complexity.

Table 1
Data for Fyffe's RAP [4].

      rij                       cij               wij
i     j=1   2     3     4       1   2   3   4     1   2   3   4
1     .90   .93   .91   .95     1   1   2   2     3   4   2   5
2     .95   .94   .93           2   1   1         8   10  9
3     .85   .90   .87   .92     2   3   1   4     7   5   6   4
4     .83   .87   .85           3   4   5         5   6   4
5     .94   .93   .95           2   2   3         4   3   5
6     .99   .98   .97   .96     3   3   2   2     5   4   5   4
7     .91   .92   .94           4   4   5         7   8   9
8     .81   .90   .91           3   5   6         4   7   6
9     .97   .99   .96   .91     2   3   4   3     8   9   7   8
10    .83   .85   .90           4   4   5         6   5   6
11    .94   .95   .96           3   4   5         5   6   6
12    .79   .82   .85   .90     2   3   4   5     4   5   6   7
13    .98   .99   .97           2   3   2         5   5   6
14    .90   .92   .95   .99     4   4   5   6     6   7   6   9

Hence, the main focus has been on developing approximation methods, such as the Linear Programming Approach [7], Tabu Search [11], the Genetic Algorithm [3], Ant Colony Optimization [15], and the Variable Neighborhood Search Algorithm [14], to solve the RAP while avoiding numerical difficulties and reducing computational burdens. In this paper, a novel algorithm, OSSO, is proposed to solve larger RAPs, which are not easy problems for the traditional ISC.

This paper is organized as follows. An overview of SSO is given in Section 2. The OA, conditional OA, and bounded ROA are proposed in Section 3. The overall scheme of the proposed OSSO is given in Section 4. Section 5 illustrates the solution of the RAP using the proposed OSSO via three experiments. Finally, concluding remarks and future work are summarized in Section 6.


Fig. 1. An example of a series–parallel RAP, where υ = Σ_{k=1..Nsub−1} Uk.

Fig. 2. The structure corresponding to X.

Table 2
The example for the update procedure in SSO.

i   x3,2,i   p2,i   gi   qi     x4,2,i   Remark
1   3        4      2    .991   1a       Cg < q1
2   3        2      4    .875   4        Cp < q2 < Cg
3   3        3      2    .428   3        Cw < q3 < Cp
4   0        2      1    .195   1        Cp < q4 < Cg
5   0        0      1    .698   1        Cp < q5 < Cg
6   0        1      0    .875   0        Cp < q6 < Cg
7   0        1      2    .428   1        Cw < q7 < Cp
8   0        1      1    .195   0        q8 < Cw

a A random integer generated in [0,4].


2. Overview of simplified swarm optimization (SSO)

SSO is an emerging population-based stochastic optimization method [19,20,21,22]. It belongs to the category of Swarm Intelligence methods; it is also an evolutionary computation method. Because the proposed improved OSSO is based on the SSO proposed by Yeh in [19], it is also considered a population approach. Both SSO and the improved OSSO are easy and flexible in terms of coding and update operators. Before discussing the proposed OSSO, SSO is formally introduced in this section.

The advantages of SSO are simplicity, efficiency, and flexibility [19,20,21,22]. In SSO, a solution is encoded as a finite-length string. Each solution has a fitness value, which is determined by the fitness function to be optimized. Like most soft computing techniques, SSO is initialized with a population of random solutions inside the problem space and then searches for optimal solutions by updating generations.

Analogous to PSO [8,9,24], in SSO each solution moves toward its best previous solution (i.e., pBest) and toward the best solution in the entire swarm (i.e., gBest) during each generation. Both pBest and gBest are adopted directly from PSO. However, a random movement is added to SSO to maintain population diversity and enhance the capacity to escape from a local optimum [19,20,21,22]. Thus, each solution is a compromise among the current solution, the pBest, the gBest, and a random movement.

SSO combines local search and global search to yield high search efficiency. In the whole swarm in every generation, Xt,i is updated to Xt+1,i with the following simple mathematical equation after cw, cp, and cg are given:

xt+1,i,j =
    xt,i,j       if q ∈ [0, Cw)
    pi,j         if q ∈ [Cw, Cp)
    pgBest,j     if q ∈ [Cp, Cg)
    x            if q ∈ [Cg, 1)    (7)

Notably, SSO emerged as a generalization of PSO. The update mechanism is the major difference between PSO and SSO. The update mechanism of SSO is much simpler than those of other major soft computing techniques, such as PSO (which must calculate both the velocity and position functions), the Genetic Algorithm (which requires genetic operations such as crossover and mutation), the Estimation of Distribution Algorithm (which has difficulty building an appropriate probability model [1]), and the Immune System Algorithm (which does not consider the interaction of variables [1]). Using the simple approaches above, the diversity of a population can be efficiently maintained. Additionally, SSO has proven effective in exploring large and complex spaces in many optimization problems [19,20,21,22]. The detailed steps of SSO are described in the following:

PROCEDURE SSO

- STEP S0. Generate X0i, calculate F(X0i), let X0i = Pi and t = 1, where i = 1, 2, ..., Npop.
- STEP S1. Let i = 1.
- STEP S2. Update Xt−1,i to Xti based on Eq. (7) and calculate F(Xti).
- STEP S3. If F(Xti) is better than F(Pi), then let Pi = Xti. Otherwise, go to STEP S5.
- STEP S4. If F(Pi) is better than F(PgBest), then let gBest = i.
- STEP S5. If i < Npop, let i = i + 1 and go to STEP S2.
- STEP S6. If t = Ngen and/or the CPU time limit is reached, then halt; otherwise let t = t + 1 and go back to STEP S1.
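The single-variable update of Eq. (7) can be sketched as follows. This is an illustrative reading, not the paper's implementation; the bounds `lo`/`up` used for the random case are assumptions:

```python
import random

def sso_update(x, p, g, q, Cw, Cp, Cg, lo=0, up=8, rng=random):
    """One-variable SSO update per Eq. (7).

    x: current value x_{t,i,j}; p: pBest value p_{i,j}; g: gBest value
    p_{gBest,j}. q is a uniform random number in [0,1), and the
    thresholds satisfy 0 < Cw < Cp < Cg < 1 (with Cg = 1 - cr).
    """
    if q < Cw:          # keep the current value
        return x
    if q < Cp:          # copy from this solution's pBest
        return p
    if q < Cg:          # copy from the swarm's gBest
        return g
    return rng.randint(lo, up)  # random movement, for diversity

# Row i = 2 of Table 2: q = .875 falls in [Cp, Cg), so gBest's value is taken.
print(sso_update(x=3, p=2, g=4, q=.875, Cw=.2, Cp=.5, Cg=.99))  # → 4
```

Applying the same call per variable of a solution reproduces the STEP S2 update of the procedure above.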

The above update procedure in SSO is explained in the followingexample.

Example 2. Let Cw = .2, Cp = .5, Cg = .99, and X3,2 = X as given in Example 1. The first eight elements (i.e., all components in the first subsystem) of X4,2, obtained from X3,2 using the above update mechanism of SSO based on the related elements of P2, PgBest, and qi, are listed in Table 2.

3. Conditional bounded ROA

This work proposes ROA as a local search algorithm to improve solutions. A local search algorithm starts from a candidate solution and then iteratively moves to a better neighbor solution [5,14,20]. This ROA is an extension of OA, which is a systematic and statistical testing method [5,20].

3.1. OA

The purpose of using OA is to systematically and efficiently produce a potentially good approximation [5,20]. Rather than exploring all possible combinations of assignments, OA can prune some solutions based on the orthogonal array during a search. An orthogonal array is an array of numbers arranged in rows (tests/combinations) and columns (factors/variables) such that each column is statistically independent of the other columns, i.e., orthogonal. The OA is a special statistical design of experiments (DOE) that studies the effects of several factors simultaneously to efficiently determine the best combination of factor levels to use in design problems. DOE is the process of planning and performing experiments to efficiently collect and analyze appropriate and necessary technical information [5,20].

Let La(b^c) be the a-test, b-level, c-factor standard OA, where a = b^⌈logb(c+1)⌉ [5,20] and ⌈·⌉ is the smallest integer greater than or equal to its argument. The current value of the selected variable may be increased by one, decreased by one, or left unchanged to achieve a better result. Hence, b is set to 3 in this study. If the value of c is increased, then so is the value of a, i.e., more tests are needed. Thus, the class of three-level OA L9(3^4) is used in this work. Table 3 presents an example of L9(3^4). In L9(3^4), the numbers 1, 2, and 3 in each column indicate the levels of the factors, and each column contains an equal number of 1s, 2s, and 3s. The array is orthogonal because the nine pairs (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), and (3,3) appear the same number of times in any two columns [5,20]. Algorithms for constructing OAs with various levels are found in [5,20].

Table 3
L9(3^4) OA (9 tests, 4 factors, 3 levels).

Test   Factor
       1   2   3   4
1      1   1   1   1
2      1   2   2   2
3      1   3   3   3
4      2   1   2   3
5      2   2   3   1
6      2   3   1   2
7      3   1   3   2
8      3   2   1   3
9      3   3   2   1
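The orthogonality property described above can be checked mechanically. A small sketch (a reader's illustration, not from the paper):

```python
from itertools import combinations, product

# The L9(3^4) array from Table 3 (9 runs, 4 three-level factors).
L9 = [
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
]

def is_orthogonal(oa, levels=3):
    """OA property: in every pair of columns, each of the levels**2
    ordered level pairs occurs the same number of times."""
    n_cols = len(oa[0])
    for c1, c2 in combinations(range(n_cols), 2):
        counts = {pair: 0 for pair in product(range(1, levels + 1), repeat=2)}
        for row in oa:
            counts[(row[c1], row[c2])] += 1
        if len(set(counts.values())) != 1:  # here: 9 runs, 9 pairs, once each
            return False
    return True

print(is_orthogonal(L9))  # → True
```

Changing any single entry of `L9` breaks the balanced pair counts, which is what makes the array usable for the main-effect estimates in Section 3.4.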

3.2. Conditional ROA

SSO performs satisfactory global searches but may take a long time to converge to an optimal solution. ROA is a useful local search tool, but it may become trapped in a local optimum and must be applied heuristically, which requires extra runtime. Therefore, after the initial population procedure, ROA is implemented for each updated solution, to achieve a trade-off between exploration and exploitation, when

q < ggBest / (3 · GEN),    (8)

where q is a uniform random number in the range [0,1] and ggBest is the number of the generation in which the current gBest was found. This condition was obtained by trial and error.

3.3. Bounded ROA

Each solution requires 10 additional fitness evaluations whenever the OA is implemented once. Only four variables change during these 10 additional fitness evaluations. The OA is repeated (ROA) if the current fitness value improves. The proposed ROA increases the quality of the obtained solution but requires extra execution time. To increase the effectiveness of ROA, the bounded-limit ROA is implemented with the four selected variables as follows:

1. Lower-bound limit: If C(X) > CUB, W(X) > WUB, or Eq. (8) is satisfied, only non-zero variables can be selected in the OA.

2. Upper-bound limit: If all three conditions listed above are unsatisfied, then any variable already equal to its upper bound cannot be selected in the OA.

3.4. The overall procedure of ROA

The proposed conditional bounded ROA integrates the traditional OA [5,20] (STEPs R2–R5) with a novel design (STEP R6) to create new values for the selected bounded variables (STEP R1) within the conditions in Section 3.2 (STEP R0), as given below.

PROCEDURE ROA

- STEP R0. If Eq. (8) is not satisfied, then halt. Otherwise, construct L9(3^4).
- STEP R1. If C(X) > CUB or W(X) > WUB is satisfied, let A = {four distinct non-zero variables selected randomly}. Otherwise, let A = {four distinct variables selected randomly, each with a value less than its upper bound}.
- STEP R2. Let h = 1, X* = X, and F* = F(X).
- STEP R3. Update X to X(h) = (xh1, xh2, ..., xhn) and calculate F(X(h)) as follows:

xhj =
    1          if ahi = 2 and xhj = 0
    xhj − 1    if ahi = 2 and xhj > 0
    xhj        if ahi = 1
    xhj + 1    if ahi = 3 and xhj < uj
    0          if ahi = 3 and xhj = uj    (9)

for all j ∈ A.
- STEP R4. If F(X*) < F(X(h)), let X* = X(h) and F(X*) = F(X(h)).
- STEP R5. Let h = h + 1. If h < 10, h = 10, or h = 11, go to STEPs R3, R6, and R7, respectively.
- STEP R6. Let a10,i = j if Sij = Max{Sik | i = 1, 2, 3} ≥ Sik for all i ∈ A, where Sik = Σ_{h=1..9} F(X(h)) / 3 over the runs with xhi = k. Go to STEP R3.
- STEP R7. If F* < F(X*), let X = X* and F(X) = F(X*). Otherwise, halt.
- STEP R8. If q < Nroa, go to STEP R0.

Note that each iteration of ROA evaluates ten alternative solutions: the 1st alternative solution is equal to the original solution and does not require its fitness function value to be recalculated; the 2nd to the 9th alternative solutions are all obtained from STEP R3, and the 10th alternative solution is obtained from the main-effect test of STEP R6. If a change improves the current fitness function value, the change is kept and the current fitness function value is updated (STEP R7). If not, the solution remains the same and the process stops. The counter of fitness evaluations terminates the proposed ROA when F(X) is not improved (STEP R7) or when it reaches Nroa (STEP R8).
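The five cases of Eq. (9) can be sketched as a single move function (a reader's illustration; `u` denotes the variable's upper bound uj):

```python
def oa_move(level, x, u):
    """Map an OA level (1, 2, or 3) to a new value of variable x per Eq. (9).

    Level 1 keeps x; level 2 decreases it (wrapping 0 -> 1); level 3
    increases it (wrapping u -> 0), so every move stays within [0, u].
    """
    if level == 1:
        return x
    if level == 2:
        return 1 if x == 0 else x - 1
    # level == 3
    return 0 if x == u else x + 1

# The three levels applied to x = 3 with upper bound u = 4:
print([oa_move(lv, 3, 4) for lv in (1, 2, 3)])  # → [3, 2, 4]
```

In STEP R3, this move is applied to each of the four selected variables, with the level taken from the corresponding column of the L9(3^4) row h.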

For example, let X be as given in the last example, CUB = 191, WUB = 130, and let the 1st, 17th, 67th, and 97th variables be selected randomly from X. The result of the first run of this ROA procedure for X is presented in Table 4.

This example includes four variables, each with three different settings. Therefore, the OA reduces the number of tests from 3^4 = 81 to 9 + 1 = 10 in each generation. Not all level combinations are tested in the proposed ROA, which means the proposed ROA may not find the best combination. In this example, however, the proposed ROA systematically obtains a potentially good approximation without exploring all possible change combinations.

4. The proposed OSSO for RAP

The proposed OSSO is based on the fundamental concepts of the standard SSO, ROA (discussed in Section 3), and the re-initialize population (RIP) mechanism. This section presents the fitness function, initial population, RIP, and the overall procedure of the proposed OSSO for solving RAPs.

4.1. The fitness function

Table 4
The result of the first run of the example using the proposed ROA.

       j/UBj*:  1/4        17/4       67/4       97/3
h      ah1/xhj  ah2/xhj    ah3/xhj    ah4/xhj    F(X(h))a   R(X(h))   C(X(h))a   W(X(h))a
1      1/3      1/3        1/1        1/1        .9095      .9829     196        119
2      1/3      2/2        2/0        2/0        .9547      .9547     182        117
3      1/3      3/4        3/2        3/2        .9248      .9841     195        124
4      2/2      1/3        2/0        3/2        .9737      .9737     190        117
5      2/2      2/2        3/2        1/1        .8830      .9837     198        121
6      2/2      3/4        1/1        2/0        .9646      .9646     191        119
7      3/4      1/3        3/2        2/0        .9060      .9641     195        118
8      3/4      2/2        1/1        3/2        .8832      .9839     198        122
9      3/4      3/4        2/0        1/1        .9745      .9745     189        120
10     2/2      3/4        2/0        2/0        .9552      .9552     183        117

Sj1    .9297    .9297      .9191      .9223
Sj2    .9404    .9070      .9676      .9417
Sj3    .9212    .9546      .9046      .9272

* The corresponding upper bound of variable xhj given in Table 1.
a The details of calculating the related values are given in Section 4.1.

The fitness function, based on a penalty reliability function, is implemented to guide the search toward unexplored regions in the solution space [14,15,18,21]. For a solution X whose total system cost C(X) and/or weight W(X) exceeds CUB and/or WUB, respectively, the fitness function is calculated as follows:

F(X) =
    R(X)                                     if C(X) ≤ CUB and W(X) ≤ WUB
    R(X) · [Min{CUB/C(X), WUB/W(X)}]^3       otherwise    (10)

where the value of the exponent, in this case 3, is a preset amplification parameter.

Table 5
The main effect of different settings of Nrip and Nroa.

Statistics   Nroa   Nrip=50       150          300          400          500          Average
Ravg         0.0    0.969413a     0.969226     0.968904     0.968919     0.968805     0.969053
             0.1    0.972033      0.972104     0.971997     0.972009     0.972046a    0.972038
             0.4    0.973305      0.973313a    0.973239     0.973220     0.973217     0.973259
             0.7    0.973652a     0.973610     0.973543     0.973525     0.973530     0.973572
             1.0    0.973781a,b   0.973751b    0.973711b    0.973667b    0.973673b    0.973717
             Avg    0.972437      0.972401     0.972279     0.972268     0.972254

Tavg         0.0    0.120130a,b   0.117763b    0.116863b    0.116508b    0.116476b    0.117548
             0.1    0.142977      0.140428     0.139323     0.138612a    0.138818     0.140032
             0.4    0.217776      0.209814     0.206972     0.206783     0.206112a    0.209491
             0.7    0.304318      0.291731     0.288842     0.283729a    0.284257     0.290576
             1.0    0.433619      0.404056     0.401041     0.400031     0.393237a    0.406397
             Avg    0.243764      0.232758     0.230608     0.229133     0.227780

tavg         0.0    50374.95b     50197.93b    50130.30b    50105.20b    50100.00b    50181.68
             0.1    97738.36      97507.05     97306.19     97017.61a    97202.13     97354.27
             0.4    262045.53     254813.16    252267.33    252435.65    251824.04a   254677.14
             0.7    455906.76     439606.77    436120.43    428928.41a   429976.51    438107.77
             1.0    730977.64     682375.40    677769.20    676363.01    665125.45a   686522.14
             Avg    319408.65     304900.06    302718.69    300969.98    298845.63

Rstd         0.0    0.010486a     0.010671     0.010715     0.010751     0.010967     0.010718
             0.1    0.009783a     0.009821     0.009870     0.009866     0.009860     0.009840
             0.4    0.009695      0.009667a    0.009707     0.009711     0.009709     0.009698
             0.7    0.009645a,b   0.009682     0.009694     0.009685b    0.009701     0.009681
             1.0    0.009654      0.009660b    0.009638a,b  0.009689     0.009666b    0.009661
             Avg    0.009853      0.009900     0.009925     0.009940     0.009981

Tstd         0.0    0.007252a,b   0.007798b    0.007814b    0.007793b    0.007789b    0.007689
             0.1    0.010999      0.008184     0.008114a    0.009416     0.008822     0.009107
             0.4    0.050818a     0.052409     0.052743     0.052267     0.051841     0.052016
             0.7    0.115467      0.112022     0.111253     0.111885     0.110842a    0.112294
             1.0    0.203735      0.194020     0.192493     0.186172     0.186295a    0.192543
             Avg    0.077654      0.074887     0.074484     0.073506     0.073118

tstd         0.0    51.44b        41.70b       45.97b       22.21b       0.00a,b      32.27
             0.1    6214.45       5164.23      4890.39a     5961.61      5394.91      5525.12
             0.4    61048.43a     63462.64     64317.82     63255.13     63246.03     63066.01
             0.7    172828.60     169224.38    168131.89    169054.17    167834.42a   169414.69
             1.0    342906.22     327500.84    325132.48    314943.44    314496.86a   324995.97
             Avg    116609.83     113078.76    112503.71    110647.31    110194.44

a,b The best value among these values in the same row and column, respectively.

Table 6
Two-way ANOVA for Ex1.

Group        Source        Degree of freedom   Sum of square error   Mean square   F value     P value
Reliability  Nrip          4                   0.00028               0.0000707     0.72        0.580
             Nroa          4                   0.15000               0.0374994     380.44      0.000
             Interaction   16                  0.00031               0.0000194     0.20        1.000
             Error         49,475              4.87671               0.0000986
             Total         49,499              5.02730
             S = 0.009928, R2 = 3.00%, R2(adj) = 2.95%
FEN          Nrip          4                   2.63599E+12           6.58997E+11   23.80       0.000
             Nroa          4                   2.71119E+15           6.77798E+14   24479.16    0.000
             Interaction   16                  3.65675E+12           2.28547E+11   8.25        0.000
             Error         49,475              1.36990E+15           2.76888E+10
             Total         49,499              4.08739E+15
             S = 166,399, R2 = 66.48%, R2(adj) = 66.47%
Runtime      Nrip          4                   1.62                  0.405         38.52       0.000
             Nroa          4                   553.47                138.368       13158.86    0.000
             Interaction   16                  1.12                  0.070         6.67        0.000
             Error         49,475              520.24                0.011
             Total         49,499              1076.45
             S = 0.1025, R2 = 51.67%, R2(adj) = 51.65%

Eq. (10) encourages solutions to explore the feasible and infeasible regions near the border of the feasible area such that the search does not excessively deviate into the infeasible region. Thus, the promising feasible and infeasible regions in the search space are efficiently and effectively explored to identify an optimal or near-optimal solution. The following example demonstrates the calculation of the above fitness function using the solution X given in Example 1.

Example 3. Assume that X is given in Example 1 and let CUB = 191 and WUB = 130. Since C(X) = 196 > CUB,

F(X) = R(X) · [Min{CUB/C(X), WUB/W(X)}]^3 = 0.982853 · [Min{191/196, 130/119}]^3 = 0.909538.    (11)
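Eq. (10) and Example 3 can be reproduced with a few lines (a reader's sketch, not the author's code):

```python
def fitness(R, C, W, C_ub, W_ub):
    """Penalized fitness of Eq. (10): the reliability R(X) is scaled by the
    cubed minimum of the constraint ratios whenever cost or weight is violated."""
    if C <= C_ub and W <= W_ub:
        return R
    return R * min(C_ub / C, W_ub / W) ** 3

# Example 3: R(X) = 0.982853, C(X) = 196 > CUB = 191, W(X) = 119 <= WUB = 130.
print(round(fitness(0.982853, 196, 119, 191, 130), 4))  # → 0.9095
```

Because only the cost bound is violated, the penalty factor is (191/196)^3, which keeps the solution competitive rather than discarding it outright; that is the "border exploration" behavior described above.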

4.2. Initial population

Most soft computing techniques generate the initial population randomly. However, using a high-quality initial population reduces runtime. Therefore, the proposed ROA of Section 3 is applied during the initial population procedure to improve the quality of each initial solution that is better than the current gBest. If a new solution is better than gBest, gBest is immediately replaced.

4.3. RIP

The current gBest may be a premature solution if it is unchanged for a certain number of generations, t − t*, where t is the current generation and t* is the generation in which the current gBest was found. A premature solution is a local optimum that traps all solutions without further improving the solution quality. The value of t − t* positively correlates with the probability that the current gBest is premature. Therefore, if t − t* > Nrip (a value obtained via DOE), the proposed OSSO is restarted after recording the current gBest, to enhance its capacity to escape the local optimum. Each RIP procedure starts a new era, and only the best gBest over all eras is returned as the final solution.

4.4. The overall procedure of OSSO

According to the discussion in the previous subsections and in Sections 2 and 3, the steps of the overall proposed OSSO are as follows.

• STEP 0. Let gBest = 1 and t = 0.
• STEP 1. Let i = 1.
• STEP 2. Generate X0i with Nvar variables randomly, calculate F(X0i), and let Pi = X0i and t = t + 1.
• STEP 3. If F(Pi) > F(PgBest), let gBest = i, improve X0i using the proposed ROA discussed in Section 3.2, and update t correspondingly.
• STEP 4. If i < Npop, let i = i + 1 and go to STEP 2.
• STEP 5. Let i = 1.
• STEP 6. Update Xti based on Eq. (7) in Section 2, and calculate F(Xti).
• STEP 7. Improve Xti using the proposed ROA.
• STEP 8. If F(Xti) > F(Pi), then let Pi = Xti. Otherwise, go to STEP 10.
• STEP 9. If F(Pi) > F(PgBest), then let t* = t and gBest = i.
• STEP 10. If i < Npop, let i = i + 1 and go to STEP 6.
• STEP 11. If t = Ngen, then halt; PgBest is the final solution with reliability F(PgBest).
• STEP 12. If t − t* < Nrip, then let t = t + 1 and go to STEP 6.
• STEP 13. Let t* = t, t = t + 1, and go to STEP 1.
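The steps above can be sketched as a compact Python skeleton. This is a simplified stand-in, not the paper's exact implementation: the SSO variable update of Eq. (7) is approximated by the usual four-case probabilistic copy (gBest/pBest/current/random, governed by cg, cp, cw, and the residual cr), and the ROA local-search step (STEPs 3 and 7) is omitted.

```python
import random

def osso(fitness, n_var, n_pop, n_gen, n_rip, levels,
         cg=0.49, cp=0.3, cw=0.2, rng=None):
    """Skeleton of the OSSO main loop (STEPs 0-13), ROA omitted."""
    rng = rng or random.Random(0)

    def sso_update(x, p, g):
        # Simplified SSO update: each variable copies gBest (prob. cg),
        # pBest (cp), keeps its value (cw), or is re-randomized (cr).
        y = []
        for j in range(n_var):
            rho = rng.random()
            if rho < cg:
                y.append(g[j])
            elif rho < cg + cp:
                y.append(p[j])
            elif rho < cg + cp + cw:
                y.append(x[j])
            else:
                y.append(rng.choice(levels))
        return y

    best_x, best_f = None, float("-inf")
    t = 0
    while t < n_gen:
        # STEPs 1-4: (re)initialize an era with a random population
        X = [[rng.choice(levels) for _ in range(n_var)] for _ in range(n_pop)]
        P = [x[:] for x in X]
        Pf = [fitness(x) for x in X]
        g_idx = max(range(n_pop), key=lambda i: Pf[i])
        t_star = t
        while t < n_gen:
            t += 1
            for i in range(n_pop):           # STEPs 5-10
                X[i] = sso_update(X[i], P[i], P[g_idx])
                f = fitness(X[i])
                if f > Pf[i]:
                    if f > Pf[g_idx]:        # new gBest in this era
                        g_idx, t_star = i, t
                    P[i], Pf[i] = X[i][:], f
            if t - t_star >= n_rip:          # STEPs 12-13: RIP restart
                break
        if Pf[g_idx] > best_f:               # keep the best gBest of all eras
            best_x, best_f = P[g_idx][:], Pf[g_idx]
    return best_x, best_f
```

On a toy maximization problem (e.g., discrete variables pulled toward a target level), the RIP break can be seen restarting the population whenever gBest stagnates for `n_rip` generations, while the best gBest across eras is preserved.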

5. Numerical examples

To evaluate the quality and performance of the proposed OSSO for RAPs, this study includes three experiments, Ex1, Ex2, and Ex3. Ex1 examines the effects of the proposed ROA and RIP, using 25 designs, to find the best setting of both in the proposed OSSO on the best-known benchmark: Fyffe's 33-variation, 14-subsystem series–parallel RAP, originally proposed by Fyffe et al. [4] and revised by Nakagawa [16], with Nsub = 14 (Nvar = 14 × 8 = 112). Ex2 compares the proposed OSSO, configured with the result obtained from Ex1, against existing algorithms on Fyffe's RAP. Ex3 tests the proposed OSSO on four new larger-scale RAPs with Nsub = 28 (Nvar = 224), 42 (Nvar = 336), 56 (Nvar = 448), and 70 (Nvar = 560).

In each experiment, component mixing is considered without restrictions. Each subsystem features three or four alternative component choices, with Li = 1 and Ui = 8 for i = 1, 2, ..., Nsub. The corresponding component reliability, cost, and weight are listed in Table 1 of Section 1 for the 1st to the 14th subsystems. For the problems with Nsub = 28, 42, 56, and 70 in Ex3, the component reliability, cost, and weight of the (i + 14)th subsystem are copied from those of the ith subsystem in Table 1. The cost constraint is fixed to C = 130 in Ex1 and Ex2 and C = 130·Nsub/14 in Ex3, and the weight constraint ranges over 159–191 in Ex1 and Ex2 and over 159·Nsub/14, 160·Nsub/14, ..., 191·Nsub/14 in Ex3, i.e., 33 variations are available for each of Nsub = 28, 42, 56, and 70.
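The constraint scaling just described is mechanical, so it can be captured in a small helper. This is a hypothetical utility written for illustration, not code from the paper:

```python
def ex3_constraints(nsub):
    """Cost limit and the 33 weight variants for a given Nsub.

    Cost limit is 130 * Nsub/14; the weight limits are
    159 * Nsub/14, 160 * Nsub/14, ..., 191 * Nsub/14.
    """
    scale = nsub // 14          # Nsub is a multiple of 14 in all experiments
    cost_limit = 130 * scale
    weight_limits = [w * scale for w in range(159, 192)]  # 33 variants
    return cost_limit, weight_limits
```

For example, `ex3_constraints(14)` gives the Ex1/Ex2 setting (C = 130, weights 159–191), while `ex3_constraints(28)` doubles every limit.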


Table 7
The success rates in obtaining the best-known solutions, i.e., the accuracy.

Weight  Nrip = 50             Nrip = 150            Nrip = 300            Nrip = 400            Nrip = 500
Nroa:   .1   .4   .7   1      .1   .4   .7   1      .1   .4   .7   1      .1   .4   .7   1      .1   .4   .7   1

159  .000 .067 .433 .650   .000 .233 .317 .600   .000 .150 .167 .633   .000 .150 .333 .550   .000 .167 .417 .650
160  .033 .117 .433 .650   .033 .200 .400 .517   .017 .100 .267 .467   .033 .167 .300 .383   .000 .133 .217 .400
161  .033 .167 .600 .700   .000 .267 .450 .650   .017 .167 .317 .783   .000 .233 .383 .550   .050 .117 .467 .533
162  .017 .300 .567 .750   .017 .433 .650 .700   .033 .350 .617 .683   .017 .317 .517 .650   .017 .250 .567 .683
163  .033 .100 .400 .450   .000 .167 .300 .517   .000 .167 .300 .350   .017 .183 .300 .217   .000 .183 .233 .433
164  .000 .117 .400 .633   .017 .133 .333 .617   .000 .167 .333 .500   .000 .150 .383 .383   .033 .217 .300 .483
165  .000 .133 .367 .533   .000 .167 .450 .583   .017 .183 .317 .483   .000 .150 .283 .583   .000 .117 .433 .433
166  .050 .133 .217 .333   .000 .167 .133 .417   .017 .183 .233 .383   .033 .083 .250 .267   .000 .183 .250 .217
167  .000 .100 .233 .517   .017 .100 .217 .417   .000 .117 .250 .383   .033 .067 .317 .433   .000 .117 .233 .250
168  .000 .217 .483 .767   .017 .150 .433 .750   .033 .167 .450 .717   .017 .183 .467 .617   .000 .283 .483 .733
169  .000 .367 .817 .900   .017 .467 .867 .967   .050 .433 .817 .933   .067 .517 .850 .917   .000 .583 .750 .933
170  .000 .283 .667 .833   .017 .283 .717 .750   .017 .333 .633 .800   .033 .300 .683 .767   .017 .267 .617 .767
171  .100 .633 .883 .950   .083 .617 .833 .933   .000 .700 .833 .917   .050 .567 .817 .900   .100 .583 .850 .867
172  .183 .767 .900 .983   .083 .767 .917 .867   .183 .717 .833 .900   .200 .683 .800 .867   .250 .550 .617 .750
173  .000 .217 .400 .783   .000 .167 .467 .817   .000 .317 .417 .633   .050 .300 .533 .633   .017 .250 .500 .767
174  .000 .383 .767 .950   .050 .500 .767 .983   .033 .533 .867 .950   .017 .617 .867 .917   .050 .483 .950 .933
175  .033 .300 .550 .800   .017 .500 .767 .883   .050 .467 .767 .883   .000 .567 .800 .850   .000 .667 .833 .917
176  .000 .400 .683 .783   .000 .300 .650 .783   .017 .467 .700 .733   .067 .350 .533 .717   .050 .383 .683 .850
177  .000 .083 .250 .367   .000 .067 .267 .350   .000 .083 .150 .267   .000 .067 .133 .150   .000 .117 .133 .233
178  .000 .000 .083 .300   .000 .017 .133 .200   .000 .000 .117 .250   .000 .017 .167 .200   .000 .017 .083 .300
179  .000 .033 .267 .583   .000 .117 .267 .500   .017 .083 .383 .467   .017 .117 .367 .433   .000 .100 .200 .450
180  .000 .133 .367 .500   .000 .100 .333 .483   .000 .167 .367 .600   .000 .117 .350 .517   .000 .167 .383 .450
181  .000 .133 .350 .550   .000 .200 .383 .633   .017 .067 .383 .533   .000 .150 .300 .417   .000 .117 .183 .383
182  .000 .017 .200 .217   .000 .083 .200 .333   .000 .067 .183 .183   .000 .017 .133 .283   .000 .033 .150 .233
183  .000 .017 .200 .267   .000 .067 .283 .633   .000 .100 .367 .367   .000 .167 .350 .433   .000 .100 .350 .300
184  .000 .100 .350 .550   .017 .250 .367 .533   .017 .083 .267 .450   .017 .183 .317 .433   .000 .150 .367 .467
185  .000 .017 .167 .300   .017 .083 .333 .433   .000 .133 .383 .433   .000 .150 .300 .550   .000 .167 .317 .550
186  .000 .283 .483 .583   .000 .167 .450 .533   .000 .367 .333 .400   .017 .200 .400 .400   .000 .217 .333 .483
187  .000 .000 .067 .150   .000 .050 .083 .117   .000 .000 .067 .083   .017 .000 .067 .150   .000 .017 .100 .167
188  .000 .017 .117 .183   .000 .033 .133 .167   .000 .000 .133 .283   .000 .000 .233 .233   .000 .033 .167 .350
189  .000 .017 .067 .100   .000 .000 .100 .250   .000 .050 .050 .167   .000 .017 .100 .217   .000 .050 .117 .283
190  .000 .000 .000 .017   .000 .000 .000 .033   .000 .000 .050 .067   .000 .017 .000 .033   .017 .000 .050 .033
191  .000 .000 .033 .100   .000 .000 .017 .133   .000 .017 .067 .050   .000 .017 .100 .150   .000 .033 .017 .050

Sum       0.483 5.650 12.80 17.73   0.400 6.850 13.02 18.08   0.533 6.933 12.42 16.73   0.700 6.817 12.73 15.80   0.600 6.850 12.35 16.33
Accuracy^a  24%  88%  97% 100%      39%  91%  97% 100%       45%  88% 100% 100%       52%  94%  97% 100%       30%  97% 100% 100%

a (The number of solutions that, after rounding to five decimal places, equal the corresponding solutions obtained from the ISC)/33.



Table 8
The tavg for the related data in Table 7.

Weight  (Nroa, Nrip):
        (1,50)     (1,150)    (.7,300)   (1,300)    (1,400)    (.7,500)   (1,500)

159   977116.8   911318.9   526675.6   885968.0   875075.4   522998.2   866240.5
160   434082.8   347603.7   183562.1   379448.3   290588.2   226798.4   223043.4
161   980966.8   919031.4   529995.7   900238.6   883972.7   523810.5   878167.8
162   353976.7   310618.1   216636.7   384308.8   318113.6   208412.1   313544.2
163   465029.4   401091.9   210339.2   334065.0   348592.3   233659.5   412396.3
164   539879.0   363124.5   227011.0   357754.7   303637.5   245557.8   288931.9
165   479799.8   360105.9   281897.8   397871.3   380958.4   227549.2   345310.4
166   448967.7   538229.7   272855.4   407382.4   313239.4   274729.3   216572.2
167   403285.9   449461.7   264371.9   465883.3   329669.0   195891.1   368324.6
168   435120.4   460294.1   217637.8   376092.0   314009.0   250316.9   373878.9
169   308527.5   322927.4   200903.0   285486.2   287156.6   209872.0   287814.4
170   412978.4   363698.8   231698.2   328892.6   325643.4   208195.2   392011.1
171   314178.1   320623.5   206800.6   256399.1   318144.2   164475.0   270547.6
172   277619.6   344216.2   155595.2   236165.0   262795.3   149248.6   162846.1
173   394817.1   380195.7   255218.8   347427.3   348043.4   199141.3   314462.6
174   375121.6   308629.8   197904.4   278520.8   244952.0   224748.0   348161.2
175   423976.1   350203.1   236246.0   337596.3   314470.0   201271.6   345361.7
176   430904.3   396350.2   235487.2   302688.6   405757.9   222866.5   401036.5
177   455231.7   486261.2   176879.7   378031.9   397298.1   175902.9   367733.1
178   578748.7   451297.8   333922.6   392101.9   422214.1   260850.6   460153.0
179  1003994.4   949522.1   539825.7   926927.2   915031.4   537406.5   907629.7
180   493184.6   430433.5   272862.6   434549.3   414995.3   230522.0   463570.1
181   453175.8   428976.2   285326.6   469810.4   427563.8   177854.2   445784.2
182   446727.2   425334.4   296749.3   326447.4   328327.7   182477.8   273781.6
183   536489.0   473456.7   285334.9   329750.8   373084.4   296743.9   405230.2
184   523841.1   465492.9   293931.4   448812.6   357228.4   238905.9   368791.1
185  1018711.7   962070.1   546847.3   940793.3   926883.6   544437.3   925125.8
186   517031.0   485461.4   261869.7   391737.9   370190.8   268587.9   340023.5
187   466495.8   769508.9   228615.8   486411.6   524271.3   290565.7   446144.5
188   397953.2   429494.0   262294.4   559446.0   465087.9   245221.1   443734.0
189   600135.0   570530.5   510575.0   594468.7   459796.0   363268.1   514386.9
190   330893.0   356801.5   381459.0   435254.0   528262.0   328589.7   255359.0
191   446644.0   563433.1   271256.3   756141.7   683553.3   233564.0   660471.3

tavg  506836.5   487751.5   290866.3   458571.9   438139.6   268619.4   426865.7


All SSO-based methods implemented in both experiments were coded in the C++ programming language and run on an Intel Core i7 3.07 GHz PC with 6 GB of memory. The runtime unit is CPU seconds. Following [21], the proposed OSSO used 500 generations (Ngen = 500), 60 independent runs, and a population of 100 solutions (Npop = 100), with cw = .2, cp = .3, cg = .49, and cr = .01 in all SSO-based algorithms. Each implemented ROA adds at least 10 fitness evaluations to each generated solution.

5.1. Ex1: parameter setting for ROA and RIP

The ROA and RIP are the two factors studied in the proposed OSSO. Twenty-five related methods, covering five levels each (.0, .1, .4, .7, and 1.0 for Nroa; 50, 150, 300, 400, and 500 for Nrip), are tested against Fyffe's RAP in this experiment.

The experimental results are summarized as follows:

1. The effect of the RIP: The Nrip value negatively correlates with the quality of the solution and positively correlates with the standard deviation of the solutions, the FEN and its standard deviation, and the runtime and its standard deviation. However, the difference between the best and the worst level is less than 0.0002, 250,000, and 0.125 s for solution quality, FEN, and runtime, respectively. This difference arises because smaller Nrip values are more likely to reinitialize all solutions. Therefore, smaller Nrip values enhance the global search and prevent gBest from being trapped at a local optimum; as a result, the local search is weakened, because each solution restarts in the middle of the convergence process.

2. The effect of the ROA: Table 5 shows that the proposed ROA increases the effectiveness of all final gBests. Furthermore, a higher Nroa incurs a higher chance and frequency of performing the ROA. This increases the ability of the local search to find an optimal solution, at the cost of longer runtime and a higher FEN.

3. The effect of the ROA and RIP: Table 6 shows that the ROA enhances the likelihood of obtaining better solutions but requires more runtime and FEN to converge to the optimal solution than the RIP. The same can be observed in Table 5, which shows that the values of Nroa and Nrip positively correlate with the search performance and global convergence relative to the other levels of the ROA and RIP. Additionally, Table 7 shows seven settings, (Nroa, Nrip) = (1, 50), (1, 150), (1, 300), (.7, 300), (1, 400), (.7, 500), and (1, 500), for which at least one of the obtained final gBests (after rounding to five decimal places) equals the best-known solution for each variant. Unexpectedly, (Nroa, Nrip) = (1, 150) is the most successful in achieving the best-known solutions, and (.7, 500) shows the lowest average FEN in obtaining the best-known solutions among these settings in Table 8. Therefore, the proposed OSSO is implemented with Nroa = .7 and Nrip = 500 in the rest of the experiments, as this setting increases both efficiency and effectiveness.

5.2. Ex2: Comparing the OSSO with existing algorithms

Table 9 shows the performance of the proposed OSSO using Nroa = .7 and Nrip = 500, including the weight (WUB) and cost (CUB) limits, over 60 random seeds for each variant of Fyffe's RAP [4].


Table 9
Performance of the OSSO over 60 random seeds for each variant of Fyffe's RAP.

WUB  Rmax     Rmin     Ravg     Rstd    Tmax  Tmin  Tavg  Tstd    tmax     tmin     tavg     tstd

159  .954565  .950409  .953846  .00079  .360  .328  .344  .00501  538,431  512,511  523,809    5816
160  .955714  .952826  .955452  .00058  .360  .031  .299  .09056  541,824   45,505  457,832  140,071
161  .958035  .955068  .957285  .00076  .360  .328  .343  .00701  532,554  511,305  523,145    5264
162  .959188  .957021  .958808  .00057  .359  .046  .228  .11804  535,902   52,034  344,932  181,345
163  .960642  .958736  .960047  .00055  .359  .046  .298  .09334  540,924   67,182  456,168  141,278
164  .962422  .960215  .961626  .00056  .360  .047  .289  .09422  535,434   67,741  440,031  145,152
165  .963712  .962221  .963252  .00057  .360  .031  .260  .11527  533,751   55,128  395,721  176,915
166  .965042  .961718  .964630  .00060  .360  .031  .304  .08720  540,816   43,440  462,855  134,815
167  .966335  .964840  .965890  .00041  .360  .046  .294  .10176  538,332   69,195  447,079  155,889
168  .968125  .965457  .967492  .00069  .360  .031  .257  .10736  534,534   48,383  391,714  164,585
169  .969291  .967619  .969025  .00050  .360  .031  .191  .11949  536,451   44,110  288,811  181,490
170  .970760  .968784  .970512  .00042  .360  .031  .219  .11861  543,246   50,238  330,458  181,806
171  .971930  .970623  .971801  .00034  .360  .031  .145  .10722  531,438   42,581  218,781  163,509
172  .973027  .971402  .972756  .00046  .360  .031  .194  .13540  539,547   51,473  294,374  205,108
173  .973827  .972380  .973612  .00047  .360  .031  .244  .12340  542,211   39,441  365,833  186,232
174  .974926  .973555  .974899  .00018  .359  .031  .158  .09702  530,835   52,679  239,969  146,472
175  .975708  .974727  .975659  .00018  .360  .032  .169  .10737  543,993   46,332  256,577  160,344
176  .976690  .975759  .976567  .00024  .360  .031  .212  .11536  539,205   48,708  320,860  177,766
177  .977596  .975964  .977305  .00040  .360  .047  .323  .08455  543,660   58,478  488,184  128,987
178  .978400  .976539  .978000  .00037  .360  .078  .339  .05638  545,379  117,764  513,184   84,220
179  .979505  .977869  .978808  .00042  .375  .343  .357  .00668  546,270  531,267  538,484    3634
180  .980290  .979153  .979872  .00039  .360  .062  .279  .11316  546,027   77,818  420,555  170,298
181  .981027  .979348  .980371  .00043  .376  .047  .314  .09657  553,929   76,332  474,289  147,570
182  .981518  .980192  .981159  .00038  .376  .031  .323  .09271  551,301   55,708  486,379  138,929
183  .982256  .980381  .981850  .00043  .375  .032  .304  .09593  557,241   46,114  457,115  144,001
184  .982994  .981839  .982601  .00039  .375  .032  .288  .11361  550,671   55,195  431,778  170,882
185  .983505  .981922  .983143  .00046  .375  .344  .361  .00743  562,254  532,959  544,545    5479
186  .984176  .981502  .983751  .00059  .375  .031  .301  .10389  558,726   52,266  453,154  155,004
187  .984688  .983516  .984391  .00036  .376  .078  .346  .06435  556,404  125,247  519,995   95,448
188  .985378  .983975  .984845  .00032  .375  .047  .330  .08396  557,277   74,191  496,881  125,464
189  .985922  .984232  .985297  .00041  .375  .078  .349  .05109  556,341  120,063  525,290   76,122
190  .986416  .984265  .985707  .00055  .376  .063  .359  .04140  558,474   75,966  537,127   63,246
191  .986811  .984853  .986238  .00038  .376  .156  .361  .02795  564,585  233,564  543,318   41,018

Average  .973952  .972088  .973530  .00046  .366  .081  .284  .08400  545,090  123,664  429,977  127,399

Table 10
The ANOVA of the test results from Table 9.

Group                       Source  Degree of freedom  Sum of square error  Mean square     F value  P value

Reliability                 Factor   2                 0.0000631            0.0000315       0.32     0.729
                            Error   96                 0.0095297            0.0000993
                            Total   98                 0.0095927
                            S = 0.009963, R2 = 0.66%, R2(adj) = 0.00%

CPU seconds                 Factor   2                 1.41427              0.70713         152.43   .000
                            Error   96                 0.44536              0.00464
                            Total   98                 1.85962
                            S = 0.06811, R2 = 76.05%, R2(adj) = 75.55%

Fitness evaluation number   Factor   2                 3.13146E+12          1.56573E+12     141.59   .000
                            Error   96                 1.06157E+12          11,057,996,686
                            Total   98                 4.19303E+12
                            S = 105,157, R2 = 74.68%, R2(adj) = 74.16%


In Table 10, the ANOVA results show that Rmax, Ravg, and Rmin do not differ significantly. This demonstrates that the proposed method achieves robust solutions for each test problem. However, significant differences were observed between Tmax and Tavg, Tmin and Tavg, tmax and tavg, and tmin and tavg after applying the Tukey method at a 95% confidence level. Thus, the Tukey method shows that the proposed OSSO is robust in solution quality but not in runtime or in the number of fitness evaluations.
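The robustness check above is an ordinary one-way ANOVA; the F statistic it relies on can be sketched in a few lines of pure Python (the Tukey pairwise follow-up is omitted):

```python
def one_way_anova_f(groups):
    # One-way ANOVA: F = (between-group mean square) / (within-group mean square)
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f_stat, (k - 1, n - k)
```

Feeding it the three 33-value columns Rmax, Ravg, and Rmin of Table 9 as the three groups yields the (2, 96) degrees of freedom shown in Table 10.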

Table 11 shows the final reliability (the fitness function value of the gBest) obtained from the OSSO, the SSO (100 solutions and 500 generations) [21], the Linear Programming Approach [7], the Genetic Algorithm (40 chromosomes and 1200 generations) [3], Ant Colony Optimization (100 ants and up to 1000 iterations with local searches) [15], Tabu Search [11], the Variable Neighborhood Search Algorithm [14], and the ISC [16,17]. The shaded parts in Table 11 mark the best result for each variant. In this small experiment, the solutions obtained from the ISC are optimal. The bottom rows (average, accuracy, and eval.) in Table 11 show the average best solution for each method, the ratio of obtained solutions equal to the ISC solutions, and the average number of fitness function evaluations.

The experimental results indicate that the proposed OSSO is better than all the other methods in Table 11 except the ISC and can effectively solve Fyffe's RAP. The OSSO performs as well as the ISC, which is probably the most accurate known method for solving the RAP. Thus, the OSSO achieves competitive performance at a small fitness-function-evaluation cost.

5.3. Ex3: Large scale problems

The benchmark problems tested in Ex1 and Ex2 are more than 30 years old. Therefore, in order to demonstrate the efficiency of


Table 11
Comparison of the best solutions among heuristics.

WUB  Linear Programming [7]  Genetic Algorithm [3]  Ant Colony Optimization [15]  Tabu Search [11]  Variable Neighborhood Search [14]  ISC [17]  SSO [21]   OSSO

191  .98671  .98675  .98675  .98681  .98681  .98681  .985476  .986811
190  .98632  .98603  .98591  .98642  .98642  .98642  .985421  .986416
189  .98572  .98556  .98577  .98592  .98592  .98592  .984447  .985922
188  .98503  .98503  .98533  .98538  .98487  .98538  .983035  .985378
187  .98415  .98429  .98469  .98469  .98467  .98469  .983304  .984688
186  .98388  .98362  .98380  .98418  .98418  .98418  .982575  .984176
185  .98339  .98311  .98351  .98351  .98351  .98351  .982022  .983505
184  .98220  .98239  .98299  .98299  .98299  .98299  .982698  .982994
183  .98147  .9819   .98221  .98226  .98226  .98226  .981466  .982256
182  .97969  .98102  .98147  .98152  .98147  .98152  .980611  .981518
181  .97928  .98006  .98068  .98103  .98103  .98103  .979643  .981027
180  .97833  .97942  .98029  .98029  .98029  .98029  .979384  .980290
179  .97806  .97906  .97951  .97951  .97951  .97951  .978698  .979505
178  .97688  .97810  .97840  .97840  .97838  .97840  .978208  .978400
177  .97540  .97715  .97760  .97747  .97760  .97760  .977243  .977596
176  .97498  .97642  .97649  .97669  .97669  .97669  .976441  .976690
175  .97350  .97552  .97571  .97571  .97571  .97571  .975569  .975708
174  .97233  .97435  .97493  .97479  .97493  .97493  .974539  .974926
173  .97053  .97362  .97383  .97383  .97381  .97383  .973807  .973827
172  .96923  .97266  .97303  .97303  .97303  .97303  .973027  .973027
171  .96790  .97186  .97193  .97193  .97193  .97193  .971930  .971930
170  .96678  .97076  .97076  .97076  .97076  .97076  .969368  .970760
169  .96561  .96922  .96929  .96929  .96929  .96929  .968591  .969291
168  .96415  .96813  .96813  .96813  .96813  .96813  .967499  .968125
167  .96299  .96634  .96634  .96634  .96634  .96634  .966204  .966335
166  .96121  .96504  .96504  .96504  .96504  .96504  .963817  .965042
165  .95992  .96371  .96371  .96371  .96371  .96371  .962885  .963712
164  .95860  .96242  .96242  .96242  .96242  .96242  .960526  .962422
163  .95732  .96064  .96064  .95998  .96064  .96064  .959424  .960642
162  .95555  .95912  .95919  .95821  .95919  .95919  .957899  .959188
161  .95410  .95804  .95804  .95692  .95804  .95804  .955708  .958035
160  .95295  .95567  .95571  .95560  .95567  .95571  .954784  .955714
159  .95080  .95432  .95457  .95433  .95457  .95457  .952812  .954565

Average   .971665  .973677  .973899  .973851  .973934  .973954  .973002  .973952
Accuracy^a  0%     21.21%   72.72%   78.79%   84.85%   100%     6%       100%
Eval.     48,040   100,000  350,000  120,000  292,706  282,084

a (The number of solutions that, after rounding to five decimal places, equal the corresponding solutions obtained from the ISC)/33.


the OSSO on larger-scale problems as well as to create new benchmark problems, the original Fyffe's RAP (Nsub = 14) of Ex1 and Ex2 was purposely repeated 2 (Nsub = 28), 3 (Nsub = 42), 4 (Nsub = 56), and 5 (Nsub = 70) times in series to test the performance of the OSSO. All component weights, reliabilities, and costs of the (14 + i)th subsystem in Ex3 are repeats of those of the ith subsystem in Fyffe's RAP. For example, when Nsub = 56 in Ex3, four systems are connected in series, and the information of each system is exactly the same as that in Fyffe's RAP, i.e., each system includes a total of 14 subsystems, and each subsystem includes at least one redundancy with a maximum of eight redundancies, as shown in Fig. 3.

Fig. 3. The structure in the case with Nsub = 56 of Ex3 (four copies of Fyffe's RAP in series).

Fig. 4. The scatter diagram of all obtained reliability values (reliability vs. variant, for Nsub = 28, 42, 56, and 70).

Fig. 5. The scatter diagram of all obtained runtimes (runtime vs. variant, for Nsub = 28, 42, 56, and 70).

Fig. 6. The scatter diagram of all obtained FENs (FEN vs. variant, for Nsub = 28, 42, 56, and 70).

The computational results for each problem are presented graphically using scatter diagrams over the variants of the four problems in Figs. 4 and 5. These figures indicate that the reliability, runtime, and FEN increase as Nsub and/or the weight increase. None of the obtained reliabilities and runtimes overlap among different variants and/or values of Nsub. Unexpectedly, Fig. 4 does not show a fixed pattern. Based on Fig. 5, each variant can be solved in less than five seconds, even for Nsub = 70, which shows the efficiency of the proposed OSSO for larger problems (see Fig. 6).

Furthermore, the cost and weight limits of the ith variant are (Nsub/14) times those of the ith variant in Ex1. Therefore, the obtained best reliability of each variant is further compared with the best-known solution raised to the power of (Nsub/14), called the ideal solution, to demonstrate the effectiveness of the proposed OSSO, as shown in Table 12. Table 12 shows that the accuracies are 79%, 21%, 0%, and 0% for Nsub = 28, 42, 56, and 70, respectively, i.e., the accuracy negatively correlates with Nsub. The average difference between the ideal solution and the obtained solution is only 0.0113 with tavg = 4.27 s, even for Nsub = 70, for which the number of variables increases to 70 × 8 = 560. Thus, the statistical results demonstrate that the proposed OSSO retains its quality even for complex large-scale problems.
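The ideal-solution yardstick described above follows from the Ex3 construction: the larger system is Nsub/14 independent copies of the 14-subsystem design in series, so its reliability is the 14-subsystem best-known reliability raised to the power Nsub/14. A one-line sketch, together with the R* gap reported in Table 12:

```python
def ideal_solution(best_known_r, nsub):
    # Reliability of Nsub/14 independent copies of the best-known
    # 14-subsystem design connected in series.
    return best_known_r ** (nsub / 14)

def r_star(obtained_r, best_known_r, nsub):
    # The |obtained - ideal| difference reported as R* in Table 12.
    return abs(obtained_r - ideal_solution(best_known_r, nsub))
```

For instance, with the WUB = 191 best-known value .986811 from Table 11, the ideal solution for Nsub = 28 is simply .986811 squared.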

6. Conclusions & future works

In this work, an improved SSO called the OSSO was proposed to solve the RAP with a mix of components. The proposed OSSO combines the existing SSO approach with the ROA and the RIP to optimize the RAP efficiently. To the best of the author's knowledge, the ISC is currently the most accurate heuristic/mathematical approach in the literature [17]. The results in Section 5.2 indicate that the OSSO can be as good as the ISC when results are rounded to the fifth decimal place. In addition, the heuristic-based OSSO has the advantage of handling larger-scale problems with more variables


Table 12
The best results obtained from the proposed OSSO for each variant in Ex3.

Variant  Nsub = 28                   Nsub = 42                   Nsub = 56                   Nsub = 70
         R*      T*    t*           R*      T*    t*           R*      T*    t*           R*      T*    t*

1   0.0000  1.52  1,673,871   0.0000  2.27  1,748,400   0.0030  2.89  1,685,643   0.0057  3.58  1,654,593
2  -0.0011  1.55  1,699,710  -0.0008  2.24  1,729,761   0.0005  3.09  1,833,774   0.0067  3.64  1,683,024
3   0.0000  1.53  1,683,276   0.0021  2.36  1,839,111   0.0046  3.02  1,778,802   0.0115  3.72  1,730,022
4  -0.0003  1.50  1,649,202   0.0001  2.25  1,741,722   0.0035  3.20  1,898,394   0.0089  3.78  1,770,207
5  -0.0004  1.52  1,651,182   0.0001  2.28  1,756,608   0.0029  3.03  1,770,189   0.0077  3.78  1,753,899
6   0.0000  1.52  1,665,447   0.0010  2.27  1,754,925   0.0047  3.06  1,806,549   0.0097  3.95  1,859,739
7   0.0000  1.48  1,629,303   0.0003  2.39  1,852,818   0.0029  3.03  1,764,843   0.0118  3.89  1,826,232
8  -0.0004  1.50  1,651,506  -0.0002  2.38  1,824,180   0.0027  3.11  1,812,885   0.0106  4.14  1,943,727
9  -0.0005  1.52  1,642,749  -0.0007  2.31  1,786,650   0.0037  3.27  1,936,734   0.0067  4.16  1,958,046
10  0.0000  1.55  1,681,935   0.0009  2.36  1,817,385   0.0048  3.22  1,885,092   0.0119  3.94  1,832,217
11 -0.0003  1.52  1,644,594   0.0006  2.42  1,872,294   0.0037  3.27  1,900,536   0.0117  4.09  1,902,363
12  0.0000  1.52  1,653,675   0.0003  2.42  1,866,426   0.0050  3.42  2,015,016   0.0122  4.31  2,011,623
13  0.0000  1.55  1,682,952   0.0004  2.39  1,841,802   0.0036  3.36  1,980,231   0.0091  4.22  1,979,880
14  0.0000  1.56  1,704,804   0.0000  2.48  1,901,436   0.0043  3.28  1,903,182   0.0120  4.17  1,960,656
15 -0.0003  1.58  1,713,534  -0.0002  2.45  1,885,461   0.0038  3.38  1,959,288   0.0114  4.34  2,031,288
16  0.0000  1.55  1,673,808   0.0004  2.45  1,871,484   0.0051  3.49  2,036,400   0.0127  4.28  2,007,978
17 -0.0002  1.56  1,698,072   0.0001  2.49  1,912,704   0.0040  3.50  2,037,399   0.0115  4.25  1,988,547
18  0.0000  1.58  1,680,558   0.0001  2.45  1,869,090   0.0060  3.49  2,043,843   0.0110  4.39  2,042,502
19  0.0000  1.55  1,678,875   0.0005  2.47  1,871,979   0.0043  3.42  1,980,600   0.0109  4.30  1,987,197
20 -0.0003  1.56  1,680,261  -0.0001  2.44  1,854,186   0.0050  3.48  2,022,171   0.0137  4.55  2,123,655
21  0.0000  1.56  1,653,099   0.0016  2.45  1,861,845   0.0057  3.39  1,957,011   0.0127  4.44  2,063,985
22  0.0001  1.55  1,663,647   0.0007  2.47  1,859,307   0.0061  3.42  1,985,613   0.0133  4.52  2,113,422
23  0.0003  1.59  1,697,730   0.0009  2.49  1,871,925   0.0063  3.50  2,027,193   0.0138  4.44  2,087,916
24 -0.0003  1.56  1,673,484   0.0007  2.45  1,850,559   0.0042  3.39  1,967,091   0.0116  4.52  2,092,497
25 -0.0001  1.58  1,680,666   0.0007  2.45  1,854,780   0.0051  3.44  1,976,892   0.0118  4.52  2,097,231
26  0.0000  1.58  1,679,172   0.0007  2.45  1,840,668   0.0057  3.42  1,960,440   0.0128  4.52  2,094,378
27 -0.0002  1.56  1,676,913   0.0007  2.44  1,831,461   0.0053  3.47  1,987,260   0.0117  4.58  2,089,392
28 -0.0001  1.59  1,687,173   0.0015  2.45  1,841,460   0.0060  3.50  1,994,136   0.0115  4.63  2,141,970
29  0.0002  1.63  1,703,490   0.0014  2.53  1,894,308   0.0051  3.47  1,975,011   0.0127  4.72  2,191,344
30  0.0003  1.61  1,704,003   0.0018  2.53  1,892,715   0.0071  3.42  1,938,264   0.0125  4.64  2,127,687
31  0.0001  1.59  1,696,146   0.0017  2.50  1,864,644   0.0062  3.49  1,985,388   0.0142  4.66  2,152,500
32  0.0002  1.61  1,707,639   0.0020  2.52  1,873,311   0.0063  3.56  2,040,873   0.0134  4.61  2,114,988
33  0.0003  1.59  1,681,278   0.0014  2.49  1,849,875   0.0057  3.53  2,015,043   0.0126  4.70  2,153,391

Average  -0.0001  1.55  1,677,083   0.0006  2.42  1,841,978   0.0046  3.33  1,935,206   0.0113  4.27  1,986,912

R*, T*, t*: |the best obtained solution − the ideal solution|, the runtime, and the FEN of the best obtained solution.


from Section 3. Thus, the proposed OSSO can systematically generate a potentially good approximation at a reasonable computation cost.

Acknowledgments

I wish to thank the anonymous editor and the reviewers for their constructive comments and recommendations, which have significantly improved the presentation of this paper. This research was supported in part by the National Science Council of Taiwan, ROC, under grant NSC101-2221-E-007-079-MY3.

References

[1] W.W. Chang, W.C. Yeh, P.C. Huang, A hybrid immune-estimation distribution of algorithm for mining thyroid gland data, Exp. Syst. Appl. 37 (2010) 2066–2071.

[2] M.S. Chern, On the computational complexity of reliability redundancy allocation in a series system, Operat. Res. Lett. 11 (1992) 309–315.

[3] D.W. Coit, A.E. Smith, Reliability optimization of series–parallel systems using a genetic algorithm, IEEE Trans. Reliab. 45 (1996) 254–260.

[4] D.E. Fyffe, W.W. Hines, N.K. Lee, System reliability allocation and a computational algorithm, IEEE Trans. Reliab. R-17 (1968) 64–69.

[5] S.J. Ho, S.Y. Ho, L.S. Shu, OSA: orthogonal simulated annealing algorithm and its application to designing mixed H2/H∞ optimal controllers, IEEE Trans. Syst., Man Cybernet., Part A: Syst. Hum. 34 (2004) 588–600.

[6] T.J. Hsieh, W.C. Yeh, Penalty guided bees search for redundancy allocation problem with a mix of components in series–parallel systems, Comp. Operat. Res. 39 (2012) 2688–2704.

[7] Y.C. Hsieh, A linear approximation for redundant reliability problems with multiple component choices, Comp. Indust. Eng. 44 (2002) 91–103.

[8] J. Kennedy, R.C. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, Piscataway, NJ, USA, 1995, pp. 1942–1948.

[9] J. Kennedy, R.C. Eberhart, Y. Shi, Swarm Intelligence, Morgan Kaufmann, San Francisco, CA, 2001.

[10] H.G. Kim, C.O. Bae, D.J. Park, Reliability-redundancy optimization using simulated annealing algorithms, J. Qual. Main. Eng. 12 (2006) 354–363.

[11] S. Kulturel-Konak, A.E. Smith, D.W. Coit, Efficiently solving the redundancy allocation problem using tabu search, IIE Trans. 35 (2003) 515–526.

[12] W. Kuo, V.R. Prasad, An annotated overview of system-reliability optimization, IEEE Trans. Reliab. 49 (2000) 176–187.

[13] W. Kuo, R. Wan, Recent advances in optimal reliability allocation, IEEE Trans. Syst., Man Cybernet., Part A: Syst. Hum. 37 (2007) 143–156.

[14] Y.C. Liang, Y.C. Chen, Redundancy allocation of series–parallel systems using a variable neighborhood search algorithm, Reliab. Eng. Syst. Saf. 92 (2007) 323–331.

[15] Y.C. Liang, A.E. Smith, An ant colony optimization algorithm for the redundancy allocation problem (RAP), IEEE Trans. Reliab. 53 (2004) 417–423.

[16] Y. Nakagawa, S. Miyazaki, Surrogate constraints algorithm for reliability optimization problems with two constraints, IEEE Trans. Reliab. 30 (1981) 175–180.

[17] J. Onishi, S. Kimura, R.J.W. James, Y. Nakagawa, Solving the redundancy allocation problem with a mix of components using the improved surrogate constraint method, IEEE Trans. Reliab. 56 (2007) 94–101.

[18] L. Yao, Nonparametric learning of decision regions via the genetic algorithm, IEEE Trans. Syst., Man, Cybernet., Part B: Cybernet. 39 (1996) 313–321.

[19] W.C. Yeh, A two-stage discrete particle swarm optimization for the problem of multiple multi-level redundancy allocation in series systems, Exp. Syst. Appl. 36 (2009) 9192–9200.

[20] W.C. Yeh, Novel swarm optimization for mining classification rules on thyroid gland data, Inform. Sci. 197 (2012) 65–76.

[21] W.C. Yeh, Simplified swarm optimization in disassembly sequencing problems with learning effects, Comp. Operat. Res. 39 (2012) 2168–2177.

[22] W.C. Yeh, A new parameter-free simplified swarm optimization for artificial neural network training and its application in prediction of time series, IEEE Trans. Neural Netw. Learn. Syst. 24 (2013) 661–665.

[23] W.C. Yeh, T.J. Hsieh, Solving reliability redundancy allocation problems using an artificial bee colony algorithm, Comp. Operat. Res. 38 (2011) 1465–1473.

[24] W.C. Yeh, Y.C. Lin, Y.Y. Chung, M.C. Chih, A particle swarm optimization approach based on Monte Carlo simulation for solving the complex network reliability problem, IEEE Trans. Reliab. 59 (2010) 212–221.