

Journal of Manufacturing Systems Vol. 14/No. 3 1995

Neural Networks and the Part Family/ Machine Group Formation Problem in Cellular Manufacturing: A Framework Using Fuzzy ART Laura Burke and Soheyla Kamal, Lehigh University, Bethlehem, Pennsylvania

Abstract
We apply the fuzzy adaptive resonance theory (ART) neural network to the part family/machine group formation problem in cellular manufacturing. Previous neural network applications have demonstrated the potential role of competitive learning and ART networks in part family/machine cell formation, but they have a number of shortcomings. Fuzzy ART, based on a similarity measure from fuzzy set theory, shows great promise over other approaches. We present results for fuzzy ART applied to several test problems of part family formation and give an extension for systematically generating alternative solutions in the problem domain.

Keywords: Neural Networks, Cellular Manufacturing, Part Family Formation, Group Technology

1.0--Introduction
The purpose of this paper is to demonstrate the use of fuzzy adaptive resonance theory (ART) for machine cell and part family formation problems in group technology. Fuzzy ART1 belongs to the class of unsupervised, adaptive neural networks. A number of researchers have experimented with adaptive neural networks for cellular manufacturing, including Kao and Moon,2 Caudell et al.,3 Malave and Ramachandran,4 Dagli and Huggahalli,5 and Moon and Chi.6 Dagli and Huggahalli used ART1 in such problems, and Malave and Ramachandran used competitive learning. Fuzzy ART is the most recent adaptive resonance framework that provides a unified architecture for both binary and continuous valued inputs. While fuzzy ART operations reduce to ART1 (which accepts only binary inputs) as a special case, the parameters available to fuzzy ART and its handling of nonbinary inputs make it function quite differently than ART1. We will demonstrate the application of fuzzy ART to part family formation problems and focus on mechanisms for improving and interpreting its performance in that domain. Several factors motivate the use of fuzzy ART, as follows:

1. By using one of the ART networks rather than the simpler competitive learning system, we exploit important stability properties of the network. Unlike competitive learning, when a new part or machine is added to the system, ART networks can continue to learn (without forgetting past learning) and incorporate new information. Also, the vigilance parameter of adaptive resonance introduces an adaptive characterization of the goals of the problem. It also aids in identifying significantly different parts or machines, while competitive learning alone cannot.

2. Fuzzy ART, unlike ART1,7 does not require a completely binary representation of the parts to be grouped. While in this research we use only binary representations of parts, an important modification made to improve results relies on the ability to handle continuous inputs. Moreover, fuzzy ART possesses the same desirable stability properties as ART1 and a simpler architecture than that of ART2.8

3. As Moore9 showed, ART2, the version of ART for continuous valued inputs, can experience difficulty in achieving good categorizations if input patterns are not all normalized to constant length; however, such normalization can destroy valuable information. In addition, Dagli and Huggahalli5 discovered a serious dependency of classification results in the case of ART1 on the sequence of input presentation. On closer inspection, we determined that they have shown ART1 can also experience the problem described by Moore, termed category proliferation. Carpenter, Grossberg, and Rosen1 give methods for stemming category proliferation in fuzzy ART that are completely inappropriate in our domain, as we show. We provide a new method to improve performance of fuzzy ART for group technology.


The paper describes the application and ensuing modifications of fuzzy ART to the part family formation group technology problem. Section 2 presents background information on part family formation, and Section 3 briefly describes competitive learning based networks and their disadvantages. We give a detailed description of fuzzy ART in Section 4. Section 5 describes the application of neural networks to group technology problems, the shortcomings reported, and the issues in evaluating them. Results of application of fuzzy ART appear in Section 6, and we introduce a modification that improves performance of the original system in Section 7. This modification, while straightforward, requires inputs to take on continuous values and thus exploits fuzzy ART's ability to handle such cases. Conclusions are given in Section 8.

2.0--Part Family Formation Problem
Consumer trends and international competition have forced manufacturers away from simple material flow lines for mass production and toward small-batch production. Cellular manufacturing has been accepted as an effective means for allowing small batches to achieve similar goals as flow lines.10 Cellular manufacturing is based on grouping a set of machines together as a cell such that a group of parts can be processed from start to finish within this cell.11 The part family formation problem of group technology is the science of grouping parts and machines to accomplish cellular manufacturing.

In the part family formation problem, information is typically provided in the form of a part-machine matrix, A, with element Aij indicating whether part i requires machine j (Aij = 1) or not (Aij = 0). Alternatively, the transpose of the matrix, the machine-part matrix AT, has element ATji, which is a 1 if machine j is required by part i and 0 otherwise. The problem is then to find part families based on similarity of part design or manufacturing requirements. A single part family may possess a common set of machines that can provide the needed processing for all parts in the family. Similarly, the machine cell formation problem asks for machine cells within which a common set of parts can all be completely processed. Ideally, each part family will map to a unique machine cell, and the entire family need not ever leave the cell to complete all necessary processing. Practically, this may be either impossible or computationally infeasible to achieve. In most cases, the goal is actually to minimize either intercell moves for parts, once machine cells have been determined, or shared machines, once part families have been determined.
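As a concrete illustration (a hypothetical example, not taken from the paper), a small part-machine matrix with an ideal block-diagonal grouping can be written down directly:

```python
import numpy as np

# Hypothetical part-machine matrix A: rows = parts, columns = machines.
# A[i, j] = 1 if part i requires machine j, else 0.
A = np.array([
    [1, 0, 1, 0],  # part 0 needs machines 0 and 2
    [1, 0, 1, 0],  # part 1 needs machines 0 and 2
    [0, 1, 0, 1],  # part 2 needs machines 1 and 3
    [0, 1, 0, 1],  # part 3 needs machines 1 and 3
])

# The machine-part matrix is simply the transpose:
# A_T[j, i] = 1 if machine j is required by part i.
A_T = A.T

# Here the ideal mapping exists: part family {0, 1} maps to machine
# cell {0, 2} and family {2, 3} to cell {1, 3}, with no intercell moves.
```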

Two basic techniques for part family formation are: (1) classification and coding12-14 and (2) cluster analysis.15 Classification and coding methods are most useful for design information retrieval and rarely have the capability of finding part families for cellular manufacturing.11 Variations of the method include visual and coding techniques. Clustering-based methods are the methods used for grouping parts and machines for cellular manufacturing. By clustering methods, we mean all those that operate on similarity of parts. Thus, if similarity of physical attributes of parts or end product affiliation is the basis for grouping, we consider the philosophical approach to be that of clustering.

Cluster analysis techniques group objects (parts or machines) into homogeneous clusters (groups) based on object features. Existing analytical clustering approaches to group technology can be classified as: (1) matrix-based methods;16-20 (2) mathematical programming;21-25 (3) graph theory based methods;26,27 (4) pattern recognition techniques;28-30 (5) fuzzy logic approaches;31,32 (6) expert system based methods; and (7) neural network based methods.2-6,14

All of the above methods, with the exception of neural network based methods, are serial algorithms requiring significant time for processing. Moreover, these methods also require storage and manipulation of large matrices and virtually always focus on binary attributes. Although neural networks, at present, usually require simulation via serial algorithms, their potential for hardware implementation means they ideally require significantly less storage and processing time than conventional approaches. Particularly for the ART family of neural networks applied here, ultimate hardware implementation is a real possibility, further widening the gap between processing speed of neural and conventional approaches.

3.0--Unsupervised Neural Network Approaches
For the clustering problem of part family formation, unsupervised neural network methods lend themselves naturally to a solution methodology.


Unsupervised means that no feedback regarding the network's response is allowed during training; hence, the network tends to cluster data on the basis of similarity. The paradigm of interest here is the fuzzy ART system. Previous neural network approaches to part family formation have primarily used competitive learning, ART1, interactive activation, and backpropagation. Each of these networks functions differently than fuzzy ART, but competitive learning may be considered a predecessor of ART networks and hence shares some fundamental characteristics. Backpropagation is a supervised neural network paradigm and does not exploit the natural clustering aspect of the problem. We next describe the basic competitive learning network and the advantages of using ART networks over competitive learning.

A simple representation of the structure for competitive learning appears in Figure 1. The primary purpose of the competitive learning network is to find a clustering of patterns from a selected training set. Thus, if the training set consists of P patterns and there exist N clustering nodes, then up to N clusters will form for the set. In addition, the weight vectors provide information on the prototype pattern being stored for each cluster. When a clustering node, J, wins the competition by being "most similar" to the input (as measured by the dot product between node J's weight vector, WJ, and the input vector, X), its weight vector adapts to become "more similar" to the input, according to a fractional learning rate, β, as follows:

WJnew = WJold + β(X − WJold)    (1)

The learning rate is typically taken to be very small (β ≪ 1) or decreasing toward zero.

Assuming normalized inputs, the competitive learning structure can accept binary or continuous values. If the application requires amplitude information, then the network typically becomes more complex than described above. Fuzzy ART, as we will see, can accept patterns that need not be normalized to identical length, without increased structural complexity.
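A minimal sketch of one competitive learning step, as described above, might look as follows; the node count, dimensionality, and learning rate are arbitrary illustrative choices, and winner selection uses the dot product as in the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def competitive_step(W, x, beta=0.05):
    """One competitive-learning step: the node whose weight vector has
    the largest dot product with input x wins, and its weights move
    toward x by the fractional learning rate beta, per Eq. (1)."""
    j = int(np.argmax(W @ x))          # winner = "most similar" node
    W[j] = W[j] + beta * (x - W[j])    # WJnew = WJold + beta*(x - WJold)
    return j

# Hypothetical setup: 3 clustering nodes, 5-dimensional normalized input.
W = rng.random((3, 5))
x = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
winner = competitive_step(W, x)
```

Note that nothing in this loop can refuse an input: some node always wins, which is precisely the weakness the vigilance mechanism discussed next addresses.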

The simple form of competitive learning described above suffers from a number of inadequacies, in particular as perceived by Grossberg.33 The system is not guaranteed to remain stable in the face of numerous inputs and continuous learning; rather,


Figure 1 Simplified Representation of Competitive Learning Neural Network

an artificially determined "critical period"33 must exist where weights can adapt according to Eq. (1) and after which weights must be fixed. Most neural network models assume such a period. Even if the learning period is restricted, Grossberg notes that to guarantee a stable response the network requires a very small learning parameter, β. ART1 and ART2 were Carpenter and Grossberg's7,8 response to this perceived inadequacy.

What truly distinguishes competitive learning from adaptive resonance structures is the latter's so-called vigilance parameter. The vigilance parameter induces reset of a node. Originally proposed by Grossberg in 1976,33,34 the idea simply states that if a clustering node wins the competition for an input, X, and the similarity of the node's weight vector with X does not meet the vigilance parameter (a similarity measure), ρ, then the node is reset (turned off) and the pattern is sent to a new cluster. Adaptive resonance structures implement this mechanism in a physically realizable structure.7,8 The simple idea of vigilance allows the network to indicate that such a new input is significantly different from existing weight vectors, and it moreover contributes fundamentally to the stability of the network. Competitive learning does not implement vigilance, so a significantly different input pattern can be forced into an inappropriate cluster.

ART2 generalizes the ART1 architecture to analog input patterns. Like ART1, both top-down and bottom-up weights play an integral role in competition and reset. Additionally, normalization and noise suppression functions are utilized heavily. This leads to a


complete loss of amplitude information for analog patterns. Burke35 addressed the problem by introducing a scale variable that added a measure of amplitude as an input feature. While it is heuristic, the approach worked well in distinguishing patterns dissimilar in magnitude. In addition, the ART2 architecture requires a highly complex reset function: while the reset function of ART1 is similar in complexity to that of fuzzy ART (see Eq. (5) below), that of ART2 is far more involved (see Carpenter and Grossberg8).

4.0--Fuzzy ART
Fuzzy ART1 is an unsupervised category learning and pattern recognition network. It incorporates computations from fuzzy set theory36 into the ART based neural network. Fuzzy ART is capable of rapid, stable clustering of analog or binary input patterns. A simplified representation appears in Figure 2.

The network consists of two layers, the input (F1) and the output (F2) layer. The number of possible categories (output nodes) can be chosen arbitrarily large. At first, each category is said to be uncommitted; a category becomes committed after being selected to code an input pattern. Each input, I, is represented by an M-dimensional vector, I = (I1, ..., IM), where each component, Ii, takes a value in the interval [0,1]. One weight vector, Wj = (wj1, ..., wjM), is used to represent each output category, j. Initially, wj1 = wj2 = ... = wjM = 1 for all j. In the part family formation problem, an input vector corresponds to the processing requirements for a particular part; hence, Ii = 1 if the part requires processing on machine i and Ii = 0 otherwise.

To categorize input patterns, the output nodes receive net input in the form of a choice function, Tj, as follows:

Tj = |I ∧ Wj| / (α + |Wj|)    (2)

where ∧ is the fuzzy MIN operator1 defined as follows:

(X ∧ Y)i ≡ min(xi, yi)    (3)

and the norm |·| is defined as follows:

|X| ≡ Σi=1,...,M |xi|    (4)


Figure 2 Simplified Representation of Fuzzy ART Neural Network

The output node, J, with the highest value of Tj is the candidate to claim the current input pattern. These output nodes may be thought of as corresponding to part families; hence, each element, wjk, corresponds to the need, in part family j, for machine k. For node J to code the pattern (to claim it and update its weight vector in response to it), the match function must exceed the vigilance parameter, that is:

|I ∧ WJ| / |I| > ρ    (5)

Thus the choice function (to use Carpenter and Grossberg's7,8 terminology) replaces the typical similarity functions used in most competitive learning based networks (including previous versions of ART). Rather than a weighted sum or Euclidean distance, the choice function [Eq. (2)] extends the subsethood measure of fuzzy sets.36,37

A neural network realization of such an operation is presented in Carpenter, Grossberg, and Rosen.38 While we refer the reader to that paper for details, we note that the architecture requires the same number of connections as for a weighted summation transfer and suits a hardware implementation. This hardware implementability is an extremely important characteristic of the approach. The simulations conducted for this research cannot exploit this property, but they still demonstrate the speed of the system.

In the fast-learning mode, if the first candidate does not pass the similarity test, that node is "reset" by a mechanism described in detail in Carpenter and Grossberg.7,8 We represent the reset mechanism in


Figure 2 by a box labeled "Reset." Next, an uncommitted node should be committed to the input pattern. No other committed node can be more similar; therefore, it follows that only an uncommitted node can claim the pattern and satisfy the match criterion.38 The weight vector of the winning node, J, is updated as follows:

WJnew = β(I ∧ WJold) + (1 − β) WJold    (6)

Note that this equation can be rewritten in a form analogous to Eq. (1).

Fuzzy ART has three parameters: (1) the choice parameter, α > 0, which is suggested to be close to zero; (2) the learning parameter, β ∈ [0,1], which defines the degree to which the weight vector, WJ, is updated (recoded) with respect to an input vector claimed by node J; and (3) the vigilance parameter, ρ ∈ [0,1], which defines the required level of similarity of patterns within clusters. In the fast-learning mode, Carpenter, Grossberg, and Rosen1 suggest that β = 1. In the fast commit-slow recode mode, β = 1 for first-time commitments (fast learning/commitment) and β < 1 (slow recode) otherwise. We use the fast commit-slow recode option here. The Appendix gives a numerical example of the fuzzy ART algorithm.
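The procedure described above can be sketched in code. The following is a minimal reading of Eqs. (2), (5), and (6) in fast commit-slow recode mode; the parameter values and the four-part example are arbitrary illustrative choices, not the authors' implementation:

```python
import numpy as np

def fuzzy_art(patterns, rho=0.7, alpha=0.001, beta_recode=0.5, n_epochs=3):
    """Sketch of fuzzy ART, fast commit-slow recode: a node commits with
    beta = 1 (its weights become the input) and is thereafter recoded
    with beta_recode < 1."""
    W = []                       # one weight vector per committed category
    for _ in range(n_epochs):
        assign = []
        for I in patterns:
            I = np.asarray(I, dtype=float)
            # Choice function, Eq. (2): T_j = |I ^ W_j| / (alpha + |W_j|)
            T = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in W]
            chosen = None
            # Try committed nodes in decreasing order of choice value;
            # a node failing the match test is "reset" and skipped.
            for j in np.argsort(T)[::-1]:
                # Match function, Eq. (5): |I ^ W_j| / |I| vs. rho
                if np.minimum(I, W[j]).sum() / I.sum() >= rho:
                    chosen = int(j)
                    break
            if chosen is None:
                W.append(I.copy())   # fast commitment of a new category
                chosen = len(W) - 1
            else:
                # Slow recode, Eq. (6): Wnew = beta*(I ^ Wold) + (1-beta)*Wold
                W[chosen] = (beta_recode * np.minimum(I, W[chosen])
                             + (1 - beta_recode) * W[chosen])
            assign.append(chosen)
    return assign, W

# Two clearly distinct binary processing routes form two part families:
parts = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
families, prototypes = fuzzy_art(parts)   # families == [0, 0, 1, 1]
```

Because weights only shrink under the fuzzy MIN, repeated presentation leaves the clustering stable after a few epochs, which is the behavior the paper exploits.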

4.1--Scaling Issues
The careful reader will have observed that input attributes for fuzzy ART must lie between 0 and 1, suggesting the need for normalization of patterns. We earlier cited fuzzy ART's ability to handle nonnormalized input patterns. While the need for attribute values in the (0,1) interval seems contradictory, the typical approach to scaling data for neural networks accomplishes this requirement without destroying information on the relative magnitude of vectors. While the work reported here focuses on processing route information of parts as the basis of similarity, leading to binary input vectors, our "add" heuristic that improves fuzzy ART's performance leads to nonbinary attributes. We describe the add heuristic in Section 7.

By scaling data, we mean simply that the minimum and maximum value for each attribute is found and used to linearly scale the data. Thus, the ith attribute, xi, of an M-dimensional pattern is scaled between its minimum value, mini, and its maximum value, maxi, as follows:

xi,adj = (xi − mini) / (maxi − mini)    (7)

where xi,adj gives the scaled value.
Scaling is used in virtually all neural network applications to prevent saturation, and in this application it leads to useful properties. While relative magnitude information is not lost between patterns, attribute values restricted to the (0,1) range lead to more precise characterizations of mechanisms such as vigilance.

4.2--Category Proliferation
Moore9 described a category proliferation problem that occurs in ART2 in the fast-learn mode. Because ART weights are monotonically nonincreasing, the number of categories may proliferate, with each category consisting of only one or a very few patterns. If sparse input patterns (patterns with very few ones) appear early in processing, they can erode weight vectors and force patterns with higher norms into additional superfluous categories. For fuzzy ART in the fast-learn mode, Carpenter, Grossberg, and Rosen1 propose normalization by complement coding as one method of overcoming this problem.

Although a straightforward normalization such as:

IN = I / |I|    (8)

will succeed, such processing loses valuable amplitude information (Section 4.1). In an effort to save such information, complement coding transforms every M-dimensional input vector into a 2M-dimensional vector by appending the M-dimensional vector Ic, where:

Iic = 1 − Ii    (9)

and the input vector now becomes:

X ≡ (I, Ic) = (I1, ..., IM, I1c, ..., IMc)    (10)
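Eqs. (9) and (10) are a one-line transform; the sketch below (with an arbitrary example vector) also shows the property that makes complement coding a normalization: every coded vector has the same norm M.

```python
import numpy as np

def complement_code(I):
    """Complement coding, Eqs. (9)-(10): append I^c = 1 - I, giving a
    2M-dimensional vector X = (I, I^c) with constant norm |X| = M."""
    I = np.asarray(I, dtype=float)
    return np.concatenate([I, 1.0 - I])

I = np.array([0, 0, 1, 1, 0], dtype=float)
X = complement_code(I)
# |X| = sum(I) + sum(1 - I) = M, regardless of how many ones I contains.
```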

5.0--Unsupervised Neural Networks for Part Family Formation in Group Technology
In this section, we describe two previous similar approaches to the application of adaptive neural networks to part family formation. Other unsupervised


approaches, which will not be treated here, have utilized primarily the interactive activation network,2,6,39 and even some supervised approaches have surfaced. We focus on the particularly efficient unsupervised methods for the specific problems of part family and machine cell formation. Dagli and Huggahalli5 used ART1 for part family formation, and Malave and Ramachandran4 used competitive learning for part family/machine cell formation.

As Dagli and Huggahalli found, the category proliferation problem for ART1 can yield undesirable machine cell or part family group formation. As inputs, consisting of route information, are presented to the network, the weight vectors progressively diminish. This leads to the category proliferation problem described by Moore:9 fast-learn ART1 networks, due to the monotonically nonincreasing nature of weights, can cause the number of categories created to proliferate, leading to numerous categories consisting of only a very few patterns. In a 7 (parts) × 5 (machines) example constructed by Dagli and Huggahalli, an adequate solution consisted of two groups. ART1, however, produced three groups, one consisting of only one pattern. The inferior grouping results from the combination of an input pattern presentation sequence that has a diminishing effect on the weight vectors in ART and the choice of the vigilance parameter value. In addition, the binary nature of top-down weights contributes fundamentally to the problem.

Dagli and Huggahalli suggested heuristic methods to overcome the problem. In their first method, they suggested an alternate approach to the weight update of ART1. While this approach solved their small example, it was not clear how it affected the important stability properties of the network. It may, in fact, provide a useful alternative for the special cases arising in part family formation, but an in-depth analysis of this conjecture is needed. Their second solution was to present inputs in decreasing order of norm, where the norm is defined as the sum of the elements of the input vector. This approach also solved the problem because weights were not diminished as severely in response to the first vectors seen.

While these heuristic measures appeared to work for the small example given, it is obviously preferable that the network itself, rather than the external user, solve the problem. The complement coding option described by Carpenter, Grossberg, and Rosen1 is their attempt to address the category proliferation problem. We found, however, that it is the fundamental nature of fuzzy ART, together with slow recode, that can best overcome the problem of category proliferation in group technology applications. Our findings with and without complement coding using fuzzy ART appear in Section 6.

Malave and Ramachandran4 applied simple competitive learning to part family formation problems. Their results showed that the approach could find the same desirable groupings given by previous conventional approaches for two problems from Chan and Milner40 and King.41 In addition, they used the weight vectors produced by competitive learning to help determine machine cells, thus effecting "simultaneous solution" of the part family/machine cell formation problem. They cited the speed of competitive learning approaches compared to conventional methods and suggested the usefulness of neural networks for assignments of new parts to machine cells after initial groupings have been made by the network.

As mentioned earlier, competitive learning may not be well suited to assigning new parts to established families. A primary danger is that the method will force an assignment, even if no appropriate family exists. A better technique would indicate the need for a new family for significantly different parts. In addition, increased complexity is needed to incorporate distance measures or to avoid loss of amplitude information. Finally, competitive learning requires orders of magnitude more training cycles than ART due to forced slow learning (β ≪ 1).

6.0--Experimental Results Using Fuzzy ART
We next discuss the application of ART1 and fuzzy ART to six group technology problems. To assess the performance of any method, we count the number of machine cells (part families) formed for the machine cell (part family) formation problem, the number of intercell moves for parts (number of critical or shared machines), and the maximum cell (family) size. These measures allow us to compare how fuzzy ART performs relative to conventional as well as previous neural network approaches, and they are standard in assessing new part family formation techniques.

These measures tell only part of the story, though. Because neural networks require far less time and


storage than their conventional counterparts, their performance must be considered with this in mind. Further, the notions of fast learning and fast commitment render the ART networks (including fuzzy ART) considerably faster than even simple competitive learning. Malave and Ramachandran4 reported on the order of 100 iterations using competitive learning. With ART and fuzzy ART, fast learning or fast commitment enables the network to reach a stable clustering after two to three iterations, which corresponds to well under 10 seconds on a 486-based PC for five of the six problems here. (The sixth problem is discussed separately.)

Finally, as discussed earlier, adaptive resonance architectures possess an important advantage over competitive learning networks in the form of the vigilance parameter. The results discussed below involve one-time clusterings; as Malave and Ramachandran noted, these networks can and should also be used for assigning new machines (parts) to established cells (families). As described earlier, competitive learning is poorly suited to reliable responses in such a case.

6.1--Results with Complement Coding
Initially, we used complement coding as a means of overcoming the anticipated category proliferation problem; that is, we normalized inputs by complement coding and used fuzzy ART in the fast-learn mode (β = 1, fixed). For this choice, and also for the fast commit-slow recode choice (β < 1 after commitment), we found that complement coding actually exacerbated the category proliferation problem. In fact, more categories were formed with complement coding than without it. In addition, quite dissimilar inputs would join the same group due to an exaggerated effect of input norm. For instance, in one case the following two patterns joined the same group:

I1 = {00000110000}; {I1, I1ᶜ} = {00000110000 11111001111}

I2 = {00000001100}; {I2, I2ᶜ} = {00000001100 11111110011}

The inputs represent parts, and a 1 element in position j indicates that machine j is required by the part. To our way of thinking, the two patterns have zero overlap; however, they have considerable overlap in their "off" or zero elements. This is the additional consideration made by complement coding, and for the part family formation problem, it appears to weight too heavily the degree of overlap in the complement part of the processed pattern.
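The effect is easy to reproduce. The sketch below is our own illustration (function names are ours), using the standard fuzzy AND overlap |A ∧ B| = Σ min(aj, bj): the two parts share no machines, yet their complement-coded forms overlap heavily in the "off" bits.

```python
def complement_code(pattern):
    """Complement-code a binary pattern: append 1 - x for each element."""
    return pattern + [1 - x for x in pattern]

def fuzzy_and_norm(a, b):
    """|A ∧ B|: L1 norm of the component-wise minimum (fuzzy AND)."""
    return sum(min(x, y) for x, y in zip(a, b))

# The two parts from the text: zero overlap in required machines.
i1 = [0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0]
i2 = [0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0]

print(fuzzy_and_norm(i1, i2))                                    # 0: no shared machines
print(fuzzy_and_norm(complement_code(i1), complement_code(i2)))  # 7: large "off"-bit overlap
```

With eleven machines and only two required per part, the complement halves dominate the similarity measure, which is exactly the distortion described above.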

Unlike ART1 and ART2, the fuzzy ART network without complement coding can operate without normalizing inputs, and nonbinary weights can result. Thus, even without complement coding, fuzzy ART takes input norm into account, and our findings suggest that fuzzy ART with slow recode and without complement coding gives the best results for our problems.

6.2--Results Using Fuzzy ART

We tested fuzzy ART on six examples from the literature that have helped identify promising new techniques. The inputs are the processing routes of parts on machines, represented by part for part family formation and by machine for machine cell formation. In Table 1 we describe our results for the first five problems and compare them to the results achieved with conventional approaches.

For all problems, the performance of fuzzy ART matched or exceeded that of other methods listed. Special note is needed in the case of Burbidge's problem,⁴² which includes two machines (machines 6 and 8) that previous methods have had to remove from consideration because these machines process a large number of parts. Thus, results for all methods except ART1 and fuzzy ART do not include these machines. Note that fuzzy ART is the only method both able to process all data and obtain a good solution. Problem 5 (Dagli and Huggahalli⁵) illustrates the input sequence dependency of ART1 that did not appear to hamper fuzzy ART. Overall, fuzzy ART delivered reliable and high-quality solutions. All solutions emerge after one or two cycles of processing (well under 10 seconds on a 486-based PC), so the method is extremely efficient. Note that, for binary problems, fuzzy ART and ART1 have fundamental similarities. Still, fuzzy ART performed differently on problem 5 (which was contrived to show shortcomings of ART1).

Table 1
Summary of Examples

Example (machines × parts)           Approach                          Machine  Intercell  Max. Machine
                                                                       Cells    Moves      Cell Size
1. Chan and Milner¹⁰ (15 × 10)       (a) Chan and Milner¹⁰                3        0           5
                                     (b) Malave and Ramachandran⁴         3        0           5
                                     (c) Ballakur and Steudel¹¹           3        0           5
                                     (d) ART1                             3        0           5
                                     (e) Fuzzy ART                        3        0           5
2. Waghodekar and Sahu⁴⁰ (5 × 7)     (a) Waghodekar and Sahu⁴⁰            2        2           3
                                     (b) King and Nakornchai⁴¹            2        3           3
                                     (c) Ballakur and Steudel¹¹           2        2           3
                                     (d) ART1                             2        2           3
                                     (e) Fuzzy ART                        2        2           3
3. King²⁰ (14 × 24)                  (a) King²⁰                           4        2           4
                                     (b) Malave and Ramachandran⁴         4        2           4
                                     (c) ART1                             4        2           4
                                     (d) Fuzzy ART                        4        2           4
4. Burbidge⁴² (16 × 43)              (a) Chan and Milner¹⁰                5        3           4
   (approaches (a)-(c) solved        (b) Burbidge⁴²                       5        3           4
   without machines 6 and 8)         (c) Ballakur and Steudel¹¹           5        3           4
                                     (d) ART1*                            6       10           4
                                     (e) Fuzzy ART*                       5        3           4
5. Dagli and Huggahalli⁵ (5 × 7)     (a) Dagli and Huggahalli⁵*           3        2           4
                                     (b) Dagli and Huggahalli⁵†           2        1           4
                                     (c) ART1*                            3        2           4
                                     (d) Fuzzy ART*                       2        1           4

* Without changing order of inputs.  † With changing order of inputs.

A sixth example was used to further demonstrate the efficacy of the approach, especially for a much larger problem size. Chandrasekharan and Rajagopalan²⁷ introduced an algorithm they termed "ZODIAC: Zero One Data Ideal Seed Algorithm for Clustering" for part family formation. They solved a 100-part × 40-machine problem using their algorithm, which requires three phases to complete the part family formation problem. The problem solved was the largest found in the literature.

The 100 part × 40 machine incidence matrix is shown in Table 2. The block diagonal form of the solution obtained by Chandrasekharan and Rajagopalan appears in Table 3. By applying various values of the network parameters, we obtained a set of solutions. The 10-cluster solution, identical to the one found by Chandrasekharan and Rajagopalan, had the following performance measures: 36 shared machines and 37 intercellular movements; however, we also found a nine-cluster solution with 34 shared machines and 37 intercellular movements. The time required to process the problem using fuzzy ART was less than 1 minute on a RISC workstation.
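For readers who wish to score their own groupings, one plausible formalization of the two performance measures is sketched below. The paper does not give explicit formulas, so the definitions and function names here are our assumptions: an intercellular movement is counted whenever a part requires a machine outside its own cell, and a machine is "shared" when parts from more than one cell use it.

```python
def cell_metrics(incidence, part_cell, machine_cell):
    """Score a cell-formation solution on a binary part x machine incidence matrix.

    Hypothetical formalization of the paper's two measures:
      - intercell moves: (part, machine) uses where the machine lies outside
        the part's assigned cell
      - shared machines: machines used by parts from more than one cell
    """
    parts, machines = len(incidence), len(incidence[0])
    intercell = 0
    users = [set() for _ in range(machines)]   # cells whose parts use each machine
    for p in range(parts):
        for m in range(machines):
            if incidence[p][m]:
                users[m].add(part_cell[p])
                if machine_cell[m] != part_cell[p]:
                    intercell += 1
    shared = sum(1 for u in users if len(u) > 1)
    return shared, intercell

# Tiny illustration: two cells; part 2 crosses into cell 0 via machine 1.
inc = [[1, 1, 0, 0],
       [1, 1, 0, 0],
       [0, 1, 1, 1],
       [0, 0, 1, 1]]
print(cell_metrics(inc, [0, 0, 1, 1], [0, 0, 1, 1]))   # (1, 1)
```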

7.0--Heuristic Framework for Using Fuzzy ART in Part Family Formation

While the results above appear quite promising, several shortcomings of the original fuzzy ART algorithm emerge. First, fuzzy ART can suffer from category proliferation as described earlier, and complement coding cannot be used for this problem. Second, practical cellular manufacturing issues dictate the need for a more flexible, user-interactive approach to generating machine cells (groups). As in our sixth example, fuzzy ART can be used to generate a number of different solutions, but only in a trial-and-error manner. Together with these issues, the difficulty of using vigilance to control the number of machine cells inspired development of a more comprehensive approach.

Recall that the category proliferation problem occurs when an ART network of any type sees input patterns with small norms, causing weight values to decrease significantly early in processing. Later, when input patterns with larger norms appear, new categories are formed because weight values cannot increase. Complement coding solves the problem, but yields highly unsatisfactory results in the group technology domain. Original fuzzy ART and fast learn/slow recode processing overcome the problem


Table 2
Incidence Matrix for Chandrasekharan's Example

[100-part × 40-machine binary incidence matrix; a 1 in row m, column p indicates that part p requires machine m. The matrix is not recoverable from the scanned original.]

Table 3
Block Diagonal Form for Chandrasekharan's Example

[Block diagonal rearrangement of the incidence matrix into the 10-cluster solution; not recoverable from the scanned original.]

Shared machines = 36; Intercellular movements = 37


(at least in Example 5, fuzzy ART under such conditions did not experience the problem as ART1 did).
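The shrink-only dynamics behind category proliferation can be seen in a two-line sketch. This is our own illustration (`fast_learn` is our name for the β = 1 update w ← I ∧ w): once a small-norm input drives a node's weights down, no later input can raise them.

```python
def fast_learn(w, i):
    """ART fast-learn update: weights move to the fuzzy AND, so they can only shrink."""
    return [min(wj, ij) for wj, ij in zip(w, i)]

w = [1, 1, 1, 1, 1]                 # freshly committed node
w = fast_learn(w, [1, 0, 0, 0, 0])  # small-norm input drives |w| down to 1
print(sum(w))                       # 1
# A later large-norm input such as [1, 1, 1, 1, 0] cannot raise |w| again,
# so it may fail the match test and be forced into a new category.
print(fast_learn(w, [1, 1, 1, 1, 0]))   # [1, 0, 0, 0, 0]
```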

While we found that, for group technology problems, fuzzy ART did not experience category proliferation, the same mechanism that causes category proliferation can hamper fuzzy ART's ability to provide good solutions. A fortuitous choice for the vigilance parameter, ρ, plays a large role in generating good solutions. To generate alternative solutions, the user must turn to ad hoc methods of tuning vigilance (which are unreliable).

We propose here a systematic approach for addressing the problems described above. The "add" method allows a user to construct a hierarchy of solutions. For example, a user may wish to form a limited number of machine cells due to external constraints. The tradeoff between the value of the vigilance parameter and the number of cells formed can make it difficult to obtain a good grouping under such conditions. Our approach is to use fuzzy ART to generate a hierarchy of alternative clusterings from which the best can be chosen.

To yield such solutions, we propose a heuristic framework for using fuzzy ART in which inputs are progressively merged until the fewest number of cells that can be formed or that are desired is reached. Let N* represent the minimum desired number of cells. The algorithm for this method, which we call the add method, is as follows:

0. Set k = 0.
1. For vigilance ρ_k, group inputs by fuzzy ART. Call the resulting number of groups n_k, and the elements of group i {e_ij}.
2. Form n_k new patterns by using vector addition to form element E_i = vector sum(e_ij). Scale inputs.
3. If either n_k ≤ N* or n_k = 1, stop. Otherwise, set k = k + 1, select vigilance ρ_k < ρ_(k-1) and/or select β_k < β_(k-1), and go to 1. Notice that if we return to step 1, we use a new fuzzy ART network in which all weights are reinitialized (w_j1 = ... = w_jM = 1 for every node j).

Note that the merging and rescaling of inputs leads to nonbinary values for input elements. While the idea described above is simple, it could not be implemented on an ART1 network (because merging and rescaling leads to continuous feature values) nor implemented effectively on an ART2 network (due to loss of relative magnitude information).
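Step 2's merge-and-rescale can be sketched as follows. The paper says only "Scale inputs" without specifying the rule, so dividing by the largest element across the merged patterns is an assumption made for illustration, and the function name is ours; `labels` is the group index fuzzy ART assigned to each input.

```python
def merge_groups(inputs, labels):
    """Add-method step 2 (sketch): vector-sum the members of each group, then
    rescale so every merged pattern lies in [0, 1].

    Assumption: scaling divides by the largest element of any merged pattern."""
    groups = {}
    for vec, g in zip(inputs, labels):
        acc = groups.setdefault(g, [0.0] * len(vec))   # running vector sum per group
        for j, v in enumerate(vec):
            acc[j] += v
    merged = list(groups.values())
    peak = max(max(m) for m in merged)                 # assumes at least one nonzero entry
    return [[v / peak for v in m] for m in merged]

# Two inputs merge into one nonbinary pattern; the third stays alone.
print(merge_groups([[1, 0, 1], [1, 1, 0], [0, 0, 1]], [0, 0, 1]))
# [[1.0, 0.5, 0.5], [0.0, 0.0, 0.5]]
```

The merged patterns are continuous, which is why the method needs fuzzy ART rather than ART1, as noted above.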

This method yields a hierarchy of possible solutions corresponding to varying numbers of groups. Because the optimal number of such groups remains a variable requiring determination, the generation of a list of possible solutions with different numbers of groups is desirable. For example, we use the add method on King's²⁰ problem with 14 machines and 24 parts. Initially, eight machine cells result. For the second round, input vectors are merged, and fuzzy ART inputs become continuous rather than binary. This round produces only six machine cells. In the final round, four cells are produced, and the solution is identical to King's. For the various solutions, tradeoffs exist between the number of shared machines and intercell movements and the number of cells. Thus, the hierarchy generated may be useful for investigating alternative solutions.

8.0--Conclusion

The neural network clustering technique presented in this paper is a general-purpose technique, applied here to the part family formation problem in cellular manufacturing. The technique could also be applied to other group technology domains, such as design, service, sales, and purchasing. The key point is that for each application, similarity attributes should be defined based on the application's requirements.

Neural network approaches to group technology have solved, to varying degrees, problems of speed and flexibility. The fuzzy ART approach virtually always offers solutions within 10 cycles (usually two to three cycles) of processing, a matter of seconds on a 486-based PC; it offers flexibility in its ability to handle a variety of attribute representations; and its incorporation of the vigilance parameter endows it with both highly attractive stability properties and a practical ability to reliably group new parts (or indicate their unsuitability for existing groups).

Fuzzy ART performs comparably to a number of other methods in the literature. Our heuristic framework for automating the use of fuzzy ART, the add method, relieves it of sequence dependency and improves its ability to generate high-quality solutions for varying numbers of cells or families.

Additional issues that future work will address include the interpretation and utilization of the weight vectors yielded by fuzzy ART, characterization of the objective driving the fuzzy ART neural network for group technology, and the resultant evaluation function that may be used to select a solution from among several. Also, fuzzy ART's ability to handle nonbinary values will be explored in application to new kinds of mixed group technology problems.

Appendix A--Fuzzy ART Example

Four inputs will be presented in sequential order, as follows:

I1 = 10110
I2 = 01011
I3 = 10100
I4 = 01001

Initially, all nodes are uncommitted. All uncommitted nodes i have weight vectors Wi = 11111. Thus, initially only node 1 will be considered. Then, as node j becomes committed, node (j + 1) represents all uncommitted weight vectors. Initially β = 1; subsequent values β = 0.5.

Cycle 1

Presentation 1 (I1 = 10110):
T1 = |I1 ∧ W1| / (|W1| + α) = 3/5.01
Vigilance check: |I1 ∧ W1| / |I1| = 3/3 = 1 > ρ
Node 1 becomes committed. Because β = 1 initially, new W1 = 10110.

Presentation 2 (I2 = 01011):
T1 = 1/3.01, T2 = 3/5.01
Node 2 "wins" (T2 > T1). Vigilance check: 3/3 = 1 > ρ
Node 2 becomes committed. W2 = 01011.

Presentation 3 (I3 = 10100):
T1 = 2/3.01, T2 = 0/3.01, T3 = 2/5.01
Node 1 "wins." Vigilance check: 2/2 > ρ
Weight update leads to new W1 = 1 0 1 .5 0.

Presentation 4 (I4 = 01001):
T1 = 0/3.01, T2 = 2/3.01, T3 = 2/5.01
Node 2 "wins." Vigilance check: 2/2 > ρ
Weight update leads to new W2 = 0 1 0 .5 1.

Cycle 2

Presentation 1 (I1 = 10110):
T1 = 2.5/2.51, T2 = 0.5/2.51, T3 = 3/5.01
Node 1 "wins." Vigilance check: 2.5/3 > ρ
W1 will not change.

Presentation 2 (I2 = 01011):
T1 = 0.5/2.51, T2 = 2.5/2.51, T3 = 3/5.01
Node 2 "wins." Vigilance check: 2.5/3 > ρ
W2 will not change.

Presentation 3 (I3 = 10100):
T1 = 2/2.51, T2 = 0/2.51, T3 = 2/5.01
Node 1 "wins." Vigilance check: 2/2 > ρ
Weight update leads to new W1 = 1 0 1 .25 0.

Presentation 4 (I4 = 01001):
T1 = 0/2.51, T2 = 2/2.51, T3 = 2/5.01
Node 2 "wins." Vigilance check: 2/2 > ρ
Weight update leads to new W2 = 0 1 0 .25 1.

Note that if ρ = 0.9, the vigilance checks in cycle 2, presentations 1 and 2 (2.5/3 ≈ 0.83 < 0.9), would lead to reset and identification of new uncommitted nodes.
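The walkthrough above can be reproduced with a compact fuzzy ART sketch. This is our own code, not the authors'; α = 0.01 is inferred from the denominators 5.01, 3.01, and 2.51, and, as stated above, β = 1 on first commitment with β = 0.5 afterwards.

```python
def fuzzy_art(inputs, rho=0.5, alpha=0.01, beta=0.5, cycles=2):
    """Minimal fuzzy ART (no complement coding) in the appendix's fast-commit
    (beta = 1 on commitment) / slow-recode (beta < 1 after) mode."""
    n = len(inputs[0])
    weights = []                            # committed category prototypes
    labels = []
    for _ in range(cycles):
        labels = []
        for I in inputs:
            # candidates: every committed node plus one uncommitted node (all ones)
            cands = weights + [[1.0] * n]
            scored = []
            for j, w in enumerate(cands):
                overlap = sum(min(a, b) for a, b in zip(I, w))   # |I ∧ w|
                scored.append((overlap / (alpha + sum(w)), j, overlap))
            # take the highest choice value T whose match passes vigilance
            for T, j, overlap in sorted(scored, key=lambda s: (-s[0], s[1])):
                if overlap / sum(I) >= rho:
                    break
            if j == len(weights):           # uncommitted winner: fast commit (beta = 1)
                weights.append(list(I))
            else:                           # committed winner: slow recode
                weights[j] = [beta * min(a, b) + (1 - beta) * b
                              for a, b in zip(I, weights[j])]
            labels.append(j)
    return weights, labels

weights, labels = fuzzy_art([[1, 0, 1, 1, 0], [0, 1, 0, 1, 1],
                             [1, 0, 1, 0, 0], [0, 1, 0, 0, 1]], rho=0.5)
print(labels)       # [0, 1, 0, 1]
print(weights[0])   # [1.0, 0.0, 1.0, 0.25, 0.0]
```

With ρ = 0.5, two nodes form and the final weights match the end of cycle 2: W1 = 1 0 1 .25 0 and W2 = 0 1 0 .25 1.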

Acknowledgments

This research was supported in part by NSF grant ECS-9110846 and by an NSF National Young Investigator Award.

References

1. G.A. Carpenter, S. Grossberg, and D.B. Rosen, "Fuzzy ART: Fast Stable Learning and Categorization of Analog Patterns by an Adaptive Resonance System," Neural Networks (v4, 1991), pp759-771.
2. Y. Kao and Y.B. Moon, "A Unified Group Technology Implementation Using the Backpropagation Learning Rule of Neural Networks," Computers & Industrial Engineering (v20, n4, 1991), pp425-437.
3. T.P. Caudell, D.G. Smith, G.C. Johnson, and D.C. Wunsch II, "An Application of Neural Networks to Group Technology," SPIE (v1469, Applications of Artificial Neural Networks II, 1991), pp612-621.
4. C.O. Malave and S. Ramachandran, "Neural Network-Based Design of Cellular Manufacturing Systems," Journal of Intelligent Manufacturing (v2, 1991), pp305-314.
5. C. Dagli and R. Huggahalli, "Neural Network Approach to Group Technology," Knowledge-Based Systems and Neural Networks: Techniques and Applications (New York: Elsevier, 1991), pp213-228.
6. Y.B. Moon and S.C. Chi, "Generalized Part Family Formation Using Neural Network Techniques," Journal of Manufacturing Systems (v11, n3, 1992), pp149-159.
7. G.A. Carpenter and S. Grossberg, "A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine," Computer Vision, Graphics, and Image Processing (v37, 1987), pp54-115.
8. G.A. Carpenter and S. Grossberg, "ART2: Self-Organization of Stable Category Recognition Codes for Analog Input Patterns," Applied Optics (v26, 1987), pp4919-4930.
9. B. Moore, "ART1 and Pattern Clustering," Proceedings of the 1988 Connectionist Models Summer School, D. Touretzky, G. Hinton, and T. Sejnowski, eds. (San Mateo, CA: Morgan Kaufmann Publishers, 1989).
10. H.M. Chan and D.A. Milner, "Direct Clustering Algorithm for Group Formation in Cellular Manufacturing," Journal of Manufacturing Systems (v1, n1, 1982), pp65-74.
11. A. Ballakur and H.J. Steudel, "A Within-Cell Utilization Based Heuristic for Designing Cellular Manufacturing Systems," International Journal of Production Research (v25, n5, 1987), pp639-665.
12. J.A. Keus, C.P. Rome, and G.J. Van Zoelen, "Implementation of the Group Technology Concept for the Manufacture of 544 Machine Parts for Electro-Mechanical Products with the Aid of the MICLASS-Package," CIRP Manufacturing Systems (v6, 1977), p167.
13. M.P. Groover, Automation, Production Systems, and Computer-Integrated Manufacturing (Englewood Cliffs, NJ: Prentice-Hall, 1987).
14. S. Kaparthi and N.C. Suresh, "Machine-Component Cell Formation in Group Technology: A Neural Network Approach," International Journal of Production Research (v30, n6, 1992), pp1353-1367.
15. A. Kusiak, Intelligent Manufacturing Systems (Englewood Cliffs, NJ: Prentice-Hall, 1990).
16. J. McAuley, "Machine Grouping for Efficient Production," The Production Engineer (Feb. 1972), pp53-57.
17. H. Seifoddini and P.M. Wolfe, "Application of the Similarity Coefficient Method in Group Technology," IIE Transactions (v18, n3, 1986), pp271-277.
18. J. Dewitte, "The Use of Similarity Coefficients in Production Flow Analysis," International Journal of Production Research (v18, n4, 1980), pp503-514.
19. W.T. McCormick, P.J. Schweitzer, and T.W. White, "Problem Decomposition and Data Reorganization by Cluster Technique," Operations Research (v20, n5, 1972), pp993-1009.
20. J.R. King, "Machine-Component Group Formation in Production Flow Analysis: An Approach Using a Rank Order Clustering Algorithm," International Journal of Production Research (v18, n2, 1980), pp213-232.
21. A. Kusiak and W.S. Chow, "An Algorithm for Cluster Identification," IEEE Transactions on Systems, Man, and Cybernetics (vSMC-17, n4, 1987), pp696-699.
22. A. Kusiak and W.S. Chow, "Efficient Solving of the Group Technology Problem," Journal of Manufacturing Systems (v6, n2, 1987), pp117-124.
23. A. Kusiak, A. Vannelli, and K.R. Kumar, "Clustering Analysis: Models and Algorithms," Control and Cybernetics (v15, n2, 1986), pp139-154.
24. G. Srinivasan, T.T. Narendran, and B. Mahadevan, "An Assignment Model for the Part-Families Problem in Group Technology," International Journal of Production Research (v28, n1, 1990), pp145-152.
25. R.G. Askin and K.S. Chiu, "A Graph Partitioning Procedure for Machine Assignment and Cell Formation in Group Technology," International Journal of Production Research (v28, n8, 1990), pp1555-1572.
26. R. Rajagopalan and J.L. Batra, "Design of Cellular Production Systems: A Graph Theoretic Approach," International Journal of Production Research (v13, 1975), p567.
27. M.P. Chandrasekharan and R. Rajagopalan, "An Ideal Seed Non-Hierarchical Clustering Algorithm for Cellular Manufacturing," International Journal of Production Research (v24, n2, 1986), pp451-464.
28. J. Peklenik, J. Grum, and B. Logar, "An Integrated Approach to CAD/CAPP/CAM and Group Technology by Pattern Recognition," 16th CIRP International Seminar on Manufacturing Systems (Tokyo: July 1984).
29. B. Logar and J. Peklenik, "Computer-Aided Selection of Reference Parts for GT-Part Families," 19th CIRP Manufacturing Systems Seminar (University Park, PA: Pennsylvania State University, June-July 1987).
30. B. Mutel, H. Garcia, and J.M. Proth, "Automatic Classification of Production Data," 18th CIRP Manufacturing Systems Seminar (Stuttgart, Germany: June 1986).
31. J. Li, Z. Ding, and W. Lei, "Fuzzy Cluster Analysis and Fuzzy Recognition Methods for Formation of Part Families," Proceedings of the North American Manufacturing Research Institution of SME (Vol. XIV, 1986).
32. D. Ben-Arieh and E. Triantaphyllou, "Quantifying Data for Group Technology with Weighted Fuzzy Features," International Journal of Production Research (v30, n6, 1992), pp1285-1299.
33. S. Grossberg, "Adaptive Pattern Classification and Universal Recoding, I: Parallel Development and Coding of Neural Feature Detectors," Biological Cybernetics (v23, 1976), pp121-134.
34. S. Grossberg, "Adaptive Pattern Classification and Universal Recoding, II: Feedback, Expectation, Olfaction, and Illusions," Biological Cybernetics (v23, 1976), pp187-202.
35. L.I. Burke, "Automated Identification of Tool Wear States in Machining Processes: An Application of Self-Organizing Neural Networks," PhD thesis (Berkeley, CA: University of California-Berkeley, 1989).
36. L. Zadeh, "Fuzzy Sets," Information and Control (v8, 1965), pp338-353.
37. B. Kosko, Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence (Englewood Cliffs, NJ: Prentice-Hall, 1992).
38. G.A. Carpenter, S. Grossberg, and D.B. Rosen, "A Neural Network Realization of Fuzzy ART," Technical Report CAS/CNS-91-021 (Boston: Boston University, 1991).
39. Y.B. Moon, "An Interactive Activation and Competition Model for Machine-Part Family Formation in Group Technology," International Joint Conference on Neural Networks (v2) (Washington, DC: Jan. 1990), pp667-670.
40. P.H. Waghodekar and S. Sahu, "Machine Component Cell Formation in Group Technology: MACE," International Journal of Production Research (v22, 1984), pp937-948.
41. J.R. King and V. Nakornchai, "Machine-Component Group Formation in Group Technology," International Journal of Production Research (v20, 1982), pp117-133.
42. J.L. Burbidge, The Introduction of Group Technology (New York: John Wiley & Sons, Halsted Press, 1975).

Authors' Biographies

Laura Burke is an associate professor of industrial and manufacturing systems engineering at Lehigh University. She received her PhD from the University of California-Berkeley in 1989. Her research interests include neural networks, manufacturing systems, and logistics.

Soheyla Kamal received her MS in 1987 from Louisiana State University-Baton Rouge and her PhD from Lehigh University in 1993, both in industrial engineering. Her area of interest is the application of neural networks in manufacturing engineering.
