
Six Sigma Quality Metric vs. Taguchi Loss Function

Luis Arimany de Pablos, Ph.D.

www.calidad-seis-sigma.com

Spain

    Summary

The aim of this paper is to present the Genichi Taguchi metric (the Loss Function) as compared to the Six Sigma metric (six sigma = 3.4 defects per million, five sigma = 233 defects per million, four sigma = 6,210 defects per million, and so on).

On comparing the two metrics, it seems that from the point of view of society (customers, partners, stockholders, suppliers, employees, etc.) a centred process working at four sigma may very well be preferred to an off-target (m + 1.5σ) process working at six sigma, even if the latter produces 3.4 defective per million and the former 32 defective per million (almost tenfold).

The six sigma assumption of a 1.5σ shift in the mean of the process is investigated through the reasons given in the literature, and the statement that says "Since Control Charts will easily detect any process shift of this size (1.5σ) in a single sample, the 3.4 PPM represents a very conservative upper bound on the non-conformance rate" is critically evaluated.

The question remains in the air as to whether the effort of going from 4σ to 6σ, under these circumstances, is worth the cost it entails.

    Introduction


The term six sigma was coined by Motorola Corp. in the early eighties, when Motorola's CEO at the time, Bob Galvin, started the company along the quality path. After Motorola's director of quality, Richard Buetow, and other company executives made a trip to Japan, the company adopted the six sigma programme, which reduced defects by 99.7% and saved the company US$11 billion from 1987 to 1996.

In 1988, Motorola Corp. became one of the first companies to receive the Malcolm Baldrige National Quality Award.

Dr. Joseph Juran, the world-renowned quality guru, will present the American Society for Quality's newest award, the Juran Medal, to Robert W. Galvin at the ASQ Annual Business Meeting, May 6, 2001, at the Charlotte (NC) Convention Centre. The Juran Medal recognizes those who exhibit distinguished performance in a sustained role as an organizational leader, personally practicing the key principles of quality and demonstrating breakthrough management.

Citibank, the international financial division of Citicorp, undertook the six sigma method in the spring of 1997. Its goal: to reduce defects within its various divisions by a factor of 10 during the first three years. The corporation has already seen reductions of five to ten times. General Electric, which launched a six sigma initiative in late 1995, says the $300 million invested in quality improvements in 1997 will deliver some $400 to $500 million in savings.

Besides Motorola and G.E., Dupont, Black & Decker, Wipro Corp. and many other companies claim that, using the six sigma methodology, 3.4 defects per million opportunities can be obtained: not perfection, but almost.

    The Six Sigma Metric

In classical (Shewhart) Statistical Process Control (SPC), control charts determine whether the process is in a state of statistical control, i.e., statistically uniform or stable. They do not tell us whether the process is meeting specifications and producing good products. It is necessary to get the process both in control and within specifications.

A product (or service) is said to be defective if it lies outside the specification interval, i.e., x > upper specification limit (USL) or x < lower specification limit (LSL).

Let us assume that the target value for the mean of the process is at the midpoint of the specification interval [m = (USL + LSL)/2]. From the control chart we determine the standard deviation σ of the process, and we define the PROCESS CAPABILITY as 6σ. This is a measure of the repeatability of the process and is commonly called the 6-sigma range for individuals.

We assume that the distribution of our quality variable is normal; accordingly, its values will lie inside the interval m ± 3σ 99.73% of the time, and if we set the specification limits at m ± 3σ we will get on average 0.27% defective values, or 2.7 per thousand, or 2,700 defects per million. If one considers only one tail, there will be 1,350 ppm failures.

The six sigma method states that, under the above conditions, to obtain 1,350 defective per million is rather high. Thus, we should reduce the variation so that, in order to stay within the same specification limits as before, the natural tolerances of the process are set at m ± 6σ (hence the name Six Sigma). This is to work with a 12-sigma range for individuals. If this is the case, the process will produce 0.00198 defective per million, approximately 2 defective per billion, or 0.001 defective per million (one defective per billion) if you consider only one tail.

But this is not what six sigma people tell us. They claim that working with the 6σ methodology you get 3.4 defective per million. How can this be, if the exact figure is 0.002 ppm (or 0.001 ppm, if we consider only one tail)?
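These tail figures follow directly from the standard normal distribution. The following minimal sketch (in Python with scipy, an assumed dependency rather than anything used in the paper) verifies them:

```python
# Verify the centred-process tail figures quoted above.
from scipy.stats import norm

for k in (3, 6):
    one_tail = norm.sf(k)        # P(x > m + k*sigma) for a centred normal process
    two_tail = 2 * norm.sf(k)    # P(|x - m| > k*sigma)
    print(f"{k} sigma: {two_tail * 1e6:.5f} ppm (two tails), "
          f"{one_tail * 1e6:.5f} ppm (one tail)")

# 3 sigma: ~2699.8 ppm (two tails), ~1349.9 ppm (one tail)
# 6 sigma: ~0.00197 ppm (two tails), ~0.00099 ppm (one tail)
```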

Once the specification limits are set, the process is under control, and the target value is at the midpoint of the specification interval, one can calculate an index, Cp, or Potential Capability Index, defined as:

$$C_p = \frac{\text{specification range}}{\text{process capability}} = \frac{USL - LSL}{6\sigma}$$

A process is said to be capable if Cp ≥ 1. The greater the Cp, the better the process meets specifications, provided the process mean is at the target value m. Cp not only tells us whether the process spread is small enough to allow us to meet specifications; it also tells us by what amount (what factor) our process quality has the potential to show excellence beyond the minimum specification requirements.

We have already seen that the Six Sigma method tries to reduce the variation of the process so that the specification limits are at m ± 6σ. Cp will be, in this case:

$$C_p = \frac{12\sigma}{6\sigma} = 2$$
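As a minimal illustration of the index just defined (a sketch, not part of the original paper; the helper name and the unit convention m = 0, σ = 1 are the example's own):

```python
def cp(usl: float, lsl: float, sigma: float) -> float:
    """Potential Capability Index: specification range over process capability."""
    return (usl - lsl) / (6 * sigma)

# Specification limits at m +/- 3 sigma give the classical Cp = 1 (m = 0, sigma = 1):
print(cp(3.0, -3.0, 1.0))  # 1.0
# Specification limits at m +/- 6 sigma give the six sigma value Cp = 2:
print(cp(6.0, -6.0, 1.0))  # 2.0
```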

In the table below we can see the six sigma metric and its one-to-one correspondence between the Cp index and the right-hand tail probability.

Table 1: Cp and ppm defective (Process Centred at Target)

  Process | LSL  | USL  | Cp   | Right-hand ppm defective
  1σ      | m-σ  | m+σ  | 0.33 | 158,655
  2σ      | m-2σ | m+2σ | 0.66 | 22,750
  3σ      | m-3σ | m+3σ | 1    | 1,350
  4σ      | m-4σ | m+4σ | 1.33 | 31.686
  5σ      | m-5σ | m+5σ | 1.66 | 0.287
  6σ      | m-6σ | m+6σ | 2    | 0.001

We have to note that Cp only addresses the spread of the process. It only gives an indication as to whether or not the process is potentially capable of meeting specifications. It does not give any indication as to whether or not the process actually meets specifications.

If the mean of the process, let us call it x̄, is not at the target value m, the process may not be satisfactory, because it is making products or providing services beyond the specification limits. Cp does not reflect this. Cp only reflects whether or not the process variation would be acceptable for a perfectly controlled process. Hence, another index is needed to describe how well the process has demonstrated conformity to specifications and to tell us how well the process has narrowed around the midpoint of the specifications.


The process capability index that accomplishes this is called Cpk. If Cpk is equal to or larger than 1, the closest specification limit is far enough from the process centre that very few products are being made beyond specifications. Cpk is defined as:

$$C_{pk} = \frac{USL - \bar{x}}{3\sigma},$$

where:

x̄ = process mean,
3σ = half the process capability,
USL = Upper Specification Limit.

Cpk is an index that measures how narrow the process spread is compared to the specification spread, tempered by how well the process centres around the midpoint of the specification interval, given that the target value is at this midpoint.
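A matching sketch for Cpk (again illustrative; the function name and the convention m = 0, σ = 1 are assumptions of the example):

```python
def cpk(usl: float, xbar: float, sigma: float) -> float:
    """One-sided Cpk as defined above, for a mean shifted towards the USL."""
    return (usl - xbar) / (3 * sigma)

# USL at m + 6 sigma with the mean shifted to m + 1.5 sigma (m = 0, sigma = 1):
print(cpk(6.0, 1.5, 1.0))  # 1.5, the six sigma value
```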

To simplify the exposition, let us assume that the process mean x̄ has shifted from the target value m, say to the right, a distance of 1.5σ (which is what six sigma quality experts consider). The Cpk index and ppm defective are shown in the table below:

Table 2: Cpk and ppm defective (Process Centred at m + 1.5σ)

  Process | USL  | Z score | Cpk    | Right-hand ppm defective
  1σ      | m+σ  | -0.5    | -0.166 | 691,464
  2σ      | m+2σ | 0.5     | 0.166  | 308,536
  3σ      | m+3σ | 1.5     | 0.5    | 66,807
  4σ      | m+4σ | 2.5     | 0.83   | 6,209.66
  5σ      | m+5σ | 3.5     | 1.166  | 232.67
  6σ      | m+6σ | 4.5     | 1.5    | 3.4
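Tables 1 and 2 can be regenerated in a few lines. The sketch below (scipy again assumed; the 1.5σ shift is a parameter of the example) prints Cp, Cpk and the right-hand ppm for both the centred and the shifted process:

```python
from scipy.stats import norm

SHIFT = 1.5  # assumed mean shift, in process standard deviations

print(f"{'k':>2} {'Cp':>5} {'ppm centred':>12} {'Cpk':>6} {'ppm shifted':>12}")
for k in range(1, 7):
    cp = k / 3                              # USL - LSL = 2k*sigma, so Cp = k/3
    cpk = (k - SHIFT) / 3                   # (USL - xbar) / 3*sigma after the shift
    ppm_centred = norm.sf(k) * 1e6          # right-hand tail, mean on target
    ppm_shifted = norm.sf(k - SHIFT) * 1e6  # right-hand tail, mean at m + 1.5 sigma
    print(f"{k:>2} {cp:>5.2f} {ppm_centred:>12.3f} {cpk:>6.2f} {ppm_shifted:>12.1f}")

# The k = 6 row gives 0.001 ppm centred and 3.4 ppm shifted, as in the tables.
```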


Here comes the magic number of 3.4 defective per million. When we work with the six sigma methodology, even if the process mean shifts, say to the right, by an amount of 1.5σ, the number of defective products we make is only 3.4 per million. So we can establish a one-to-one correspondence between the Cpk index and the ppm derived from the six sigma methodology: to work with six sigma is to work with a Cpk value of 1.5.

The reasoning behind the method is as follows: in real life, even if a process is under control, it is not infrequent to see the process mean move up (down) to the target mean plus (minus) 1.5σ. If this is the case, the worst case, working with the six sigma philosophy guarantees that we will not get more than 3.4 defective products or services per million out of specifications.

    The results so far obtained can be summarised as follows:

Table 3: Cp, Cpk and ppm defective

  Process | Cpk (centred at m+1.5σ) | RH ppm defective | Cp (centred at m) | RH ppm defective
  1σ      | -0.166                  | 691,464          | 0.33              | 158,655
  2σ      | 0.166                   | 308,536          | 0.66              | 22,750
  3σ      | 0.5                     | 66,807           | 1                 | 1,350
  4σ      | 0.83                    | 6,209.66         | 1.33              | 31.69
  5σ      | 1.166                   | 232.67           | 1.66              | 0.287
  6σ      | 1.5                     | 3.4              | 2                 | 0.001


The Six Sigma Metric vs. the Taguchi Loss Function

Dr. Genichi Taguchi's concept of quality can be stated as the loss that a product or service produces to society in its production, transportation, consumption or use, and disposal. The lower the losses to society produced by a product or service, the higher its quality. He developed a method to forecast and measure quality, in economic terms, under the assumption that the tolerance limits are correct. The method calls for calculating the Quality or Loss Function of a process.

Historically, among other means of measuring quality, the percentage of defectives, the above-mentioned process capability indices, and warranty costs have been used. Of the three, the capability indices are the most abstract and difficult to interpret. For instance, what is the real improvement of going from a Cp of 0.9 to one of 1.2? On the other hand, the other two are more intuitive in nature: we can measure the percentage of defectives, or the warranty costs in terms of money, and anyone can assess the result of a change in the process or of any other measure that affects quality. Nonetheless, warranty costs, though very valuable, are not useful for taking immediate measures, due to the large time lag involved. A similar comment applies to the percentage of defectives of already manufactured products.

What we need are appropriate methods that can forecast quality before the product has been shipped, or during production, and that can measure this quality in monetary terms. Thus Dr. Taguchi, through the Loss Function, introduced a monetary assessment of product quality (under the assumption that the tolerances are correct). By means of this function, even Cp or Cpk, which were previously difficult to interpret, can have an instant monetary reading. A classic example in the literature is the case of TV sets manufactured by the same company in the US and in Japan, and how consumers preferred the Japanese sets because of their better quality, even though the Japanese plant produced a higher percentage of defectives.

Dr. Taguchi argues that even if the product complies with the tolerances or specification limits, if it is not at the target value there is a loss to society. The loss function can be well approximated (using a Taylor expansion) by a quadratic function such as:

$$L = k(x - m)^2,$$


where k depends on the loss to society at the point where the variable just exceeds the tolerances. The quality or loss function is then measured in monetary terms, and its expected value is:

$$E[L] = k\,E\left[(x - m)^2\right] = k\sigma^2$$

Thus, the expected loss is proportional to the variance.
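A quick Monte Carlo check of this identity (a sketch with arbitrary illustrative values k = 1, m = 0, σ = 1; numpy is an assumed dependency):

```python
import numpy as np

rng = np.random.default_rng(0)
k, m, sigma = 1.0, 0.0, 1.0
x = rng.normal(m, sigma, size=1_000_000)  # a centred process
loss = k * (x - m) ** 2                   # Taguchi loss L = k(x - m)^2
print(loss.mean())                        # ~= k * sigma**2 = 1.0, up to sampling error
```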

If the mean of the process is at the target value m, we can establish a one-to-one correspondence between Cp, ppm defective and the Loss Function.

Table 4: Loss Function for a centred process (Process Centred at Target)

  Process | Cp   | RH ppm defective | Standard Deviation | Loss Function
  1σ      | 0.33 | 158,655          | 3σ3                | 9kσ3²
  2σ      | 0.66 | 22,750           | 1.5σ3              | 2.25kσ3²
  3σ      | 1    | 1,350            | σ3                 | kσ3²
  4σ      | 1.33 | 31.686           | 0.75σ3             | 0.56kσ3²
  5σ      | 1.66 | 0.287            | 0.6σ3              | 0.36kσ3²
  6σ      | 2    | 0.001            | 0.5σ3              | 0.25kσ3²

(Here σ3 denotes the standard deviation of the 3σ process, taken as numeraire, so that the specification half-width equals 3σ3.)

If the mean of the process is not at the target value m but at m + 1.5σ, the Cp ratio is not the one to use, but Cpk, and the expected value of the loss is no longer kσ². Statisticians will tell us that now E[(x - m)²] is equal to the mean squared error, namely:

$$E\left[(x - m)^2\right] = \sigma^2 + \text{bias}^2 = \sigma^2 + (1.5\sigma)^2 = \sigma^2 + 2.25\sigma^2 = 3.25\sigma^2$$

The figures now come out as:


Table 5: Loss Function for a non-centred process (Process Centred at m + 1.5σ)

  Process | Cpk   | RH ppm defective | Standard Deviation | Loss Function
  1σ      | -0.16 | 691,464          | 3σ3                | 29.25kσ3²
  2σ      | 0.16  | 308,536          | 1.5σ3              | 7.3125kσ3²
  3σ      | 0.5   | 66,807           | σ3                 | 3.25kσ3²
  4σ      | 0.83  | 6,209.66         | 0.75σ3             | 1.8281kσ3²
  5σ      | 1.16  | 232.67           | 0.6σ3              | 1.17kσ3²
  6σ      | 1.5   | 3.4              | 0.5σ3              | 0.8125kσ3²

By stressing the focus only on ppm defective, the Six Sigma methodology may not be taking into account all the richness and customer satisfaction that Taguchi's Loss Function implies. In fact, by concentrating all efforts on the variance of the process and not so much on centring the process, it seems one is losing opportunities for excellence.

Thus Taguchi's Loss Function tells us that, from the point of view of society (customers, partners, stockholders, suppliers, employees, etc.), a centred process working at four sigma (L = 0.56kσ3²) may very well be preferred to an out-of-target process (mean = m + 1.5σ) working at six sigma (L = 0.8125kσ3²).


Table 6: Loss Function Comparison

  Process | Cpk   | Loss Function (centred at m+1.5σ) | Cp   | Loss Function (centred at m)
  1σ      | -0.16 | 29.25kσ3²                         | 0.33 | 9kσ3²
  2σ      | 0.16  | 7.3125kσ3²                        | 0.66 | 2.25kσ3²
  3σ      | 0.5   | 3.25kσ3²                          | 1    | kσ3²
  4σ      | 0.83  | 1.8281kσ3²                        | 1.33 | 0.56kσ3²
  5σ      | 1.16  | 1.17kσ3²                          | 1.66 | 0.36kσ3²
  6σ      | 1.5   | 0.8125kσ3²                        | 2    | 0.25kσ3²
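The loss columns of Tables 4 to 6 follow from E[L] = kσ² for the centred process and E[L] = 3.25kσ² for the shifted one, with σ expressed in σ3 units (a k-sigma process has σ = 3σ3/k). A minimal pure-Python sketch reproducing them, with names chosen for the example:

```python
SHIFT = 1.5  # shift in units of the process's own standard deviation

for level in range(1, 7):
    sigma_k = 3 / level                             # process sigma in sigma3 units
    loss_centred = sigma_k ** 2                     # E[L] / k = sigma^2
    loss_shifted = (1 + SHIFT ** 2) * sigma_k ** 2  # E[L] / k = 3.25 * sigma^2
    print(f"{level} sigma: centred {loss_centred:.4f}, "
          f"shifted {loss_shifted:.4f}  (in units of k*sigma3^2)")

# 4 sigma centred -> 0.5625 and 6 sigma shifted -> 0.8125, matching Table 6.
```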

Table 7: PPM Defective Comparison

  Process | Cpk   | RH ppm defective (centred at m+1.5σ) | Cp   | RH ppm defective (centred at m)
  1σ      | -0.16 | 691,464                              | 0.33 | 158,655
  2σ      | 0.16  | 308,536                              | 0.66 | 22,750
  3σ      | 0.5   | 66,807                               | 1    | 1,350
  4σ      | 0.83  | 6,209.66                             | 1.33 | 31.686
  5σ      | 1.16  | 232.67                               | 1.66 | 0.287
  6σ      | 1.5   | 3.4                                  | 2    | 0.001


Then, if a centred 4σ process is better than an out-of-target 6σ one, is the effort of going from 4σ to 6σ worth the cost it entails? Why do the 6σ advocates not control the mean? Why have companies been forced to make the extraordinary and expensive effort of going from 4σ to 6σ, when keeping the mean on target seems to be much less expensive than reducing the variance so much?

The reasons are not so clear in the literature:

Pyzdek (2000) states: "Since Control Charts will easily detect any process shift of this magnitude in a single sample, the 3.4 PPM represents a very conservative upper bound on the non-conformance rate."

Noguera and Nielsen (1992) state: "Motorola's choice of 1.5σ was based on both theoretical and practical grounds. Theoretically, an X-bar control chart will not quickly detect a process shift until the magnitude of the shift is ±1.5σ (based upon a sample size of n = 4)."

Harry and Schroeder (2000) claim: "The average time-to-time centring error for a typical process will average about 1.5σ. Years of theoretical and empirical research on this subject have proven this to be true. This amount of shift and drift is inevitable, and has to be accounted for during the design cycle of the process, product, or service."

The point is: if we have a centred 4σ process, can we detect a shift of the mean of 1.5σ in a reasonable time period or not? And if we can detect it, we most likely can correct it. If this is the case, it seems that it is not worth going on to reduce the variance to reach a 6σ process, as bringing the mean back on target may very well be less expensive than reducing the variance. And, most important, we may question the inevitability of the 1.5σ shift in the long term, because we can detect and correct it after a few SPC samples.

To clarify this point, let us assume that we have an SPC scheme based on samples of size n = 4, that the process mean and σ are known, that the process mean is at the target value m, which is at the midpoint of the specification interval, and that we allow the process mean to shift 1.5σ. Let us also consider that we apply this control to a 3σ, 4σ, 5σ and 6σ process.


If we take the standard deviation of the 3σ process as numeraire, say σ3, the standard deviations of the other processes will be σ4 = 0.75σ3, σ5 = 0.6σ3 and σ6 = 0.5σ3. Let us also assume that the control limits for the sample mean are at m ± 3σ3/√n. For n = 4 and a shift, say, to the right, we can calculate the probability that a sample mean falls beyond the control limit and the average number of samples (ARL, or Average Run Length) needed to detect a change in the mean of 1.5σ.

For the above-mentioned processes we get:

Table 8: ARL

  Process | Standard Deviation | UCL (sample mean)       | Probability of detection after the shift | Expected number of samples to detect the shift (ARL)
  3σ      | σ3                 | m + 3σ3/√n              | 0.5                                      | 2
  4σ      | 0.75σ3             | m + 4σ4/√n = m + 3σ3/√n | 0.158655                                 | 6.42
  5σ      | 0.6σ3              | m + 5σ5/√n = m + 3σ3/√n | 0.02275                                  | 43.45
  6σ      | 0.5σ3              | m + 6σ6/√n = m + 3σ3/√n | 0.001349                                 | 740.76
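Reading the setup above as fixed chart limits at m ± 3σ3/√n with each process shifting 1.5 of its own σ (both assumptions of this sketch), the detection probabilities of Table 8 can be reproduced, and ARL = 1/p comes out close to the table's figures:

```python
from math import sqrt
from scipy.stats import norm

n = 4  # SPC sample size
for level, ratio in [(3, 1.0), (4, 0.75), (5, 0.6), (6, 0.5)]:
    sigma_k = ratio                      # process sigma in sigma3 units (sigma3 = 1)
    ucl = 3 / sqrt(n)                    # chart limit m + 3*sigma3/sqrt(n), with m = 0
    mean_after_shift = 1.5 * sigma_k     # process mean after the 1.5 sigma shift
    z = (ucl - mean_after_shift) / (sigma_k / sqrt(n))
    p = norm.sf(z)                       # P(sample mean beyond the UCL)
    print(f"{level} sigma: p = {p:.6f}, ARL = {1 / p:.1f}")

# 3 sigma: p = 0.5, ARL = 2; 4 sigma: p = 0.158655, ARL ~ 6.3; and so on.
```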

In the base case, we can detect a shift of the mean, on average, in two samples (not in one sample, as Pyzdek states) and, for a 4σ process, we can detect it in six or seven samples, which can be considered fairly quick detection.

    Conclusions

On comparing the Six Sigma metric with the Taguchi Loss Function, it seems that, from the point of view of society (customers, partners, stockholders, suppliers, employees, etc.), a centred four sigma process may very well be preferred to an out-of-target six sigma process, even if the latter produces 3.4 defective per million and the former 32 defective per million (almost tenfold).


Is the effort of going from a four sigma process to a six sigma one worth the cost it entails? Why does the six sigma methodology claim the inevitability of a 1.5σ shift in the mean of the process (and hence the need to go to six sigma) if, for a 4σ process, we can detect such a shift in six or seven samples (and bring the process back on target)?

The richness and important consequence of the Taguchi Loss Function is the fact that the farther the product's characteristic varies from the target value, the greater the loss. Further, this loss is a continuous function, not a sudden step, and it illustrates the point that merely making a product within the specification limits (and, therefore, only measuring the percentage or ppm defective) does not necessarily mean that the product is of good quality, since good quality is defined by Dr. Taguchi as keeping the product characteristic on target with low variation.

It seems that much research is needed on the costs and benefits of going from a 4σ process to a 6σ process, especially if one considers that the Taguchi Loss Function is the right metric to apply.

    References

Arimany de Pablos, L. (2001), Proyecto Docente e Investigador. Universidad Politécnica de Madrid.

Arimany de Pablos, L. (2000), "On the Six Sigma Quality Metric". 44th European Quality Congress Proceedings, Volume 1, pp. 179-186. Budapest, Hungary. EOQ.

Arimany de Pablos, L. (1995), "La Estadística en la Calidad: La Función de Calidad de Taguchi". Actas del VI Congreso Nacional de la Calidad. Madrid, Spain.

Arimany de Pablos, L. (1991), "La Función de Calidad de Taguchi y el Consumo de Energía". Actas de las V Jornadas de la Calidad en la Industria Energética. Córdoba, Spain.

Arimany de Pablos, L. (1989), "Ingeniería de la Calidad: Taguchi, ese conocido tan desconocido". Actas del IV Congreso Nacional de la Calidad. Madrid, Spain.

Arimany de Pablos, L. (1986), "Wheatstone Bridge: G.E.P. Box vs. G. Taguchi. A Comparative Analysis". Actas de la XVI Reunión de la S.E.I.O. Málaga, Spain.

Harry, M. and Schroeder, R. (2000), Six Sigma. Currency & Doubleday.

Noguera, J. and Nielsen, T. (1992), "Implementing Six Sigma for Interconnect Technology". ASQC Quality Congress Transactions, pp. 538-544. Milwaukee, Wis. ASQC.

Pyzdek, T. (2000), The Six Sigma Handbook. McGraw-Hill.

Taguchi, G. (1985), Introduction to Quality Engineering. Asian Productivity Organization.