Measurement System Analysis



DESCRIPTION

This material was presented at the ASQ Silicon Valley Conference, 21 October 2010.

TRANSCRIPT

Page 1: Measurement System Analysis

Measurement System Analysis* – Tutorial

Govind Ramu, P. Eng, ASQ Fellow, ASQ CMQOE, CQE, CSSBB, CQA, CSQE, CRE

Quality Manager, Six Sigma MBB

JDSU, Milpitas

ASQ Silicon Valley Two-Day Quality Conference. Theme: Road to Innovation Quality. Track: Statistics and Reliability

* Material compiled from external resources, enhanced with additional presenter’s materials for better understanding.

If you see this picture in a slide, a classroom hands-on exercise is involved. Bring your laptop and MINITAB 16.

Page 2: Measurement System Analysis

2

Measurement Systems Analysis: How will you know the measures are accurate, linear, stable, repeatable and reproducible?

Data Collection Plan – who will do it?

- Type of Measure: name of the X or Y parameter, attribute, or condition being measured (product or process data).
- Operational Measure Definition: a clear definition of the measurement, defined in such a way as to achieve repeatable results from multiple observers.
- Measurement Method or Test Method: visual inspection, discrete data, or automated test? Test instruments are defined; procedures for data collection are defined.
- Data Tags Needed to Stratify the Data: data tags are defined for the measure, such as time, date, location, tester, line, customer, buyer, operator, etc.
- Data Collection Method: manual? spreadsheet? computer based?
- Person(s) Assigned: state who has the responsibility.
- What? The measure being collected.
- Where? The location for data collection.
- When? How often the data are collected.
- How Many? The number of data points to be collected per sample.

Sample Plan: define what to measure, define how to measure.

Measurement System Analysis

Control plans, FMEAs, customer specifications, manufacturing work instructions, and the CTQ vs. CTP matrix are typical sources of input for "What to measure?"

Who is responsible for measurement system qualification and release? Who is responsible for measurement training? Who will measure?

Page 3: Measurement System Analysis

3

Accuracy & Precision

Gage Bias

Linearity

Gage Repeatability

Gage Reproducibility

Gage R&R

% Tolerance Gage R&R

% Process Variation Gage R&R

Gage Resolution

Number of Distinct Categories

Stability

Attribute R & R

Measurement Systems Analysis (MSA): Topics

Page 4: Measurement System Analysis

4

(Figure: three targets illustrating Precise & Not Accurate, Accurate & Not Precise, and Accurate & Precise.)

Measurement Systems Analysis (MSA): Precision vs. Accuracy

The repeatability portion of GR&R studies relates to precision; bias relates to accuracy.

Page 5: Measurement System Analysis

5

(Figure: histogram of widget width from the widget machine.) There seems to be a lot of variation in our widgets! How good is our measurement system?

Measurement Systems Analysis (MSA): Context

Page 6: Measurement System Analysis

6

Step 1: A Standard Is Created in Test Lab

Width = 1.032 mm

Step 2: Standard Is Measured with Your Equipment

Repeat Measurements Many Times

Always Re-Set Part in Test Fixture Between Measurements

Standard

Measurement Systems Analysis (MSA): Gage Bias

Page 7: Measurement System Analysis

7

Step 3: Plot Data and Compare to Standard

(Figure: histogram of the repeated measurements of the standard. What you measured has mean X̄ = 1.046; the "true" value is 1.032; the difference is 0.014.)

Step 4: Calculate Correction (Bias) To Be Applied To Your Measurements

Bias = (1.046 – 1.032) / 1.032 = 1.4%
Corrected Measurement = (0.986) × (Your Measurements)

Definition of the mean: $\bar{X} = \dfrac{\sum_{i=1}^{n} X_i}{n}$

Measurement Systems Analysis (MSA): Gage Bias – Measurement

Due to linearity errors, you may need to use several standards across your measurement scale to evaluate whether the same correction formula applies everywhere across the scale.
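As a quick illustration of the bias arithmetic above (not part of the original tutorial), the short Python sketch below computes the bias percentage and the multiplicative correction factor from repeated readings of a standard. The readings list is hypothetical, chosen only to land near the slide's example numbers.

```python
import statistics

standard = 1.032   # certified "true" value of the standard (mm), from the slide
# Hypothetical repeated readings of the standard on your equipment
readings = [1.045, 1.047, 1.046, 1.044, 1.048]

mean_reading = statistics.mean(readings)       # ~1.046 in the slide's example
bias = mean_reading - standard                 # 0.014
bias_pct = 100 * bias / standard               # ~1.4 %
correction_factor = standard / mean_reading    # ~0.986: multiply raw readings by this

print(f"mean = {mean_reading:.3f}  bias = {bias_pct:.1f}%  correction = {correction_factor:.4f}")
```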

Page 8: Measurement System Analysis

8

Linearity

Interpreting the results: the %Linearity (absolute value of the slope × 100) is 13.2, which means that gage linearity accounts for about 13% of the overall process variation. The %Bias for the reference average is 0.3, which means that gage bias accounts for less than 0.3% of the overall process variation.
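In outline, the %Linearity figure quoted above can be reproduced by regressing the bias observed at each reference standard against the reference value, then taking the absolute slope times 100 (the slide's definition). The sketch below is illustrative only: the reference values and readings are made up, and a full Minitab linearity study also reports confidence bands and %Bias relative to process variation.

```python
import numpy as np

# Hypothetical linearity study: several reference standards, each measured repeatedly
reference     = np.array([2.0, 4.0, 6.0, 8.0, 10.0])       # master values
mean_measured = np.array([2.05, 4.02, 6.01, 7.95, 9.90])   # mean of repeated readings

bias = mean_measured - reference                   # bias at each reference point
slope, intercept = np.polyfit(reference, bias, 1)  # fit: bias = intercept + slope * reference

pct_linearity = abs(slope) * 100                   # the slide's definition: |slope| * 100
print(f"slope = {slope:.4f}, %Linearity = {pct_linearity:.1f}, average bias = {bias.mean():.4f}")
```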

Page 9: Measurement System Analysis

To address actual process variability, the variation due to the measurement system must first be identified and separated from that of the process

Possible Sources of Variation

The process (inputs → outputs) feeds a measurement process (inputs → outputs: observations, measurements, data).

Observed Process Variation
- Actual Process Variation
  - Long-term process variation
  - Short-term process variation
  - Within-sample variation
- Measurement Variation
  - Variation due to gage
    - Accuracy (bias)
    - Stability (time dependent)
    - Linearity (value dependent)
    - Repeatability – precision (pure error)
  - Variation due to operator
    - Reproducibility
    - Operator × Part interaction

Page 10: Measurement System Analysis

10

How Repeatable Are Measurements Made By One Appraiser?

Measure 1 Production Part Over and Over Again

What we expected vs. what we observed (histograms of width): why so much variation?

Measurement Systems Analysis (MSA): Gage Repeatability

Page 11: Measurement System Analysis

11

Standard deviation of the gage (one appraiser): S_Gage

Now gage repeatability can be calculated:

Gage Repeatability = 5.15 × S_Gage

99% of measurements fall in the gage repeatability range (0.5% in each tail).

Definition of the standard deviation: $S = \sqrt{\dfrac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n-1}}$

Measurement Systems Analysis (MSA): Gage Repeatability – Measurement
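A minimal sketch of the repeatability calculation described above, assuming a list of repeated readings of one part by one appraiser (the values are hypothetical):

```python
import statistics

# Hypothetical repeated measurements of one production part by one appraiser
readings = [1.031, 1.034, 1.029, 1.035, 1.032, 1.030, 1.033, 1.036, 1.028, 1.031]

s_gage = statistics.stdev(readings)     # sample standard deviation (n - 1 in the denominator)
repeatability = 5.15 * s_gage           # range containing ~99% of measurements

print(f"S_Gage = {s_gage:.5f}, gage repeatability (5.15 x S_Gage) = {repeatability:.5f}")
```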

Page 12: Measurement System Analysis

12

How Reproducible Are Measurements Made By Several Appraisers?

Appraiser #1, Appraiser #2, and Appraiser #3 each measure the same parts; their individual distributions are then compared with all measurements pooled together.

Measurement Systems Analysis (MSA): Gage Reproducibility

Page 13: Measurement System Analysis

13

$S_T^2 = S_P^2 + S_M^2$

Observed variation (S_T) combines the process variation (S_P) with the measurement variation (S_M).

$S_M^2 = S_{EV}^2 + S_{AV}^2$

EV = equipment variation (a repeatability issue); AV = appraiser variation (a reproducibility issue).

Gage Repeatability & Reproducibility = 5.15 × S_M

Now let's look at the widget process again: the observed width spread (S_T) includes the process variation (S_P) plus the measurement variation.

Measurement Systems Analysis (MSA): Gage Repeatability and Reproducibility

Page 14: Measurement System Analysis

14

Resolution refers to how many measurement intervals there are between the LSL and USL.

It is highly recommended that the resolution provide at least 10 divisions of your process variation, or about 5% of your specification width.

Measurement Systems Analysis (MSA): Resolution

Page 15: Measurement System Analysis

15

$S_T^2 = S_P^2 + S_M^2$

The observed variation of the process output (S_T, the width distribution) combines the part-to-part variation (S_P) and the measurement variation (S_M).

Number of Distinct Categories = 1.41 × (S_P / S_M)

Categories < 2 ---- Gage Has No Value for Controlling Process

Categories = 2 ---- Equivalent To Discrete Measurement

Categories > 4 ---- Acceptable for Measuring / Improving Process

Measurement Systems Analysis (MSA): Number of Distinct Categories
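A one-line computation of the number of distinct categories, following the 1.41 × S_P / S_M rule above (truncated to an integer, as Minitab does); the standard deviations below are placeholders.

```python
import math

def distinct_categories(s_part, s_meas):
    """Number of distinct categories = trunc(1.41 * S_P / S_M)."""
    return math.trunc(1.41 * s_part / s_meas)

# Placeholder standard deviations for part-to-part and measurement-system variation
print(distinct_categories(s_part=3.16e-4, s_meas=1.18e-4))   # -> 3 for these values
```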

Page 16: Measurement System Analysis

16

Guidelines for NDC

Page 17: Measurement System Analysis

17

Have Several Appraisers Measure 1 Product Many Times with 1 Gage and Compare Results to the Specification Range :

(Figure: the measurement-system spread Δ_MS compared with the specification width Δ_spec between the lower and upper spec limits.)

Good measurement system (small gage variability): Δ_MS / Δ_spec ≈ 0.10
Bad measurement system (large gage variability): Δ_MS / Δ_spec ≥ 0.30

Recommendation: It Is Best To Repeat This Procedure on Several Parts Across the Full Specification Range

Make Sure You Evaluate Gage Accuracy, Too!

Measurement Systems Analysis (MSA): % Tolerance Gage R&R

Page 18: Measurement System Analysis

18

(Figure: a distribution of cycle time with only an upper spec limit, USL.)

Sometimes it is not possible to calculate % Tolerance GR&R because the spec is one-sided!

% Process Variation GR&R = (S_M / S_historical) × 100

where S_historical is the process standard deviation from historical data (the historical process spread being 6 × the historical standard deviation).

Measurement Systems Analysis (MSA): % Process Variation Gage R&R

Page 19: Measurement System Analysis

19

% Tolerance GR&R = (5.15 × S_M) / (USL – LSL) × 100

(Figure: the measurement-system spread Δ = 5.15 × S_M for a diameter measurement, compared with the width between the lower and upper spec limits.)

Measurement Systems Analysis (MSA): % Tolerance Gage R&R – Measurement
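The two ratios above fit into a small helper. This is a sketch only, assuming S_M has already been estimated from a GR&R study; 5.15 standard deviations are used for the study variation as on these slides (later slides in this deck, and newer AIAG guidance, use 6).

```python
def pct_tolerance_grr(s_m, usl, lsl, k=5.15):
    """% Tolerance GR&R = k * S_M / (USL - LSL) * 100."""
    return 100.0 * k * s_m / (usl - lsl)

def pct_process_variation_grr(s_m, s_historical):
    """% Process Variation GR&R = S_M / S_historical * 100 (for one-sided specs)."""
    return 100.0 * s_m / s_historical

# Hypothetical values
s_m = 0.004                                                # measurement-system sigma
print(pct_tolerance_grr(s_m, usl=1.10, lsl=0.90))          # against a two-sided tolerance
print(pct_process_variation_grr(s_m, s_historical=0.02))   # against historical process sigma
```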

Page 20: Measurement System Analysis

20

(Figure, the big picture: repeatability is the spread of repeated measurements by a single appraiser; reproducibility is the difference between the appraiser means x̄_A, x̄_B, x̄_C; accuracy is their offset from the true value; stability is the change in these results from a first period of time to a second period of time. Together these determine short-term and long-term capability.)

Measurement Systems Analysis (MSA): The Big Picture

Page 21: Measurement System Analysis

21

Type 1 Gage Study

Use the Type 1 Gage Study to evaluate the capability of a measurement process. This study evaluates the combined effects of bias and repeatability based on multiple measurements from a single part.
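Minitab's Type 1 gage study summarizes bias and repeatability with the indices Cg and Cgk. Below is a minimal sketch of the commonly used formulas; the 20% share of tolerance and the 6-standard-deviation study variation are typical defaults, but treat the exact constants as assumptions and check your software's settings. The readings are fabricated.

```python
import statistics

def type1_gage(readings, reference, tolerance, k_pct=20.0, sv=6.0):
    """Rough Cg / Cgk for a Type 1 gage study (common textbook formulas)."""
    xbar = statistics.mean(readings)
    s = statistics.stdev(readings)
    bias = xbar - reference
    cg = (k_pct / 100.0) * tolerance / (sv * s)                       # repeatability only
    cgk = ((k_pct / 200.0) * tolerance - abs(bias)) / (sv / 2.0 * s)  # repeatability + bias
    return cg, cgk, bias

# Hypothetical data: 25 repeat readings of one reference part
readings = [0.3758 + 0.0001 * ((i % 5) - 2) for i in range(25)]
print(type1_gage(readings, reference=0.3757, tolerance=0.01))
```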

Page 22: Measurement System Analysis

22

What Dangers Exist

(Figure: instrument variation and operator variation around the true value of a part near the LSL and USL. The dangers are accepting BAD product and rejecting GOOD product.)

Page 23: Measurement System Analysis

23

Chasing Ghosts

(Figure: parts are produced with production variation around the target; instrument and operator variation inflate what was observed relative to what was produced, which invites process adjustments that chase measurement noise rather than real change.)

Page 24: Measurement System Analysis

24

Short-Term Vs Long-Term Capability

(Figure: the process distribution relative to LSL, target, and USL at Time 1 through Time 4, shifting and drifting over time.)

Over long term conditions, a “typical” process will shift and drift by approximately 1.5 standard deviations*.

“Short-term capability” (Cp, Cpk)

“Long-term performance” that includes changes to material, multiple shifts, Operators, environmental changes (Pp, Ppk)

(Back to Basics)

Page 25: Measurement System Analysis

25

Capability & GRR Grid

The grid crosses %GR&R* (low = <20%, high = >20%) with Cpk/Ppk* (low = <1.1, high = >1.1).

* GRR of 20% and Cp/Cpk of 1.1 are examples; decide what is acceptable for your organization.

Page 26: Measurement System Analysis

26

Scenario: High GR&R / Low Cp & Cpk

(Figure: large instrument variation (repeatability) and operator variation (reproducibility) around the true value of the part, plus a process shift within the LSL–USL limits; both accepting BAD product and rejecting GOOD product are risks.)

Page 27: Measurement System Analysis

27

Scenario: Low GR&R / Low Cp & Cpk

(Figure: small instrument variation (repeatability) and operator variation (reproducibility) around the true value of the part, but with a process shift within the LSL–USL limits.)

Page 28: Measurement System Analysis

28

Scenario: High GR&R / High Cp / High Cpk

(Figure: large instrument variation (repeatability) and operator variation (reproducibility) around the true value of the part, with a capable process within the LSL–USL limits.)

Page 29: Measurement System Analysis

29

Scenario: Low GR&R / High Cp / Cpk

(Figure: small instrument variation (repeatability) and operator variation (reproducibility) around the true value of the part, with a capable process within the LSL–USL limits.)

Page 30: Measurement System Analysis

30

Prioritizing Improvement Efforts

* If a new test station or piece of equipment is added, an operator is changed, or equipment is overhauled, a new GRR study is required. If there are no changes, a 6-month frequency of GRR monitoring is good practice.
** If there has been a sudden change in process variation (for good or bad) or an extended period of lack of stability, a new study has to be conducted and the control limits recalculated. If there are no changes, a 6-month frequency of review of the control limits is good practice.

CTQ: Critical to Quality characteristic; CTP: Critical to Process parameter. The relationship between CTPs and CTQs is to be established up front.

Example tracking grid, one row per product CTQ and process CTP:

CTQ1: GRR date 01/07, GRR* 37%, alpha/beta risk 8/20, control limits established 01/07, LCL 20, UCL 24, stability** NO, Cp/Pp 0.8, Cpk/Ppk 0.6, GRR/CL next due 07/07
CTQ2: GRR date 01/07, GRR* 8%, alpha/beta risk 3/8, control limits established 01/07, LCL 1.30, UCL 1.80, stability** YES, Cp/Pp 0.9, Cpk/Ppk 0.88, GRR/CL next due 07/07
CTQ3: GRR date 01/07, GRR* 25%, alpha/beta risk 1/2, control limits established 01/07, LCL 15, UCL 18, stability** NO, Cp/Pp 1.2, Cpk/Ppk 1.15, GRR/CL next due 07/07
CTP1: GRR date 01/07, GRR* 7%, alpha/beta risk 0.03/0.07, control limits established 01/07, LCL 200, UCL 208, stability** YES, Cp/Pp 1.3, Cpk/Ppk 1.25, GRR/CL next due 07/07
CTP2: GRR date 01/07, GRR* 12%, alpha/beta risk 4/11, control limits established 01/07, LCL 1.5, UCL 1.7, stability** YES, Cp/Pp 1.00, Cpk/Ppk 0.92, GRR/CL next due 07/07
CTP3: GRR date 01/07, GRR* 25%, alpha/beta risk 6/17, control limits established 01/07, LCL 1.7, UCL 2.0, stability** NO, Cp/Pp 0.95, Cpk/Ppk 0.82, GRR/CL next due 07/07

A companion matrix maps each product CTQ (1–3) against each process CTP (1–3) and rates the relationship as Significant, Moderate, or Weak.

Page 31: Measurement System Analysis

31

Relationship between Gage R&R and Process Capability

Page 32: Measurement System Analysis

Practical Applications

Page 33: Measurement System Analysis

33

Planning for GR & R

• Verify that the measurement system is calibrated. (You might use the GR&R for future test-system comparisons.) A linearity and bias study is preferably conducted prior to the GR&R.

• Verify that the operators are adequately trained in the measurement (including loading, setting, aligning, etc.).

• Verify that vibration, lighting, temperature, humidity, and other factors are conducive to measurement.

• Verify that the samples selected represent the process variation. Verify that the die positions selected are representative of the wafer map.

• Verify that the GR&R design has a minimum of 15 (n × r).

• Understand the Measurement System Configuration & Human interactions required in measurement process.

Page 34: Measurement System Analysis

34

Variance components:

Source                      VarComp     %Contribution (of VarComp)
Total Gage R&R              1.40E-08     12.27
  Repeatability             7.89E-09      6.93
  Reproducibility           6.07E-09      5.34
    Appraiser No            4.25E-09      3.73
    Appraiser No*Part No    1.83E-09      1.61
Part-To-Part                9.99E-08     87.73
Total Variation             1.14E-07    100.00

Gage evaluation:

Source                      StdDev (SD)   Study Var (5.15*SD)   %Study Var (%SV)   %Tolerance (SV/Toler)
Total Gage R&R              1.18E-04      6.09E-04               35.02              15.21
  Repeatability             8.88E-05      4.57E-04               26.33              11.44
  Reproducibility           7.79E-05      4.01E-04               23.10              10.03
    Appraiser No            6.52E-05      3.36E-04               19.32               8.39
    Appraiser No*Part No    4.27E-05      2.20E-04               12.67               5.50
Part-To-Part                3.16E-04      1.63E-03               93.67              40.69
Total Variation             3.37E-04      1.74E-03              100.00              43.44

(Callouts on the slide link this output to the % Tolerance GR&R column and to the total, process (part-to-part), measurement, appraiser, and equipment variation rows.)

Measurement Systems Analysis (MSA): Gage Analysis Using Minitab – Example: Output
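The percentage columns in the output above are simple ratios of the variance components. The sketch below reproduces that arithmetic for the two headline rows; the variance components are copied from the output, while the tolerance (about 0.004) is back-calculated from the %Tolerance column and should be treated as an assumption.

```python
import math

# Variance components copied from the Minitab example output above
varcomp = {
    "Total Gage R&R": 1.40e-08,
    "Part-To-Part":   9.99e-08,
}
total_var = 1.14e-07        # Total Variation (VarComp)
tolerance = 0.004           # back-calculated from the %Tolerance column (assumed here)
k = 5.15                    # study-variation multiplier used on these slides

for source, vc in varcomp.items():
    sd = math.sqrt(vc)
    pct_contribution = 100 * vc / total_var
    pct_study_var = 100 * sd / math.sqrt(total_var)
    pct_tolerance = 100 * (k * sd) / tolerance
    print(f"{source:15s}  %Contribution={pct_contribution:6.2f}  "
          f"%StudyVar={pct_study_var:6.2f}  %Tolerance={pct_tolerance:6.2f}")
```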

Page 35: Measurement System Analysis

35

Measurement Systems Analysis (MSA): In Summary

$\sigma_{Total}^2 = \sigma_{Product}^2 + \sigma_{Repeatability}^2 + \sigma_{Operator}^2 + \sigma_{Operator \times Part}^2$

Components of variation from the Minitab example:
- Overall: 100.00%
- Part to Part: 87.73%
- Gage R&R: 12.27%
  - Repeatability: 6.93%
  - Reproducibility: 5.34%
    - Operator: 3.73%
    - Operator by Part: 1.61%

Minitab Example Result

Page 36: Measurement System Analysis

36

(Minitab Gage R&R graphs for the example: Xbar chart by Appraiser No (Mean = 0.3758, UCL = 0.3760, LCL = 0.3757); R chart by Appraiser No (R̄ = 0.00015, UCL = 3.86E-04, LCL = 0); Appraiser No*Part No interaction plot; measurements by Appraiser No; measurements by Part No (parts 1–10); and Components of Variation bars showing %Contribution, %Study Var, and %Tolerance for Gage R&R, Repeatability, Reproducibility, and Part-to-Part. Use these diagnostic charts to isolate problems.)

Measurement Systems Analysis (MSA): Gage Analysis Using Minitab – Example, contd.

Page 37: Measurement System Analysis

37

Measurement of solder quality on a circuit board (solder joint, feed-through).

Scale: 1 = Excellent, 2 = Marginal, 3 = Marginal, 4 = Un-Acceptable, 5 = Un-Acceptable

This scale is discrete: there is no 2.3 or 3.15!

To test our measurement system, we need a discrete GRR.

Measurement Systems Analysis (MSA): Continuous vs. Discrete Gage R&R

Page 38: Measurement System Analysis

38

• Example:

Select 15 parts which span the full range of your “current part capability.” In other words, your sample should include extremes. Use 3 “trained” appraisers to randomly appraise each part 2 times (not consecutively). Also, select an “expert” to appraise each part 1 time (or use a set of known standards). Appraisals by the “expert” are considered to be the “reference values.” Minitab looks for consistency “within” and “between” appraisers.

• You will need to continue to improve “measurement procedures / definitions” and “training” of appraisers until you can exceed a score of 90% agreement. A score greater than 95% is considered to be excellent.

Measurement Systems Analysis (MSA): Discrete Gage R&R – How to
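The "within" and "vs. standard" agreement scores described above are simple match counts. The sketch below illustrates the arithmetic for pass/fail data arranged one row per part with two trials per appraiser; the data layout and the few rows shown are illustrative, not the full example data set.

```python
# Each row: (standard, (A trial 1, A trial 2), (B trial 1, B trial 2), (C trial 1, C trial 2))
data = [
    ("Pass", ("Pass", "Pass"), ("Pass", "Pass"), ("Fail", "Fail")),
    ("Fail", ("Fail", "Fail"), ("Fail", "Pass"), ("Fail", "Fail")),
    ("Pass", ("Pass", "Pass"), ("Pass", "Pass"), ("Pass", "Pass")),
    # ... remaining parts
]
appraisers = ["A", "B", "C"]
n = len(data)

for col, name in enumerate(appraisers, start=1):
    within = sum(row[col][0] == row[col][1] for row in data)            # repeatability
    vs_std = sum(row[col][0] == row[col][1] == row[0] for row in data)  # accuracy
    print(f"Appraiser {name}: within {100*within/n:.1f}%, vs standard {100*vs_std/n:.1f}%")

# All appraisers agree with each other, and additionally with the standard
all_match = sum(len({t for col in (1, 2, 3) for t in row[col]}) == 1 for row in data)
all_vs_std = sum(len({row[0], *(t for col in (1, 2, 3) for t in row[col])}) == 1 for row in data)
print(f"Between appraisers: {100*all_match/n:.1f}%, all vs standard: {100*all_vs_std/n:.1f}%")
```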

Page 39: Measurement System Analysis

39

Discrete Gage R&R

Sample  Standard  A-1   A-2   B-1   B-2   C-1   C-2
1       Pass      Pass  Pass  Pass  Pass  Fail  Fail
2       Pass      Pass  Pass  Pass  Pass  Fail  Fail
3       Fail      Fail  Fail  Fail  Pass  Fail  Fail
4       Fail      Fail  Fail  Fail  Fail  Fail  Fail
5       Fail      Fail  Fail  Pass  Fail  Fail  Fail
6       Pass      Pass  Pass  Pass  Pass  Pass  Pass
7       Pass      Fail  Fail  Fail  Fail  Fail  Fail
8       Pass      Pass  Pass  Pass  Pass  Pass  Pass
9       Fail      Pass  Pass  Fail  Fail  Fail  Fail
10      Fail      Pass  Pass  Fail  Fail  Fail  Fail
11      Pass      Pass  Pass  Pass  Pass  Pass  Pass
12      Pass      Pass  Pass  Pass  Pass  Pass  Pass
13      Fail      Fail  Fail  Fail  Fail  Fail  Fail
14      Fail      Fail  Fail  Fail  Pass  Fail  Fail

(A-1/A-2, B-1/B-2, C-1/C-2 are the first and second appraisals by Appraisers A, B, and C.)

Within Each Appraiser (Repeatability for Each Appraiser)

Appraiser # Inspected # Matched Percent (%)

A 14 14 100.0

B 14 11 78.6

C 14 14 100.0

Each Appraiser vs Standard (Accuracy of Each Appraiser)

Appraiser # Inspected # Matched Percent (%)

A 14 11 78.6

B 14 10 71.4

C 14 11 78.6

Between All Appraisers (Repeatability & Reproducibility)

# Inspected # Matched Percent (%)

14 7 50.0

All Appraisers vs Standard – Assessment Agreement (Accuracy)

# Inspected # Matched Percent (%)

14 6 42.9

Known Standards Appraisers A, B, C

Measurement Systems Analysis (MSA): Discrete Gage R&R – Example

Page 40: Measurement System Analysis

40

(Assessment Agreement plots: percent matched, with 95% confidence intervals, for Appraisers A, B, and C – one panel for Within Appraiser and one for Appraiser vs Standard; the percentages are the same as in the tables above.)

To Shrink “Confidence Intervals” Increase Number of Samples and Repetitions

Measurement Systems Analysis (MSA): Discrete Gage R&R – Example: Diagnostics

Page 41: Measurement System Analysis

41

• Discrete GRR > 90% is recommended.

• If < 90%, use the graphical output to diagnose.

• Observations:

• Appraisers A and C are very consistent “within” themselves but are not consistent with the expert (or standard).

• We can improve Appraisers A and C by re-training them.

• Appraiser B is not consistent in grading the individual “parts.” We need to investigate why there is a lack of consistency. (If “internal consistency” cannot be improved for Appraiser B, it will probably not be useful to re-train them.)

• Confidence interval: The GR&R in this example was based on 14 parts observed by 3 appraisers with 2 observations by each appraiser. The large confidence interval (which is an indication of level of uncertainty) is due to the small number of “samples.” To decrease the confidence interval (and improve our certainty), we need to increase the number of “samples” and “repetitions.”

Measurement Systems Analysis (MSA): Discrete Gage R&R – Example: Analysis

Page 42: Measurement System Analysis

42

Measurement Systems Analysis (MSA): Discrete Gage R&R – Example: in Minitab

Page 43: Measurement System Analysis

43

Measurement Systems Analysis (MSA): Gage R&R – Decision on Measurement Capability

Variable Y:
  GRR as % of tolerance:                 Excellent < 10%   Acceptable < 30%   Unacceptable > 30%
  GRR as % contribution to variation:                      Acceptable < 10%   Unacceptable > 10%
  No. of distinct categories:                              Acceptable > 4     Unacceptable < 4

Discrete Y:
  Appraiser vs. Standard, lower confidence level (at 95% confidence): Acceptable > 90%

If the GRR acceptability criteria are not met, do not proceed. Investigate the problem areas:

Repeatability (appraiser training)
Reproducibility (measurement system, SOP)
Accuracy (calibration)

Repeat GRR studies until acceptable.

Page 44: Measurement System Analysis

44

Automotive Industry Action Group (AIAG) Guidelines

If the Total Gage R&R contribution in the %Study Var column (% Tolerance, % Process) is:
• Less than 10% – the measurement system is acceptable.
• Between 10% and 30% – the measurement system is acceptable depending on the application, the cost of the measuring device, cost of repair, or other factors.
• Greater than 30% – the measurement system is unacceptable and should be improved.

If you are looking at the %Contribution column, the corresponding standards are:
• Less than 1% – the measurement system is acceptable.
• Between 1% and 9% – the measurement system is acceptable depending on the application, the cost of the measuring device, cost of repair, or other factors.
• Greater than 9% – the measurement system is unacceptable and should be improved.

Page 45: Measurement System Analysis

45

Automotive Industry Action Group (AIAG) Guidelines: Number of Distinct Categories

Minitab calculates the number in this statement by dividing the standard deviation for Parts by the standard deviation for Gage, then multiplying by 1.41 and truncating this value. This number represents the number of non-overlapping confidence intervals that will span the range of product variation. You can also think of it as the number of groups within your process data that your measurement system can discern.

Imagine you measured 10 different parts, and Minitab reported that your measurement system could discern 4 distinct categories. This means that some of those 10 parts are not different enough to be discerned as being different by your measurement system. If you want to distinguish a higher number of distinct categories, you need a more precise gage.

The Automotive Industry Action Group (AIAG) [1] suggests that when the number of categories is less than 2, the measurement system is of no value for controlling the process, since one part cannot be distinguished from another. When the number of categories is 2, the data can be divided into two groups, say high and low. When the number of categories is 3, the data can be divided into 3 groups, say low, middle and high. A value of 5 or more denotes an acceptable measurement system.

Page 46: Measurement System Analysis

46

Page 47: Measurement System Analysis

© 2010 Minitab, Inc.

MSA – Other Scenarios
• Other factors
• Comparing two gages

Joel Smith, MINITAB

Page 48: Measurement System Analysis


MSA – Other Scenarios

Other factors

Comparing two gages

Page 49: Measurement System Analysis


Other factors

Page 50: Measurement System Analysis


Other factors

Page 51: Measurement System Analysis


Other Factors

What are “other factors”?
• Influence measurement
• Exist in production environment
• May interact with operator or other factors

Page 52: Measurement System Analysis


Why Reproducibility?

Recall from Govind:
• Repeatability represents “pure error”
• Isolates ability to repeat measurements under equal conditions

Practically:
• Reproducibility likely similar with different “like” devices
• Maintain isolation of effect from device

Page 53: Measurement System Analysis


Effect on Results

Inclusion does not affect rules:
• # Distinct Categories (<2, 2-4, >4)
• % Contribution (<1%, 1-9%, >9%)
• % Tolerance (<10%, 10-30%, >30%)

Inclusion will increase Reproducibility
• Statistical test important (more later)

Page 54: Measurement System Analysis


Types of Factors

Fixed
• Specific levels are important

Random
• Specific levels are not important

Nested
• Levels of one factor dependent on another

Interactions
• Effect of one factor dependent on level of another

Page 55: Measurement System Analysis


Weld Quality Example

We use an electromagnetic test to evaluate weld quality on a bike frame

Page 56: Measurement System Analysis


Weld Quality Example

Our measurement is impedance

We randomly select 10 bicycle frames for testing by 3 operators

We identify the following factors as potentially affecting measurement:

• Temperature
• Humidity

Page 57: Measurement System Analysis


Weld Quality Example

Experiment is run with:
• 3 Operators (Levi, George, Christian)
• 10 Bikes
• 2 Temperature Levels (70 and 80)
• 2 Humidity Levels (50, 70)

High number of runs (240)
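For reference, the 240 runs quoted above come from fully crossing the factors and measuring each combination twice (the replicate count of 2 is implied by 240 / (3 × 10 × 2 × 2)). A quick way to lay out such a worksheet is sketched below; the operator and level names are the example's, everything else is illustrative.

```python
from itertools import product

operators = ["Levi", "George", "Christian"]
bikes = range(1, 11)              # 10 bike frames
temperatures = [70, 80]
humidities = [50, 70]
replicates = 2                    # implied by 240 runs / 120 factor combinations

# One dict per run; in practice this worksheet would be randomized before running
runs = [
    {"Operator": o, "Bike": b, "Temperature": t, "Humidity": h, "Replicate": r}
    for o, b, t, h in product(operators, bikes, temperatures, humidities)
    for r in range(1, replicates + 1)
]
print(len(runs))   # 240
```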

Page 58: Measurement System Analysis


Gage R&R (Expanded)

Stat > Quality Tools > Gage R&R Study > Gage R&R (Expanded)

Page 59: Measurement System Analysis


Weld Quality – ANOVA Table

ANOVA Table with Terms Used for Gage R&R Calculations

Source             DF   Seq SS      Adj SS      Adj MS      F         P
Bike                9   0.0406778   0.0406778   0.0045198   1623.64   0.000
Operator            2   0.0010840   0.0010840   0.0005420    271.63   0.000
Temperature         1   0.0002827   0.0002827   0.0002827    101.55   0.000
Humidity            1   0.0000814   0.0000814   0.0000814     40.82   0.000
Bike*Temperature    9   0.0000251   0.0000251   0.0000028      1.40   0.192
Repeatability     217   0.0004330   0.0004330   0.0000020
Total             239   0.0425840

Page 60: Measurement System Analysis


Weld Quality - % Contribution

Variance Components

Source                  VarComp     %Contribution (of VarComp)
Total Gage R&R          0.0000118      5.90
  Repeatability         0.0000020      1.00
  Reproducibility       0.0000098      4.90
    Operator            0.0000068      3.37
    Temperature         0.0000023      1.17
    Humidity            0.0000007      0.33
    Bike*Temperature    0.0000001      0.03
Part-To-Part            0.0001882     94.10
  Bike                  0.0001882     94.10
Total Variation         0.0002000    100.00

Page 61: Measurement System Analysis


Weld Quality – Gage Evaluation

Process tolerance = 0.02

Gage Evaluation

Source                  StdDev (SD)   Study Var (6*SD)   %Study Var (%SV)   %Tolerance (SV/Toler)
Total Gage R&R           0.0034359       0.0206157           24.30              103.08
  Repeatability          0.0014126       0.0084755            9.99               42.38
  Reproducibility        0.0031321       0.0187929           22.15               93.96
    Operator             0.0025981       0.0155886           18.37               77.94
    Temperature          0.0015273       0.0091635           10.80               45.82
    Humidity             0.0008137       0.0048821            5.75               24.41
    Bike*Temperature     0.0002563       0.0015379            1.81                7.69
Part-To-Part             0.0137189       0.0823132           97.00              411.57
  Bike                   0.0137189       0.0823132           97.00              411.57
Total Variation          0.0141426       0.0848556          100.00              424.28

Number of Distinct Categories = 5

Page 62: Measurement System Analysis


Weld Quality - Misclassification

Probabilities of Misclassification

Joint Probability

Part is bad and is accepted     0.054
Part is good and is rejected    0.067

Conditional Probability

False Accept    0.115
False Reject    0.126
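The joint misclassification probabilities above can be approximated by simulation: draw true part values from the part-to-part distribution, add measurement error, and compare the true and measured values to the spec limits. The sketch below is a generic illustration only; the process mean, spec limits, and the exact definitions Minitab uses for the conditional probabilities are not reproduced here.

```python
import random

def misclassification(mu, sd_part, sd_meas, lsl, usl, n=200_000, seed=1):
    """Monte Carlo estimate of the joint misclassification probabilities."""
    rng = random.Random(seed)
    bad_accepted = good_rejected = 0
    for _ in range(n):
        true_value = rng.gauss(mu, sd_part)              # actual part value
        measured = true_value + rng.gauss(0.0, sd_meas)  # what the gage reports
        good = lsl <= true_value <= usl
        accepted = lsl <= measured <= usl
        bad_accepted += (not good) and accepted
        good_rejected += good and (not accepted)
    return bad_accepted / n, good_rejected / n

# Purely illustrative numbers (sigmas loosely in the range of the weld example;
# the process mean and spec limits are invented)
print(misclassification(mu=0.0, sd_part=0.0137, sd_meas=0.0034, lsl=-0.01, usl=0.01))
```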

Page 63: Measurement System Analysis


Weld Quality - Graphical

Page 64: Measurement System Analysis


Effect of Factors

What if the additional factors were not included?
• % Contribution and % StudyVar would be lower
• Probability of misclassification lower
• % Tolerance lower
• Number of distinct categories higher

BUT:
• This is an illusion
• The goal is to assess the MS, not minimize MS variation at this point
• Analysis does not change reality

Page 65: Measurement System Analysis


Effect of Factors

Analysis without “other factors” looks better

Which will you experience in production?

Statistic               With Factors   W/O Factors
%Contribution               5.90           5.21
%Tolerance                103.08          96.45
Distinct Categories            5              6
P(Accepting Bad)            .054           .051
P(Rejecting Good)           .067           .063

Page 66: Measurement System Analysis


Comparing Two Gages

Page 67: Measurement System Analysis


Comparing Two Gages

Electromagnetic testing turns out to be very expensive

A cheaper alternative is available

How do we determine if it is equivalent?
• Is the new gage adequate?
• Is it biased?
• Does it exhibit linearity?

Page 68: Measurement System Analysis


Comparing Two Gages

Alternate gage results in slightly more variation

Still meets acceptability criteria

Statistic               Original   Alternate
%Contribution               5.90        6.69
%Tolerance                103.08      110.84
Distinct Categories            5           5
P(Accepting Bad)            .054        .057
P(Rejecting Good)           .067        .072

Page 69: Measurement System Analysis


Orthogonal Regression

Orthogonal Regression:
• Assumes measurement error in both the X and Y directions
• Requires a prior estimate of these errors
• Tests bias and linearity

From our MSAs:
• Original gage variance: 0.0000118
• Alternate gage variance: 0.0000136
• Ratio = 1.1525 (Alternate/Original)

The choice of X and Y is arbitrary.
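Outside Minitab, a closely related fit is Deming (errors-in-variables) regression, which has a closed form once the error-variance ratio is fixed. The sketch below uses that form with δ = the 1.15 ratio estimated above; the two impedance arrays are placeholders standing in for the paired readings from the two gages.

```python
import numpy as np

def deming_regression(x, y, delta):
    """Deming (errors-in-variables) fit y = a + b*x, where
    delta = var(measurement error in y) / var(measurement error in x)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    b = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    a = y.mean() - b * x.mean()
    return a, b

# Placeholder readings standing in for the two gages measuring the same frames
impedance  = np.array([11.8, 12.1, 12.4, 12.7, 13.0, 13.3])   # original gage (X)
impedance2 = np.array([11.7, 12.0, 12.5, 12.6, 13.1, 13.2])   # alternate gage (Y)

a, b = deming_regression(impedance, impedance2, delta=1.1525)
print(f"Impedance2 = {a:.3f} + {b:.3f} * Impedance")
```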

Page 70: Measurement System Analysis


Orthogonal Regression

Stat > Regression > Orthogonal Regression

Page 71: Measurement System Analysis


Orthogonal Regression

Page 72: Measurement System Analysis


Orthogonal Regression

What do we want to see here?
• Constant ~ 0
• Impedance ~ 1

Regression Equation
Impedance2 = -0.114 + 1.009 Impedance

Coefficients
Predictor    Coef        SE Coef    Z         P       Approx 95% CI
Constant     -0.11388    0.136040   -0.8371   0.403   (-0.380513, 0.15275)
Impedance     1.00948    0.011336   89.0478   0.000   ( 0.987263, 1.03170)

Page 73: Measurement System Analysis


Linearity and Bias

Page 74: Measurement System Analysis


Orthogonal Regression

To evaluate linearity and bias, use the confidence intervals:
• 0 is contained within the Constant CI
• 1 is contained within the Impedance CI

There is no evidence of a linearity or bias difference between the gages.

Regression Equation
Impedance2 = -0.114 + 1.009 Impedance

Coefficients
Predictor    Coef        SE Coef    Z         P       Approx 95% CI
Constant     -0.11388    0.136040   -0.8371   0.403   (-0.380513, 0.15275)
Impedance     1.00948    0.011336   89.0478   0.000   ( 0.987263, 1.03170)

Page 75: Measurement System Analysis


Comparing Two Gages

We have determined:
• The alternate gage has slightly more variation
• The alternate gage does not consistently measure higher or lower across all impedances
• The alternate gage does not measure higher or lower as impedance increases

Future measurement plan:
• Use alternate gage regularly
• Schedule original gage tests to re-establish “equivalence”

Page 76: Measurement System Analysis


Questions?