Measurement System Analysis



Introduction

Measurement System Analysis (MSA) is the first step of the Measure phase along the DMAIC pathway to improvement. You will be basing the success of your improvement project on key performance indicators that are tied to your measurement system. Consequently, before you begin tracking metrics you will need to complete an MSA to validate the measurement system. A comprehensive MSA typically consists of six parts: Instrument Detection Limit, Method Detection Limit, Accuracy, Linearity, Gage R&R and Long Term Stability. If you want to expand measurement capacity or qualify another instrument, you must extend the MSA to include Metrology Correlation and Matching.

A poor measurement system can make data meaningless and process improvement impossible. Large measurement error will prevent assessment of process stability and capability, confound Root Cause Analysis and hamper continuous improvement activities in manufacturing operations. Measurement error has a direct impact on assessing the stability and capability of a process. Poor metrology can make a stable process appear unstable and a capable process appear incapable. Measurement System Analysis quantifies the effect of measurement error on the total variation of a unit operation. The sources of this variation may be visualized as in Figure 7.1 and the elements of a measurement system as in Figure 7.2.


Figure 7.1 Sources of Variation

Figure 7.2 Measurement System Elements: Equipment, Procedures, Environment, Performance

Operators are often skeptical of measurement systems, especially those that provide them with false feedback, causing them to over-steer their process. This skepticism is well founded, since many measurement systems are not capable of accurately or precisely measuring the process. Accuracy refers to how the average of individual measurements compares with the known, true value. Precision refers to the grouping of the individual measurements; the tighter the grouping, the higher the precision. The bull's-eye targets of Figure 7.3 best illustrate the difference between accuracy and precision.

Figure 7.3 Accuracy vs Precision (the center of the target is the objective)

Panels: Good Accuracy / Bad Precision; Bad Accuracy / Good Precision; Good Accuracy / Good Precision; Bad Accuracy / Bad Precision

Accuracy is influenced by resolution, bias, linearity and stability, whereas precision is influenced by repeatability and reproducibility of the measurement system. Repeatability is the variation which occurs when the same operator repeatedly measures the same sample on the same instrument under the same conditions. Reproducibility is the variation which occurs between two or more instruments or operators measuring the same sample with the same measurement method in a stable environment. The total variance in a quality characteristic of a process is described by Eqn 7.1 and Eqn 7.2, and the percent contribution of the measurement system to the total variance may be calculated from Eqn 7.3. We want to measure true variations in product quality, not variations in the measurement system, so it is desirable to minimize σ²measurement. We will review the steps in a typical measurement system analysis by way of example, first for the case of variables data and then for the case of attribute data.
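The equations referenced above are not reproduced in this transcript; a standard reconstruction of the MSA variance decomposition they describe is:

σ²total = σ²product + σ²measurement  (Eqn 7.1)
σ²measurement = σ²repeatability + σ²reproducibility  (Eqn 7.2)
% Contribution = 100 × (σ²measurement / σ²total)  (Eqn 7.3)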


Instrument Detection Limit (IDL)

Today's measurement devices are an order of magnitude more complex than the gages for which the Automotive Industry Action Group (AIAG) first developed Gage Repeatability and Reproducibility (Gage R&R) studies. Typically they are electromechanical devices with internal microprocessors having inherent signal-to-noise ratios. The Instrument Detection Limit (IDL) should be calculated from the baseline noise of the instrument. Let us examine the case where a gas chromatograph (GC) is being used to measure the concentration of some analyte of interest. Refer to Figure 7.4.


Figure 7.4 Gas Chromatogram

The chromatogram has a baseline with peaks at different column retention times for hydrogen, argon, oxygen, nitrogen, methane and carbon monoxide. Let's say we want to calculate the IDL for nitrogen at a retention time of 5.2 min. We would purge and evacuate the column to make sure it is clean, then successively inject seven blanks of the carrier gas (helium). The baseline noise peak at retention time 5.2 min is integrated for each of the blank injections and converted to concentration units of nitrogen. The standard deviation of these concentrations is multiplied by the Student's t statistic for n - 1 degrees of freedom at a 99% confidence level (3.143) to calculate the IDL. This is the EPA protocol as defined in 40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B. Refer to Figure 7.5 below for the calculation summary.

df = n - 1
IDL = t(df, 1 - α = 0.99) × Stdev
IDL = 3.143 × 0.00007044 = 0.0002214 ppm N2

Figure 7.5 Instrument Detection Limit (IDL) Calculation
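As a quick check of the arithmetic above, a minimal Python sketch of this EPA-style IDL calculation (seven blank injections, one-sided 99% confidence, n - 1 degrees of freedom) might look like the following; the blank readings here are placeholders, not the data behind Figure 7.5:

import numpy as np
from scipy import stats

def detection_limit(replicates, confidence=0.99):
    """EPA-style detection limit: Student's t at the given confidence
    (df = n - 1) times the sample standard deviation."""
    replicates = np.asarray(replicates, dtype=float)
    df = len(replicates) - 1
    t_crit = stats.t.ppf(confidence, df)          # 3.143 for df = 6
    return t_crit * replicates.std(ddof=1)

# Hypothetical baseline-noise readings from seven blank injections (ppm N2)
blanks = [0.00021, 0.00027, 0.00035, 0.00029, 0.00024, 0.00031, 0.00026]
print(f"IDL = {detection_limit(blanks):.7f} ppm N2")

# Reproducing the figure's numbers: 3.143 x 0.00007044 = 0.0002214 ppm N2
print(stats.t.ppf(0.99, 6) * 0.00007044)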

Method Detection Limit (MDL)

The Method Detection Limit (MDL) is defined as the minimum concentration of a substance that can be measured and reported with 99% confidence that the analyte concentration is greater than zero, as determined from analysis of a sample in a given matrix containing the analyte. The MDL is calculated in a similar way to the IDL, with the exception that the same sample is measured on the instrument over n = 7 trials and the sample is disconnected and reconnected to the measurement apparatus between trials. This is called dynamic repeatability analysis. An estimate is made of the MDL and a sample is prepared at or near this MDL concentration. The seven trials are then measured on the instrument and the MDL is calculated as in Figure 7.6. The MDL divided by the mean of the seven trials should be within 10-100%. If this is not the case, repeat the MDL analysis with a starting sample concentration closer to the calculated MDL.

df = n - 1
MDL = t(df, 1 - α = 0.99) × Stdev
MDL = 3.143 × 0.01801 = 0.05660 ppm N2

Figure 7.6 Method Detection Limit (MDL) Calculation
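The same arithmetic applies to the MDL; a self-contained sketch (the seven trial readings below are hypothetical, not the data behind Figure 7.6):

import numpy as np
from scipy import stats

# Hypothetical readings from seven trials of the spiked sample, with the
# sample disconnected and reconnected between trials (ppm N2)
trials = np.array([0.62, 0.65, 0.61, 0.64, 0.66, 0.63, 0.60])
mdl = stats.t.ppf(0.99, len(trials) - 1) * trials.std(ddof=1)
print(f"MDL = {mdl:.4f} ppm N2")
print(f"MDL / mean = {mdl / trials.mean():.1%}")   # should fall within 10-100%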

Measurement System Analysis - Variables Data

A properly conducted measurement system analysis (MSA) can yield a treasure trove of information about your measurement system. Repeatability, reproducibility, resolution, bias, and precision-to-tolerance ratio are all deliverables of the MSA and can be used to identify areas for improvement in your measurement system. It is important to conduct the MSA in the current state, since this is your present feedback mechanism for your process. Resist the temptation to dust off the Standard Operating Procedure and brief the operators on the correct way to measure the parts. Resist the temptation to replace the NIST1-traceable standard, which looks like it has been kicked around the metrology laboratory a few times.

1 National Institute of Standards and Technology

To prepare for an MSA you must collect samples from the process that span the specification range of the measurement in question. Include out-of-spec high samples and out-of-spec low samples. Avoid creating samples artificially in the laboratory; there may be complicating factors in the commercial process which influence your measurement system. Include all Operators in the MSA who routinely measure the product. The number of samples times the number of Operators should be greater than or equal to fifteen, with three trials for each sample. If this is not practical, increase the number of trials as per Figure 7.7.

Code the samples such that the coding gives no indication of the expected measurement value; this is called blind sample coding. Have each sample measured by an outside laboratory. These measurements will serve as your reference values. Ask each Operator to measure each sample three times in random sequence. Ensure that the Operators do not compare notes. We will utilize Minitab to analyze the measurement system described in Case Study III.

Samples x Operators      Trials
S x O >= 15              3
8 <= S x O < 15          4
5 <= S x O < 8           5
S x O < 5                6

Figure 7.7 Measurement System Analysis Design
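Minitab generates this design automatically (Figure 7.8). For readers without Minitab, a minimal sketch of building a randomized run order per Operator in Python might look like this; the blind-coded sample IDs and operator names are hypothetical:

import random
from itertools import product

samples = ["A", "B", "C", "D", "E"]      # blind-coded sample IDs (hypothetical)
operators = ["Op1", "Op2", "Op3"]
trials = 3                               # S x O = 15, so 3 trials per sample

for operator in operators:
    run_list = list(product(samples, range(1, trials + 1)))
    random.shuffle(run_list)             # each Operator measures in random order
    for order, (sample, trial) in enumerate(run_list, start=1):
        print(f"{operator} run {order:2d}: sample {sample}, trial {trial}")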

Case Study III: Minnesota Polymer Co.

Minnesota Polymer Co. supplies a special grade of resin to ABC Molding Co. which includes a silica modifier to improve dimensional stability. The product code is POMBLK-15 and the silica concentration specification is 15 ± 2% by weight. Silica concentration is determined by taking a sample of the powdered resin and pressing it into a 4 cm disk using a 25-ton hydraulic press. The sample disk is then analyzed by x-ray fluorescence energy dispersive spectroscopy (XRF-EDS) to measure the silica content. Manufacturing POMBLK-15 is difficult. The silica is light and fluffy and sometimes gets stuck in the auger used to feed the mixing tank. A new process engineer, Penelope Banks, has been hired by Minnesota Polymer. One of her first assignments is to improve POMBLK-15 process control. SPC analysis of historical batch silica concentration results has indicated out-of-control symptoms and poor Cpk. Before Penny makes any changes to the process she prudently decides to conduct a measurement system analysis to find out the contribution of the measurement system to the process variation. Minnesota Polymer is a firm believer in process ownership. The same operator charges the raw materials, runs the manufacturing process, collects the quality control sample, presses the sample disk and then runs the silica analysis on the XRF-EDS instrument. The operator uses the silica concentration analysis results to adjust the silica charge on the succeeding batch. POMBLK-15 is typically run over a five-day period in the three-shift, 24/7 operation. Penny has collected five powder samples from POMBLK-15 process retains which span the silica specification range, including two out-of-specification samples pulled from quarantine lots. She has asked each of the three shift operators to randomly analyze three samples from each powder bag for silica content according to her sampling plan. Penny has sent a portion of each sample powder to the Company's R&D Headquarters in Hong Kong for silica analysis. These results will serve as reference values for each sample. The following table summarizes the silica concentration measurements and Figure 7.8 captures the screen shots of the MSA steps for Case Study III.

Figure 7.8 Measurement System Analysis Steps - Variable Data: Open a new worksheet. Click on Stat > Quality Tools > Gage Study > Create Gage R&R Study Worksheet on the top menu.

Figure 7.8 Measurement System Analysis Steps - Variable Data: Enter the Number of Operators, the Number of Replicates and the Number of Parts in the dialogue box. Click OK.

Figure 7.8 Measurement System Analysis Steps - Variable Data: The worksheet is modified to include a randomized run order of the samples.

Figure 7.8 Measurement System Analysis Steps - Variable Data: Name the adjoining column Silica Conc and transcribe the random sample measurement data to the relevant cells in the worksheet.

Figure 7.8 Measurement System Analysis Steps - Variable Data: Click on Stat > Quality Tools > Gage Study > Gage R&R Study (Crossed) on the top menu.

Figure 7.8 Measurement System Analysis Steps - Variable Data: Select C2 Parts for Part numbers, C3 Operators for Operators and C4 Silica Conc for Measurement data in the dialogue box. Click the radio button for ANOVA under Method of Analysis. Click Options.

Figure 7.8 Measurement System Analysis Steps - Variable Data: Six (6) standard deviations will account for 99.73% of the measurement system variation. Enter the Lower Spec Limit and Upper Spec Limit in the dialogue box. Click OK. Click OK.

Figure 7.8 Measurement System Analysis Steps - Variable Data: A new graph is created in the Minitab project file with the Gage R&R analysis results.

Return to the session window by clicking on Window > Session on the top menu to view the ANOVA analytical results.

Let us more closely examine the graphical output of the Gage R&R (ANOVA) Report for Silica Conc. Figure 7.9 shows the components of variation. A good measurement system will have the lion's share of variation coming from the product, not the measurement system. Consequently, we would like the bars for repeatability and reproducibility to be small relative to part-to-part variation.

Figure 7.9 MSA Components of Variation

Figure 7.10 captures the range SPC chart by Operators. The range chart should be in control. If it is not, a repeatability problem is present.

Figure 7.10 MSA Range Chart by Operators

By contrast, the X-bar SPC chart of Figure 7.11 should be out of control. This seems counterintuitive, but it is a healthy indication that the variability present is due to part-to-part differences rather than Operator-to-Operator differences.

Figure 7.11 MSA X-bar Chart by Operators

Figure 7.12 is an individual value plot of silica concentration by sample number. The circles with a cross indicate the mean of the sample data and the solid circles are individual data points. We want a tight grouping around the mean for each sample, and we want significant variation between the means of different samples. If we do not have variation between samples, the MSA has been poorly designed and we essentially have five samples of the same thing. This will preclude analysis of the measurement system.

Figure 7.12 MSA Silica Concentration by Sample Number

Figure 7.13 is a boxplot of silica concentration by Operator. As in Figure 7.12, the circles with a cross indicate the mean concentration for all samples by Operator. The shaded boxes represent the interquartile range (Q3 - Q1) for each Operator. The interquartile range (IQR) is the preferred measure of spread for data sets which are not normally distributed. The solid line within the IQR is the median silica concentration of all samples by Operator. If Operators are performing the same, we would expect similar means, medians and IQRs.

Figure 7.13 MSA Silica Concentration by Operator

Figure 7.14 is an individual value plot used to check for Operator-Sample interactions. The lines for each Operator should be reasonably parallel to each other. Crossing lines indicate the presence of Operator-Sample interactions. This can happen when Operators are struggling with samples at or near the MDL or if the instrument signal-to-noise ratio varies as a function of concentration.

Figure 7.14 MSA Sample by Operator Interaction

Let us now focus on the analytical output of the session window as captured in Figure 7.8. Gage R&R practitioners will typically look for four metrics, as defined below, and expect these metrics to be within the acceptable or excellent ranges specified by the Gage R&R Metric Rules of Thumb shown in Figure 7.15.

Gage R&R Metric                 Unacceptable    Acceptable    Excellent
% Contribution                  > 7.7%          2.0 - 7.7%    < 2%
% Study Variation               > 28%           14 - 28%      < 14%
% P/T Ratio                     > 30%           8 - 30%       < 8%
Number of Distinct Categories   < 5             5 - 10        > 10

Figure 7.15 Gage R&R Metrics Rules of Thumb

The highlighted output of the Minitab session window indicates a % Contribution of the measurement system of 0.55%. This is in the excellent region. % Study Variation is 7.39%, which is also in the excellent region. The Precision to Tolerance ratio is 25.37%, which is in the acceptable region. The number of distinct categories is 19, well within the excellent region. Overall, this is a good measurement system. Now, let us proceed to check for linearity and bias by adding the reference concentrations, as measured by the Hong Kong R&D Center, for each of the samples to the worksheet. Figure 7.16 captures the screen shots necessary for this process.
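For readers who want to see how these four metrics are derived, a minimal sketch of the standard AIAG-style formulas is shown below. It assumes the repeatability, reproducibility and part-to-part variance components have already been estimated (for example, from the session-window ANOVA output); the numeric values here are placeholders, not the Case Study III results.

import math

def gage_rr_metrics(var_repeat, var_reprod, var_part, lsl, usl):
    """Gage R&R summary metrics from estimated variance components."""
    var_grr = var_repeat + var_reprod            # total gage variance
    var_total = var_grr + var_part
    sd_grr, sd_part, sd_total = map(math.sqrt, (var_grr, var_part, var_total))
    return {
        "% Contribution": 100 * var_grr / var_total,
        "% Study Variation": 100 * sd_grr / sd_total,
        "% P/T Ratio": 100 * 6 * sd_grr / (usl - lsl),
        "Distinct Categories": int(1.41 * sd_part / sd_grr),
    }

# Placeholder variance components (wt% silica squared) and the 15 +/- 2% spec
print(gage_rr_metrics(var_repeat=0.004, var_reprod=0.002,
                      var_part=1.10, lsl=13.0, usl=17.0))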

Figure 7.16 Gage Linearity and Bias Study Steps - Variable Data: Return to the active worksheet by clicking on Window > Worksheet 1 *** on the top menu. Name the adjoining column Reference Conc and enter the reference sample concentration values corresponding to each sample (Part) number.

Figure 7.16 Gage Linearity and Bias Study Steps - Variable Data: Click on Stat > Quality Tools > Gage Study > Gage Linearity and Bias Study on the top menu.

Figure 7.16 Gage Linearity and Bias Study Steps - Variable Data: Select C2 Parts for Part numbers, C5 Reference Conc for Reference values and C4 Silica Conc for Measurement data in the dialogue box. Click OK.

Figure 7.16 Gage Linearity and Bias Study Steps - Variable Data: A new graph is created in the Minitab project file with the Gage Linearity and Bias Study results.

We can see there is a bias between the Hong Kong measurement system and Minnesota Polymer's measurement system. The bias is relatively constant over the silica concentration range of interest, as indicated by the regression line. The Minnesota Polymer measurement system is reading approximately 0.67 wt% silica higher than Hong Kong. This is not saying that the Hong Kong instrument is right and the Minnesota Polymer instrument is wrong. It is merely saying that there is a difference between the two instruments which must be investigated. This difference could have process capability implications if it is validated; Minnesota Polymer may be operating in the top half of the allowable spec range. The logical next step is for the Hong Kong R&D Center to conduct an MSA of similar design, ideally with the same sample set utilized by Minnesota Polymer.
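A minimal sketch of the underlying bias and linearity arithmetic (bias = measurement - reference for every trial, then a regression of bias against the reference value); the arrays below are hypothetical, not the case-study data:

import numpy as np
from scipy import stats

# Hypothetical measured and reference silica concentrations (wt%)
measured  = np.array([14.1, 14.2, 15.8, 15.7, 13.4, 13.5, 16.9, 17.0, 15.1, 15.2])
reference = np.array([13.5, 13.5, 15.1, 15.1, 12.8, 12.8, 16.3, 16.3, 14.5, 14.5])

bias = measured - reference
print(f"Average bias: {bias.mean():+.2f} wt%")   # positive means reading high

# Linearity: if the fitted slope is near zero, the bias is roughly constant
# across the concentration range rather than growing or shrinking with it.
fit = stats.linregress(reference, bias)
print(f"Bias vs reference: slope = {fit.slope:.3f}, p = {fit.pvalue:.3f}")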

Measurement System Analysis - Attribute Data

In our next case we will analyze the measurement system used to rate customer satisfaction, as described in Case Study IV below.

Case Study IV: Virtual Cable Co.

David Raffles Lee has just joined Virtual Cable Co., the leading telecommunications company in the southwest, as Chief Executive Officer. David comes to Virtual Cable with over thirty years of operations experience in the telecommunications industry in Singapore. During a tour of one of the Customer Service Centers, David noticed that the customer service agents were all encased in bulletproof glass. David queried the Customer Service Manager, Bob Londale, about this and Bob responded, "It is for the protection of our associates. Sometimes our customers become angry and they produce weapons." David was rather shocked by this and wanted to learn more about customer satisfaction at Virtual Cable. He formed a team to analyze the measurement of customer satisfaction. This team prepared ten scripts of typical customer complaints with an intended outcome of pass (customer was satisfied with the customer service agent's response) or fail (customer was dissatisfied with the response). Twenty customers were coached on the scripts, two customers per script. These customers committed the scripts to memory and presented their service issue to three different Customer Service Agents at three different Customer Service Centers. Each customer was issued an account number and profile to allow the Customer Service Agent to rate the customer's satisfaction level in the customer feedback database, as required by Virtual Cable's policy. The results are summarized in the attached table and analyzed by the MSA attribute data steps of Figure 7.17.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: Open a new worksheet. Click on Stat > Quality Tools > Create Attribute Agreement Analysis Worksheet on the top menu.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: Enter the Number of samples, the Number of appraisers and the Number of replicates in the dialogue box. Click OK.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: The worksheet is modified to include a randomized run order of the scripts (samples).

Figure 7.17 Measurement System Analysis Steps - Attribute Data: Name the adjoining columns Response and Reference. Transcribe the satisfaction level rating and the reference value of the script to the appropriate cells.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: Click on Stat > Quality Tools > Attribute Agreement Analysis on the top menu.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: Select C4 Response for Attribute column, C2 Samples for Samples and C3 Appraisers for Appraisers in the dialogue box. Select C5 Reference for Known standard/attribute. Click OK.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: A new graph is created in the Minitab project file with the Assessment Agreement results.

Figure 7.17 Measurement System Analysis Steps - Attribute Data: Display the analytical MSA Attribute Agreement results by clicking on Window > Session on the top menu.

The attribute MSA results allow us to determine the percentage overall agreement, the percentage agreement within appraisers (repeatability), the percentage agreement between appraisers (reproducibility), the percentage agreement with reference values (accuracy) and the Kappa value (an index used to determine how much better the measurement system is than random chance). From the graphical results we can see that the Customer Service Agents were in agreement with each other 90% of the time and were in agreement with the expected (standard) result 90% of the time. From the analytical results we can see that the agreement between appraisers was 80% and the overall agreement vs the standard values was 80%. The Kappa value for all appraisers vs the standard values was 0.90, indicative of excellent agreement between the appraised values and the reference values. Figure 7.18 provides benchmark interpretations for Kappa values.
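For reference, a minimal sketch of the kappa arithmetic for a single appraiser against the known standard; the pass/fail vectors below are hypothetical, not the case-study ratings:

import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Agreement beyond chance between two categorical rating vectors."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    p_observed = np.mean(a == b)
    # Chance agreement from the marginal pass/fail proportions of each rater
    categories = np.unique(np.concatenate([a, b]))
    p_chance = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical appraiser calls vs the scripted (reference) outcomes
reference = ["pass", "fail", "pass", "pass", "fail", "pass", "fail", "fail", "pass", "fail"]
appraiser = ["pass", "fail", "pass", "fail", "fail", "pass", "fail", "fail", "pass", "fail"]
print(f"kappa = {cohens_kappa(appraiser, reference):.2f}")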

Figure 7.18 Rules of Thumb for Interpreting Kappa Values

Another way of looking at this case is that out of sixty expected outcomes there were only three miscalls on rating customer satisfaction by the Customer Service Agents included in this study. Mr. Lee can have confidence in the feedback of the Virtual Cable customer satisfaction measurement system and proceed to identify and remedy the underlying root causes of customer dissatisfaction.

Improving the Measurement System

Improvements to the measurement system should be focused on the root cause(s) of high measurement system variation. If repeatability is poor, consider a more detailed repeatability study using one part and one operator over an extended period of time. Ask the operator to measure this one sample twice per day for one month. Is the afternoon measurement always greater or always lesser than the morning measurement? Perhaps the instrument is not adequately cooled. Are the measurements trending up or down during the month? This is an indication of instrument drift. Is there a gold standard for the instrument? This is one part that is representative of production parts, kept in a climate-controlled room, handled only with gloves and carried around on a red velvet pillow.

Any instrument must have a gold standard. Even the kilogram has a gold standard: a platinum-iridium cylinder held under glass at the Bureau International des Poids et Mesures in Sèvres, France. If the gold standard measures differently during the month, the measurement error is not due to the gold standard; it is due to the measurement system. Consider whether the instrument and/or samples are affected by temperature, humidity, vibration, dust, etc. Set up experiments to validate these effects with data to support your conclusions. If you are lobbying for the instrument to be relocated to a climate-controlled clean room, you had better have the data to justify this move.

If reproducibility is poor, read the Standard Operating Procedure (SOP) in detail. Is the procedure crystal clear, without ambiguity which would lead operators to conduct the procedure differently? Does the procedure specify instrument calibration before each use? Does the procedure indicate what to do if the instrument fails the calibration routine? Observe the operators conducting the procedure. Are they adhering to the procedure? Consider utilizing the operator with the lowest variation as a mentor/coach for the other operators. Ensure that the SOP is comprehensive and visual. Functional procedures should be dominated by pictures, diagrams, sketches, flow charts, etc., which clearly demonstrate the order of operations and call out the critical points of the procedure.

Avoid lengthy text SOPs devoid of graphics. They do not facilitate memory triangulation, the use of multiple senses to recall learning. Refresher training should be conducted annually on SOPs, with supervisor audit of the Operator performing the measurement SOP.

Long Term Stability

Now that you have performed analyses to establish the Instrument Detection Limit, Method Detection Limit, Accuracy, Linearity, and Gage R&R metrics of your measurement system and proven that you have a healthy measurement system, you will need to monitor the measurement system to ensure that it remains healthy. Stability is typically monitored through daily measurement of a standard on the instrument in question. If a standard is not available, one of the samples from the Gage R&R can be utilized as a Golden Sample. Each day, after the instrument is calibrated, the standard is measured on the instrument. An Individuals-Moving Range (IMR) SPC chart is generated, as we have covered in Chapter 6. If the standard is in control, then the measurement system is deemed to be in control, and this provides the justification to utilize the instrument to perform commercial analyses on process samples throughout the day.
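A minimal sketch of the Individuals-Moving Range limits behind such a daily stability chart, using the standard I-MR constants; the daily standard readings below are hypothetical:

import numpy as np

# Hypothetical daily measurements of the Golden Sample standard (wt% silica)
readings = np.array([15.02, 15.05, 14.98, 15.01, 15.04, 14.97, 15.03,
                     15.00, 14.99, 15.06, 15.02, 14.96, 15.01, 15.05])

moving_range = np.abs(np.diff(readings))
mr_bar = moving_range.mean()

# Standard I-MR chart constants: 2.66 = 3/d2 (d2 = 1.128), 3.267 = D4 for n = 2
ucl_i = readings.mean() + 2.66 * mr_bar
lcl_i = readings.mean() - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar

print(f"Individuals chart: CL={readings.mean():.3f}, UCL={ucl_i:.3f}, LCL={lcl_i:.3f}")
print(f"Moving range chart: CL={mr_bar:.3f}, UCL={ucl_mr:.3f}")
out_of_control = (readings > ucl_i) | (readings < lcl_i)
print("Standard in control:", not out_of_control.any())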

If the standard is not in control, the instrument is deemed to be nonconforming and a Root Cause Analysis must be initiated to identify the source(s) of the discrepancy. Once the discrepancy has been identified and corrected, the standard is re-run on the instrument and the IMR chart refreshed to prove that the instrument is in control. Figure 7.19 shows daily stability measurements from Case Study III: silica concentration measurement of Golden Sample disk number two.

Figure 7.19 Measurement System Long Term Stability

Metrology Correlation and Matching

Metrology correlation is utilized when comparing two measurement systems. This includes the sample preparation steps required before the actual measurement is conducted, as these are part of the measurement system. Metrology correlation and matching assessment is performed when replacing an existing metrology tool with a new metrology tool, expanding measurement capacity by adding a second tool, comparing customer metrology to supplier metrology, or comparing a metrology tool at one site to a metrology tool at another site. Metrology correlation analysis is conducted when the two metrology tools are not required to deliver exactly the same output. This occurs when the equipment, fixtures, procedures and environment of the two measurement tools are not exactly the same. This is a common situation when comparing customer metrology to supplier metrology.

Metrology matching analysis is conducted when the two metrology tools are required to deliver exactly the same output. This is the typical condition where a specification exists for a critical quality characteristic. Before conducting metrology correlation and matching there are some prerequisites: you must ensure the metrologies are accurate, capable and stable. This means that the two measurement systems under consideration must have passed the success criteria for instrument detection limit, method detection limit, accuracy, linearity, Gage R&R and long term stability. Correlation and matching are most likely to be successful if the measurement procedures are standardized. Select a minimum of sixteen samples to be measured on both metrology tools. Samples should be selected such that they span the measurement range of interest (for example, the spec range).

Avoid clustering samples around a certain measurement value. If necessary, manufacture samples to cover the spec range. It is acceptable to include out-of-spec high and low samples. In order for two measurement systems to be correlated, the R-squared of the least squares regression line of the current instrument vs the proposed instrument must be 75% or higher. If matching is desired, there are two additional requirements: the 95% confidence interval of the slope of the orthogonal regression line must include a slope of 1.0, and a paired t-Test must pass (i.e., the 95% confidence interval of the mean difference includes zero). This ensures that bias between the two instruments is not significant. Let us revisit Penelope Banks at Minnesota Polymer to better understand the metrology correlation and matching protocol. Penny has requisitioned a redundant XRF-EDS to serve as a critical back-up to the existing XRF-EDS instrument and to provide analysis capacity expansion for the future.

She has been submitting samples for analysis to both instruments for the last sixteen weeks and has collected the following results. Please refer to Figure 7.20 for the correlation and matching analysis steps.

Sample No.       XRF-EDS1    XRF-EDS2
160403-2359D     14.2        14.4
160410-1600A     15.3        15.1
160414-0200B     13.7        13.5
160421-1400C     16.8        17.0
160427-0830C     13.5        13.3
160504-0300D     15.1        15.1
160510-1030A     13.3        13.2
160518-0100B     16.4        16.2
160525-1615C     16.6        16.5
160601-2330D     14.3        14.5
160608-0500D     15.7        15.9
160616-1330A     13.8        13.6
160625-1515C     15.7        15.8
160630-0420D     16.2        16.0
160707-2230B     13.5        13.7
160715-1920B     16.8        17.0
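For readers without Minitab, a sketch of the three checks described above (least-squares R-squared, orthogonal-regression slope confidence interval, paired t-test) applied to the data in the table might look like the following. scipy's ODR routine stands in for Minitab's orthogonal regression here, which is an assumption about rough equivalence, not a claim that the outputs are identical:

import numpy as np
from scipy import stats, odr

xrf1 = np.array([14.2, 15.3, 13.7, 16.8, 13.5, 15.1, 13.3, 16.4,
                 16.6, 14.3, 15.7, 13.8, 15.7, 16.2, 13.5, 16.8])
xrf2 = np.array([14.4, 15.1, 13.5, 17.0, 13.3, 15.1, 13.2, 16.2,
                 16.5, 14.5, 15.9, 13.6, 15.8, 16.0, 13.7, 17.0])

# 1. Correlation: least-squares R-squared must be at least 75%
lsq = stats.linregress(xrf1, xrf2)
print(f"R-squared = {lsq.rvalue**2:.1%}")

# 2. Matching: 95% CI of the orthogonal-regression slope must include 1.0
result = odr.ODR(odr.RealData(xrf1, xrf2), odr.unilinear).run()
slope, slope_se = result.beta[0], result.sd_beta[0]
t_crit = stats.t.ppf(0.975, len(xrf1) - 2)
print(f"Orthogonal slope 95% CI: {slope - t_crit*slope_se:.3f} to {slope + t_crit*slope_se:.3f}")

# 3. Matching: paired t-test must not reject a mean difference of zero
t_stat, p_value = stats.ttest_rel(xrf1, xrf2)
print(f"Paired t-test p-value = {p_value:.3f}")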

Figure 7.20 Metrology Correlation and Matching Steps: Open a new worksheet. Copy and paste the measurement data from the two instruments into the worksheet.

Figure 7.20 Metrology Correlation and Matching Steps: Click on Graph > Scatterplot on the top menu.

Figure 7.20 Metrology Correlation and Matching Steps: Select With Regression in the dialogue box. Click OK.

Figure 7.20 Metrology Correlation and Matching Steps: Select the reference instrument XRF-EDS1 for the X variables and XRF-EDS2 for the Y variables. Click OK.

Figure 7.20 Metrology Correlation and Matching Steps: A scatter plot is produced with the least squares regression line.

Figure 7.20 Metrology Correlation and Matching Steps: Hover your cursor over the least squares regression line. R-sq = 98.1%. Correlation is good.

Figure 7.20 Metrology Correlation and Matching Steps: Return to the worksheet. Click on Stat > Regression > Orthogonal Regression on the top menu.

Figure 7.20 Metrology Correlation and Matching Steps: Select XRF-EDS2 for the Response (Y) and the reference instrument XRF-EDS1 for the Predictor (X). Click Options.

Figure 7.20 Metrology Correlation and Matching Steps: Select 95 for the Confidence level. Click OK, then click OK one more time.

Figure 7.20 Metrology Correlation and Matching Steps: A scatter plot is produced with the orthogonal regression line.

Figure 7.20 Metrology Correlation and Matching Steps: Click on Window > Session on the top menu. The session window indicates that the 95% confidence interval of the slope includes 1.0. The two instruments are linear in accuracy.

Figure 7.20 Metrology Correlation and Matching Steps: Return to the worksheet. Click on Stat > Basic Statistics > Paired t on the top menu.

Figure 7.20 Metrology Correlation and Matching Steps: Select XRF-EDS1 for Sample 1 and XRF-EDS2 for Sample 2 in the dialogue box. Click Options.

Figure 7.20 Metrology Correlation and Matching Steps: Select 95.0 for Confidence level. Select 0.0 for Hypothesized difference. Select "Difference ≠ hypothesized difference" for Alternative hypothesis in the dialogue box. Click OK. Then click OK one more time.

Figure 7.20 Metrology Correlation and Matching Steps: The session window indicates that the 95% confidence interval for the mean difference includes zero. The P-value for the paired t-test is above the significance level of 0.05. Therefore we may not reject the null hypothesis; there is no significant bias between the two instruments.

Penelope has proven that XRF-EDS2 is correlated and matched to XRF-EDS1. She may now use XRF-EDS2 for commercial shipment releases including Certificates of Analysis to her customers.

References

Warner, Kent, Martinich, Dave and Wenz, Paul, Measurement Capability and Correlation, Revision 4.0.2, Intel, Santa Clara, CA, 2010.
AIAG, Measurement Systems Analysis, Fourth Edition, Automotive Industry Action Group, Southfield, MI, 2010.
Breyfogle, Forrest W., III, Implementing Six Sigma, Second Edition, John Wiley & Sons, Hoboken, NJ, 2003.
George, M., Maxey, P., Price, M. and Rowlands, D., The Lean Six Sigma Pocket Toolbook, McGraw-Hill, New York, NY, 2005.
Wedgwood, Ian D., Lean Sigma: A Practitioner's Guide, Prentice Hall, Boston, MA, 2007.
40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B, Environmental Protection Agency, Washington, DC, 2012.

Internet Resources

Automotive Industry Action Group
Method Detection Limit (MDL) Calculators | CHEMIASOFT
40 CFR Part 136: Guidelines Establishing Test Procedures for the Analysis of Pollutants, Appendix B (40 CFR Part 136, Subchapter D)
Percentiles of the t-Distribution: http://sites.stat.psu.edu/~mga/401/tables/t.pdf
