Basic Dimensional Metrology
Mani Maran Ratnam (2009)

Chapter 1. Introduction and Basic Concepts


1.1 Definition of Metrology

Metrology is defined as the science of measurement. Measurement is based on units and standards that have been agreed upon at an international level. Measurement is the language used by scientists and engineers to communicate about size, quantity, position, condition and time. In general, metrology concerns the measurement of various types of physical quantities, such as temperature, length, mass, pressure, flow rate, current, voltage, velocity, acceleration and strain. The aspect of metrology discussed in this book concerns the measurement of dimensions, hence the name dimensional metrology. Dimensional metrology concerns the measurement of length, angle, thickness, diameter, straightness, flatness, parallelism, perpendicularity and surface form.

Metrology is not limited to the measurement of the physical quantities mentioned above. It also involves industrial inspection and all the techniques relating to it, such as statistical data analysis and quality control. Inspection as practiced in industry normally involves inspection of a product at each stage of manufacture up to the final product. The inspection is carried out using various gages and measuring instruments. The duty of a metrologist involves not only the use of various methods of measurement to obtain the degree of accuracy desired, but also the design, construction and testing of various types of gages and measuring instruments.

1.2 The Need for Measurement and Inspection

Measurements are made for three main reasons. Firstly, measurements are made to make a product, regardless of whether the product is made by us or by someone else. Secondly, measurements are made to control the way others make the product. This applies to things as simple as ordering a window panel or as complex as producing millions of bolts and nuts. Thirdly, measurements are needed for scientific description. It is impossible to convey definite information about something, for example the temperature in a furnace or the speed of an object, without measurement.

Inspection is carried out to control the quality of products. In modern manufacturing, the production process of most products, such as cars, computers and electrical appliances, is broken down into the production of the individual components that make up the product. These processes may be carried out in the same factory or in different factories. All the components are then gathered and assembled to produce the final product. Therefore, if any two components are selected at random, it should be possible to assemble them easily and at the same time meet the product specifications. The dimensions of each component must be within the range set earlier to enable easy assembly of the product. This requires inspection of the dimensions. Thus, dimensional measurement and inspection play important roles in the quality control of a product. This is illustrated in Figure 1.1.

Figure 1.1. Manufacturing and inspection cycle: customer requirement/specification, product design, manufacture and quality inspection, with information from measurement fed back into the cycle.


The objective of dimensional control in any manufacturing process is not to achieve the exact dimension for a component, because it is not possible to produce all components with exactly the same dimensions. Random errors, such as those due to vibration present in the machine used to produce the component, will cause random fluctuations in the dimensions and limit the dimensional accuracy that can be achieved for each product. Thus, the main objective of measurement is to control the dimensions of a component so that they are within the limits defined by the designer of the component.

Dimensional inspection also assists in the design and development of modern measuring instruments that have high precision, accuracy and stability, such as those shown in Figure 1.2. These instruments enable a manufacturing industry to move from less precise machines to more precise ones. Dimensional inspection also enables the production of high-quality products using sophisticated, state-of-the-art processing techniques.

1.3 Calibration

Calibration is the comparison of a measuring instrument's reading with the reading of a standard instrument when the same input is given to both. Calibration is necessary after the instrument has been used for a long time because the reading can change over time due to several factors, such as deterioration of the internal components of the instrument. Calibration can be carried out in-house, or the instruments can be sent to specialized metrology laboratories that offer such services.
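As a minimal illustration (in Python, with hypothetical readings; the simple constant-offset correction is an assumption made for this sketch), calibration amounts to comparing the instrument's readings with those of a standard for the same inputs and deriving a correction:

    # Both instruments are given the same reference inputs; the mean
    # deviation of the instrument from the standard estimates the
    # correction to apply to subsequent readings.
    standard_readings = [10.000, 20.000, 30.000]    # standard instrument (mm)
    instrument_readings = [10.012, 20.015, 30.011]  # instrument under test (mm)

    deviations = [inst - std
                  for inst, std in zip(instrument_readings, standard_readings)]
    mean_deviation = sum(deviations) / len(deviations)

    print(f"Mean deviation: {mean_deviation:+.3f} mm")
    print(f"Correction to apply: {-mean_deviation:+.3f} mm")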

1.4 Precision, accuracy and error

Two very important terms in the science of measurement are precision and accuracy. The precision of a measuring instrument is the degree of repeatability of the readings when measurement of the same standard is repeated. Accuracy, meanwhile, is the degree of closeness of the measured values to the actual dimension.

The difference between precision and accuracy becomes clear when we refer to the example shown in Figure 1.3(a)-(c). Assume that we take six measurements on a standard block of dimension 20.00 mm using three different instruments. In Figure 1.3(a) the readings taken have small deviations from one another, but the mean reading deviates greatly from the actual dimension of 20.00 mm. This illustrates an instrument having high precision but low accuracy. Figure 1.3(b) shows the characteristics of an instrument having low precision but high accuracy. The precision is low because the readings deviate greatly from one another, but the accuracy is high because the mean value is close to the actual value. Figure 1.3(c) shows a case where the instrument has both high precision and high accuracy. The readings show little deviation from one another and the mean value is close to the actual dimension. A good instrument should have both high precision and high accuracy.

Figure 1.2. Modern measuring instruments: digital micrometer, digital height gage and profile projector. (Source: TESA)



We have to realize that the concept of precision involves a set of measurements and not a single measurement. In any set of measurements the readings will be scattered about the mean value. The closeness of each reading to the other readings indicates the precision of the instrument.

The examples shown in Figure 1.3(a)-(c) indicate that the precision of an instrument can be determined by finding how much the measurements deviate from one another. The common statistical parameter known as the standard deviation can be used to assess the precision of an instrument. A large standard deviation indicates an instrument with low precision, while a small standard deviation indicates an instrument with high precision. Notice that when the precision of an instrument is being assessed, no comparison is made with the actual dimension. For some instruments, such as the dial comparator, precision is more important than accuracy because the measurement is always made by comparison with a known standard.
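As an illustration (using the readings of Figure 1.3), the standard deviation can be computed without any reference to the actual dimension. A minimal Python sketch:

    import statistics

    # Repeated readings from Figure 1.3(a) and 1.3(b) (mm).
    readings_a = [22.02, 21.12, 22.32, 22.95, 22.41, 21.02]  # small scatter
    readings_b = [22.02, 19.32, 22.12, 23.95, 19.20, 18.90]  # large scatter

    # Sample standard deviation: the smaller the value, the higher the precision.
    for name, readings in [("(a)", readings_a), ("(b)", readings_b)]:
        print(f"Instrument {name}: s = {statistics.stdev(readings):.3f} mm")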

Accuracy, on the other hand, refers to the correction that needs to be made to the instrument reading in order to obtain the actual value. In practice, calibration during the manufacture of the instrument can reduce the correction to be made. However, after a long period of use the operating mechanism of the instrument will deteriorate, and re-calibration is required to restore the original accuracy.

The term error shown in Figure 1.3 is the difference between the mean of the readings and the actual dimension, i.e.

Error = measured value – actual value

It is common to take several measurements using an instrument and calculate the mean value. The actual value is then subtracted from the mean value to determine the error. The error can take positive or negative values. A positive error means that the measured value is higher than the actual value, while a negative error implies that the measured value is less than the actual value. When the error is low the instrument is said to have high accuracy, and vice-versa.
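The procedure can be written out as a short Python sketch, using the readings of Figure 1.3(a):

    import statistics

    readings = [22.02, 21.12, 22.32, 22.95, 22.41, 21.02]  # repeated readings (mm)
    actual = 20.00                                          # actual dimension (mm)

    mean_value = statistics.mean(readings)
    error = mean_value - actual  # positive error: the instrument reads high

    print(f"Mean = {mean_value:.2f} mm, error = {error:+.2f} mm")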

There are several differences between error and accuracy. These are summarized in Table 1.1. The accuracy of an instrument is expressed with positive and negative values, e.g. ±0.005 mm. The actual measurement obtained using the instrument lies anywhere between the lower and upper limits. For instance, if a dial caliper has an accuracy of ±0.005 mm and the reading shown is 0.020 mm, the measured value can lie anywhere between 0.015 and 0.025 mm.
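The arithmetic of this example, as a tiny Python sketch (values taken from the sentence above):

    reading = 0.020   # mm, value shown by the dial caliper
    accuracy = 0.005  # mm, stated accuracy of the instrument (plus or minus)

    print(f"Measured value lies between {reading - accuracy:.3f} "
          f"and {reading + accuracy:.3f} mm")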

Table 1.1. Differences between error and accuracy.

  Error                                    Accuracy
  -----                                    --------
  Associated with a single measurement     Associated with the instrument
  Known only after the measurement         Known before the measurement
  Either positive or negative              Has both positive and negative signs
  Can be reduced by careful measurement    An intrinsic part of the instrument


1.5 Relationship between accuracy and instrument cost

The basic objective of metrology is to design a measurement system of high accuracy at the lowest possible cost. In general, the cost of an instrument increases exponentially with the accuracy required, i.e. cost ∝ aⁿ, where a is the accuracy. The relationship between cost and accuracy is shown in Figure 1.4.

Figure 1.3. Difference between precision and accuracy: (a) high precision, low accuracy (readings: 22.02, 21.12, 22.32, 22.95, 22.41, 21.02 mm); (b) low precision, high accuracy (readings: 22.02, 19.32, 22.12, 23.95, 19.20, 18.90 mm); (c) high precision, high accuracy (readings: 21.15, 19.30, 20.22, 20.85, 21.05, 19.90 mm). In each case the actual value is 20.00 mm and the error is the difference between the mean of the readings and the actual value.


High instrument accuracy can be obtained if the sources of error caused by the following elements are reduced or eliminated:

(a) Calibration standard

Examples of factors affecting calibration standards are environmental effects, stability with respect to time and the elastic properties of materials. The most serious environmental effect is a difference between the temperature of the environment where the instrument is used and the standard temperature at which the instrument was calibrated. The calibration standard should also be stable with respect to time, i.e. no changes in dimensions should occur even after the standard has been in use for a long time. Elastic properties, especially of length standards that can deflect under their own weight, must be considered in accurate measurement.

(b) Workpiece being measured

Among the most important factors that influence the workpiece are environmental factors such as temperature, the condition of the workpiece surface and the elastic properties of the workpiece material. A workpiece surface that is not perfectly flat, or that is dirty, will influence the measurements taken. If the workpiece is made from a material of high elasticity, there is a possibility that the workpiece will deflect under the pressure applied by the measuring instrument. The deflection will affect the readings taken. This is known as loading error.

(c) Measuring instrument

The measuring instrument can be influenced by hysteresis effects, backlash, friction, zero drift error, inadequate amplification and calibration errors. A hysteresis effect is observed when the reading shown by the instrument for a given input differs depending on whether the input was reached in increasing or decreasing order. The result is a hysteresis loop that represents measurement error. The hysteresis error is the maximum separation between the increasing and decreasing readings, as illustrated in Figure 1.6.
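A minimal Python sketch of this definition (the increasing and decreasing readings are hypothetical):

    # Readings taken at the same input points, first with the input
    # increasing and then with it decreasing (mm).
    increasing = [0.000, 0.248, 0.497, 0.745, 0.998]
    decreasing = [0.006, 0.255, 0.508, 0.752, 1.000]

    # Hysteresis error: the maximum separation between the two series.
    hysteresis_error = max(abs(up - down)
                           for up, down in zip(increasing, decreasing))
    print(f"Hysteresis error = {hysteresis_error:.3f} mm")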

Backlash error usually occurs in instruments that use a screw mechanism to move a contacting point (stylus), such as a micrometer. The screw rotates slightly before moving the internal parts and the stylus. This small rotation causes error in the measurement. Zero drift error refers to the zero error that occurs when an instrument is used for a long time.

Figure 1.4. Relationship between accuracy and cost.


(d) Inspector taking the measurement

Errors caused by the inspector are usually due to a lack of practice in the operation and use of instruments, the choice of inappropriate instruments, and an attitude of not appreciating the importance of accuracy in the measurement process.

(e) Environmental factors

Environmental factors that influence measurement accuracy are usually temperature fluctuations, thermal expansion due to heat from lamps, vibration in the workplace or pressure gradients.

1.6 Resolution and accuracy

The resolution of an instrument refers to the smallest dimension that can be read using the instrument. It is given by the smallest graduation on the instrument's scale (Figure 1.7). The resolution of some instruments, such as a dial gage, depends on the distance between two graduations and the width of the needle. If the width of the needle is large compared to the distance between two graduations, the instrument will have a low resolution, because a small movement of the needle will not produce any noticeable change in the reading.
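As an illustration (assuming, for the sketch, a graduation of 0.01 mm), the displayed reading is the true value rounded to the nearest graduation, so any change smaller than the resolution goes unnoticed:

    RESOLUTION = 0.01  # mm, smallest scale graduation (assumed value)

    def displayed_reading(true_value: float) -> float:
        """Round a true value to the nearest scale graduation."""
        return round(true_value / RESOLUTION) * RESOLUTION

    print(f"{displayed_reading(20.0037):.2f} mm")  # 20.00 mm
    print(f"{displayed_reading(20.0082):.2f} mm")  # 20.01 mm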

The sensitivity of a measuring instrument is defined as the minimum input that will produce a detectable output. Sensitivity depends on the amplification present in the instrument. Normally, high sensitivity is needed to enable measurement with high precision. When the sensitivity and accuracy of a measuring instrument are high, its reliability is also high.

Figure 1.6. Hysteresis error (readings plotted against the input value for increasing and decreasing inputs).


1.7 Repeatability, reproducibility and uncertainty

The repeatability of a measuring instrument is defined as the closeness between the readings obtained from repeated measurements of the same quantity (or input) carried out under the same conditions. These conditions are known as repeatability conditions. Examples of repeatability conditions are the measurement procedure, the operator making the measurement, the measuring instrument used, the location where the measurement is carried out and the time duration between measurements. The time duration between measurements must be short so that the instrument does not deteriorate with time, for example by undergoing zero drift.

The reproducibility of an instrument is defined as the closeness between the readings obtained from repeated measurements of the same quantity but under different conditions. A statement of reproducibility requires specification of the condition that is changed. The condition that is changed may be the person making the measurement, the place where the measurement is made, the measurement procedure etc.
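A small Python sketch contrasting the two terms (hypothetical readings of the same 20 mm block; the changed condition here is the operator). In both cases the standard deviation serves as the measure of spread:

    import statistics

    # Five readings under unchanged (repeatability) conditions (mm).
    operator_1 = [20.003, 20.001, 20.004, 20.002, 20.003]
    # Five readings with one condition changed: a different operator (mm).
    operator_2 = [20.009, 20.012, 20.008, 20.011, 20.010]

    # Repeatability: spread of the readings under the same conditions.
    print(f"Repeatability:   s = {statistics.stdev(operator_1):.4f} mm")
    # Reproducibility: spread when the stated condition (the operator) changes.
    print(f"Reproducibility: s = {statistics.stdev(operator_1 + operator_2):.4f} mm")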

Uncertainty is defined as a parameter that characterizes the dispersion of the readings taken. In a statistical analysis of the data this parameter may be the standard deviation of the readings. The expression of uncertainty is given in more detail in Reference (1).

1.8 Tolerance

Tolerance is defined as the variation allowed in a dimension of a component. Unlike the other terms introduced in the foregoing sections, tolerance normally applies to the product and not to the measuring instrument. For instance, if the nominal size of a component is 20 mm and the manufacturing process produces parts that have a maximum size of 20.1 mm and a minimum size of 19.9 mm, the tolerance of the component is ±0.1 mm. Tolerances like this are unavoidable in manufacturing because of the difficulty of producing any two parts to exactly the same dimensions. A close tolerance normally requires precise machining and hence increases the manufacturing cost. Therefore, close tolerances must be avoided unless they are required to ensure that the component functions as intended.
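A minimal Python sketch of a tolerance check, using the 20 mm ±0.1 mm example above (the measured values are hypothetical):

    nominal = 20.0    # mm, nominal size
    tolerance = 0.1   # mm, allowed variation either side of the nominal size

    lower_limit = nominal - tolerance  # 19.9 mm
    upper_limit = nominal + tolerance  # 20.1 mm

    for measured in [19.95, 20.08, 20.13]:  # hypothetical measured parts (mm)
        status = "accept" if lower_limit <= measured <= upper_limit else "reject"
        print(f"{measured:.2f} mm -> {status}")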

Figure 1.7. Resolution of a dial indicator (resolution = 0.01 in).


Review Questions

Question 1.1
The figure below shows two types of dial gages. Which of the given statements best describes the difference between these gages?

I. The precision of A is higher than the precision of B.
II. The accuracy of A is higher than the accuracy of B.
III. The sensitivity of A is higher than the sensitivity of B.
IV. The resolution of A is higher than the resolution of B.

Question 1.2
The precision and accuracy of three digital calipers A, B and C were compared by calculating the mean and standard deviation of a set of readings taken using each caliper. Each set of readings was recorded by repeating the measurement on a 20 mm gage block five times. The readings are tabulated as follows:

  Caliper   Readings (mm)
  A         20.005, 20.080, 19.995, 20.016, 20.055
  B         20.151, 20.155, 20.149, 20.145, 20.160
  C         19.233, 19.440, 20.002, 21.024, 19.028

Determine which caliper has

(i) the highest accuracy
(ii) the highest precision
(iii) the lowest accuracy
(iv) the lowest precision

Use the statistical functions on your calculator, or calculate manually using the given formulae.

[Given that for a set of readings $x_1, x_2, x_3, \ldots, x_n$, the mean and standard deviation are given by $\bar{x} = \frac{x_1 + x_2 + x_3 + \cdots + x_n}{n}$ and $\sigma = \sqrt{\frac{\sum (x_i - \bar{x})^2}{n - 1}}$.]
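As an alternative to a calculator, the means and standard deviations can be computed with a short Python script (a sketch using the standard statistics module; the comparison logic follows the definitions in Section 1.4):

    import statistics

    readings = {
        "A": [20.005, 20.080, 19.995, 20.016, 20.055],
        "B": [20.151, 20.155, 20.149, 20.145, 20.160],
        "C": [19.233, 19.440, 20.002, 21.024, 19.028],
    }
    actual = 20.000  # mm, dimension of the gage block

    for name, values in readings.items():
        mean = statistics.mean(values)
        s = statistics.stdev(values)  # sample standard deviation (n - 1 divisor)
        error = mean - actual         # small |error| -> high accuracy
        print(f"Caliper {name}: mean = {mean:.4f} mm, "
              f"s = {s:.4f} mm, error = {error:+.4f} mm")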

(Figure for Question 1.1: two dial gages, Gage A and Gage B.)