
Fayoum University

Faculty of Engineering

Industrial Engineering Dept.

Lecture (1)

on

Concepts of Measurement

By

Dr. Emad M. Saad

Industrial Engineering Dept.

Faculty of Engineering

Fayoum University

2014 - 2015


Lecture (1) – Mechanical Measurements – 2nd year – Industrial.

Course Syllabus

Course Name: Mechanical Measurements – 2nd year – Industrial Dept.

Course Outline: 1. Engineering measurements

2. Linear measurements

3. Measuring lengths

4. Measuring angles

5. Scale marginal

6. Comparison

7. Straightness, flatness, roughness, and roundness tests

8. Measuring screw threads and gears

9. Interference, optical devices, and their applications in metrology

10. Uses of the laser in metrology

11. Three-dimensional measurements

12. Computer-aided measurement methods


Course prerequisites: 1. Mechanical Engineering Principles.

2. Physics Principles.

3. Statistics Principles.

Text Books: Lecture notes

References: Experimental Methods for Engineers - Eighth Edition - J.P. Holman

Evaluation: 1. Homework, attendance, and assignments: 20%.

2. Midterm exam: 20%.

3. Final exam: 60%.

Professor: Dr. Emad M. Saad

Facebook: DrEmad Elasid

Email: ems03@fayoum.edu.eg

Office Hours: Thursday 10:00 - 15:30

Wednesday 09:00 - 15:30 or by Appointment


Technical Terms

Measurement
Measurement is the act, or the result, of a quantitative comparison between a predetermined standard and an unknown magnitude.

Range
It represents the highest possible value that can be measured by an instrument.

Scale sensitivity
It is defined as the ratio of a change in scale reading to the corresponding change in pointer deflection. It actually denotes the smallest change in the measured variable to which an instrument responds.
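As a numeric illustration of the sensitivity definition above, a minimal Python sketch (all readings are hypothetical, not taken from any real instrument):

```python
def scale_sensitivity(delta_reading, delta_deflection):
    """Ratio of a change in scale reading to the corresponding
    change in pointer deflection, per the definition above."""
    return delta_reading / delta_deflection

# Hypothetical instrument: the scale reading changes by 0.5 units
# while the pointer deflects 2.0 mm.
print(scale_sensitivity(0.5, 2.0))  # 0.25 units per mm
```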

True or actual value
It is the actual magnitude of a signal input to a measuring system, which can only be approached and never evaluated.


Accuracy
It is defined as the closeness with which the reading approaches an accepted standard value or true value.

Precision
It is the degree of reproducibility among several independent measurements of the same true value under specified conditions. It is usually expressed in terms of deviation in measurement.

Repeatability
It is defined as the closeness of agreement among a number of consecutive measurements of the output for the same value of input under the same operating conditions. It may be specified in terms of units for a given period of time.
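The distinction between accuracy and precision can be sketched numerically. In this hypothetical gauge-block check (the true value and all readings are made up for illustration), the offset of the mean from the true value reflects accuracy, while the spread among repeated readings reflects precision:

```python
import statistics

true_value = 10.000  # mm, accepted standard value (hypothetical)
readings = [10.02, 10.01, 10.03, 10.02, 10.01]  # repeated measurements

mean_reading = statistics.mean(readings)
accuracy_error = mean_reading - true_value  # closeness to the true value
precision = statistics.stdev(readings)      # spread among repeated readings

print(f"mean = {mean_reading:.3f} mm, offset = {accuracy_error:+.3f} mm, "
      f"spread = {precision:.4f} mm")
```

A set of readings can be precise (small spread) yet inaccurate (large offset), or vice versa; only calibration against a standard reveals the offset.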


Reliability
It is the ability of a system to perform and maintain its function in routine circumstances. The consistency of a set of measurements, or of a measuring instrument, is often used to describe the reliability of a test.

Systematic Errors
A constant, uniform deviation in the operation of an instrument is known as a systematic error. Instrumental error, environmental error, and observational error are all systematic errors.

Random Errors
Some errors remain even after systematic and instrument errors have been reduced or at least accounted for. The causes of such errors are unknown, and hence the errors are called random errors.

Calibration
Calibration is the process of determining and adjusting an instrument's accuracy to make sure it is within the manufacturer's specifications.
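One common form of the calibration process described above is a two-point (gain and offset) adjustment against known standards. The sketch below is a hypothetical illustration, not a prescribed procedure; the standard values and raw readings are invented:

```python
def two_point_calibration(raw_lo, raw_hi, std_lo, std_hi):
    """Gain and offset that map raw instrument readings onto the
    values of two known calibration standards."""
    gain = (std_hi - std_lo) / (raw_hi - raw_lo)
    offset = std_lo - gain * raw_lo
    return gain, offset

# Hypothetical: the instrument reads 0.10 against a 0.00 mm standard
# and 10.30 against a 10.00 mm standard.
gain, offset = two_point_calibration(0.10, 10.30, 0.00, 10.00)
corrected = gain * 5.25 + offset  # correct a mid-range raw reading
```

After the correction, readings at the two standard points land exactly on the standard values; readings in between are corrected on the assumption that the instrument's error varies linearly.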


Methods of Measurements

1. Direct method

2. Indirect method

3. Absolute or Fundamental method

4. Comparative method

5. Transposition method

6. Coincidence method

7. Deflection method

8. Complementary method

9. Contact method

10. Contactless method


Generalized Measurement System


Measurement Standards

1. Calibration standards: working standards of industrial or governmental laboratories.

2. Metrology standards: reference standards of industrial or governmental laboratories.

3. National standards: these include the prototypes and natural phenomena of SI (the International System of Units).


Factors affecting the accuracy of the Measuring System

1. Standard

2. Workpiece

3. The inherent characteristics of the instrument

4. Person

5. Environment
