Basic Concepts of Measurements
Need for Measurement
To ensure that the part to be measured conforms to the established standard.
To meet the interchangeability of manufacture.
To provide customer satisfaction by ensuring that no faulty product reaches the customers.
To coordinate the functions of quality control, production, procurement & other departments of the organization.
To judge the possibility of making some of the defective parts acceptable after minor repairs.
Precision & Accuracy of Measurement
Precision : It is the degree to which identically performed measurements agree with each other. It is the repeatability of the measuring process. It carries no meaning for a single measurement; it exists only when a set of observations is gathered for the same quantity under identical conditions. In such a set, the observations scatter about a mean, and the less the scatter, the more precise the measurement.
Accuracy : It is the degree of agreement between the measured value and its true value. The difference between the measured value & the true value is known as the ‘error of measurement’. Accuracy is the quality of conformity.
To distinguish precision from accuracy, consider a simple example: a stopped watch gives precise readings (the same time, every time it is read) but gives accurate readings (the correct time) only twice a day.
Of the two, precision is the essential requirement of a measuring process, even though accuracy is what is ultimately sought. Achieving high precision is easier & cheaper than achieving high accuracy. If the measuring instrument is highly precise and has been calibrated for its error, the true value can easily be obtained from the measured average value by deducting the instrument error. So, considering the cost and reliability of the measuring instrument, a high-precision instrument is preferred over a high-accuracy one.
However, which of the two, precision or accuracy, is more vital depends on the situation. For example, for a carpenter entrusted with the job of fitting a shelf into a cupboard, precision is more important. This can be achieved as long as he uses the same scale to measure both the cupboard & the board for the shelf; it hardly matters whether his scale is accurate or not. If, however, such a board is to be ordered pre-cut from outside, accuracy becomes more vital than precision: he must measure the size of the cupboard very accurately before placing the order.
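To make the distinction concrete, here is a minimal sketch in Python; the readings and the true value are invented purely for illustration. It quantifies precision as the scatter of repeated readings about their mean, and accuracy as the closeness of that mean to the true value.

    import statistics

    true_value = 25.000   # mm, conventional true value of the dimension (assumed)
    readings = [25.112, 25.108, 25.110, 25.109, 25.111]   # repeated readings, mm (assumed)

    mean = statistics.mean(readings)
    scatter = statistics.stdev(readings)   # small scatter => high precision
    bias = mean - true_value               # small bias => high accuracy

    print(f"mean = {mean:.3f} mm, scatter = {scatter:.4f} mm, bias = {bias:+.3f} mm")

Here the readings cluster tightly (high precision) yet all sit about +0.11 mm away from the true value (poor accuracy), much like the stopped watch above.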
Reliability of Measurement
If a measuring instrument is not precise, it will give different values for the same dimension when measured again and again, and such an instrument is considered untrustworthy. The first and fundamental requirement of any good measuring instrument is that it should have adequate repeatability or precision. An instrument that gives precise (repeatable) values every time is far more reliable than one that occasionally gives accurate (true) values but is not precise. A precise value can easily be converted into an accurate value by taking the constant error of the precision instrument into account.
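As a small sketch of this correction in Python (the constant error and the reading are assumed values, with the error taken as known from calibration):

    known_constant_error = +0.110    # mm, instrument's constant error found by calibration (assumed)
    measured_average = 25.110        # mm, average of repeated precise readings (assumed)

    accurate_value = measured_average - known_constant_error
    print(accurate_value)            # 25.000 mm: the precise reading made accurate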
Terms in Measurement
1) Constant of a measuring instrument: The factor by which the indication of the instrument shall be multiplied to obtain the result of measurement.
2) Nominal value of a physical measure: The value of the quantity reproduced by the physical measure, as indicated on that measure.
3) Conventional true value of a physical measure: The value of the quantity reproduced by the physical measure, determined by a measurement carried out with the help of measuring instruments, which show a total error which is practically negligible.
4) Standard: It is the physical embodiment of a unit. For every kind of quantity to be measured, there should be a unit to express the result of the measurement & a standard to enable the measurement.
5) Calibration: It is the process of determining the values of the quantity being measured corresponding to a pre-established arbitrary scale. It is, in effect, the measurement of the measuring instrument itself. The quantity to be measured is the ‘input’ to the measuring instrument.
The ‘input’ affects some ‘parameter’, which is the ‘output’ that is read out. The amount of ‘output’ is governed by that of the ‘input’. Before any instrument can be read, a ‘scale’ must be framed for the ‘output’ by successive application of already-standardised input signals. This process is known as ‘calibration’ (see the sketch after this list).
6) Sensitivity of instrument: The ability of the instrument to detect small variation in the input signal.
7) Readability of instrument: The susceptibility of a measuring instrument to having its indications converted to a meaningful number. It implies the ease with which observations can be made accurately.
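As a minimal sketch of calibration in Python (the standard inputs and instrument outputs are invented for illustration): known standard inputs are applied, the raw outputs are recorded, and a straight-line scale is fitted so that later raw readings can be converted back into values of the measured quantity.

    import numpy as np

    # Standardised input signals applied to the instrument (assumed values, mm)
    standard_inputs = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
    # Raw outputs read from the instrument for those inputs (assumed, arbitrary units)
    observed_outputs = np.array([0.10, 10.30, 20.40, 30.55, 40.70])

    # Fit output = slope * input + offset; the fitted line is the calibration 'scale'
    slope, offset = np.polyfit(standard_inputs, observed_outputs, 1)

    def to_measured_value(raw_output):
        # Invert the scale to turn a raw output into a result of measurement
        return (raw_output - offset) / slope

    print(to_measured_value(25.5))   # a later raw reading converted back to mm

Note that the fitted slope is also the instrument's sensitivity (output change per unit change of input), and its reciprocal plays the role of the constant of the instrument defined in item 1.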
Standards of Measurement
a) FPS System: In this system, the units of length, mass, time & temperature are the Foot (or Yard), Pound (or Slug), Second & Rankine (or Fahrenheit) respectively. It is common in English-speaking countries and was developed in Britain.
b) Metric System: It is a decimal system of weights & measures based on the Metre as the unit of length. It was first used in France.
CGS prescribes the Centimetre, Gram & Second for length, mass & time respectively.
MKS prescribes the Metre, Kilogram & Second for length, mass & time respectively.
MKSA (Giorgi) system added Ampere, the unit of electrical current to MKS system.
c) SI system: In 1960, the General Conference on Weights & Measures (CGPM) formally gave the MKSA system the title ‘Système International d’Unités’, abbreviated ‘SI’ (also called the International System of Units). In SI, the main departure from the traditional metric system is the use of the ‘Newton’ as the unit of force. India, by Act of Parliament No. 89 of 1956, switched over to the SI system.
Classification of Standards
1) Line & End Standards: In a Line standard, the length is the distance between the centres of two engraved lines, whereas in an End standard it is the distance between the end faces of the standard. Examples: a measuring scale is a line standard; a slip (block) gauge is an end standard.
2) Primary, Secondary, Tertiary & Working Standards:
Primary standard: It is a single material standard, preserved under the most careful conditions and used only for comparison with Secondary standards.
Secondary standard: It resembles the Primary standard as nearly as possible and is distributed to a number of places for safe custody; it is used for occasional comparison with Tertiary standards.
Tertiary standard: It is used for reference purposes in laboratories and workshops and for comparison with Working standards.
Working standard: It is used daily in laboratories and workshops; lower grades of material may be used.
Errors in Measurement
Error in measurement is the difference between the measured value and the true value of the measured dimension.
Error in measurement = Measured value – True value
The error in measurement may be expressed as an absolute error or as a relative error.
1) Absolute error: It is the algebraic difference between the measured value and the true value of the quantity measured. It is further classified as:
a) True absolute error: It is the algebraic difference between the measured average value and the conventional true value of the quantity measured.
b) Apparent absolute error: It is the algebraic difference between one of the measured values of the series of measurements and the arithmetic mean of all measured values in that series.
2) Relative error: It is the quotient of the absolute error and the value of comparison (which may be true value, conventional true value or arithmetic mean value of a series of measurements) used for the calculation of that absolute error.
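As a worked sketch of these error definitions in Python (the measurement series and the conventional true value are invented for illustration):

    measurements = [10.02, 10.05, 10.03, 10.04, 10.06]   # a series of readings, mm (assumed)
    conventional_true_value = 10.00                       # mm (assumed)

    mean = sum(measurements) / len(measurements)

    # True absolute error: measured average value minus conventional true value
    true_absolute_error = mean - conventional_true_value

    # Apparent absolute error of one reading: that reading minus the series mean
    apparent_absolute_error = measurements[0] - mean

    # Relative error: absolute error divided by the value of comparison
    relative_error = true_absolute_error / conventional_true_value

    print(true_absolute_error, apparent_absolute_error, relative_error)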
Causes of Errors
1) Errors due to deflection (errors of supports): When long bars are supported as beams, they get deformed or deflected. This elastic deformation occurs because a long bar supported at two points sags under its own weight. The amount of deflection depends upon the positions of the supports. This problem was considered by Sir G.B. Airy, who showed that the positions of the supports can be arranged to give a minimum error. The slope and deflection at any point can be calculated from the theory of bending.
Two conditions are considered for a bar of length L resting on two supports placed equidistant from the centre, a distance ‘l’ apart. For no slope at the ends (the Airy points), l = 0.577 L, which is suitable for line standards and end bars. For minimum deflection, l = 0.554 L, which is suitable for straight edges. A small sketch of these rules follows.
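A minimal sketch of the support-position rules in Python (the bar length is an assumed example):

    L = 1000.0                      # mm, length of the bar (assumed)

    l_airy = 0.577 * L              # support spacing for no slope at the ends (Airy points)
    l_min_deflection = 0.554 * L    # support spacing for minimum deflection

    # Distance of each support from the nearer end of the bar
    print((L - l_airy) / 2)             # ~211.5 mm for the Airy condition
    print((L - l_min_deflection) / 2)   # ~223.0 mm for minimum deflection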
2) Errors due to misalignment: Abbe’s principle of alignment should be followed in measurements to avoid cosine errors, sine errors, etc. According to Abbe’s principle, “the axis or line of measurement of the part being measured should coincide with the line of the measuring scale or the axis of measurement of the measuring instrument”.
A combined cosine and sine error occurs if the micrometer axis is not truly perpendicular to the axis of the workpiece.
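As a sketch of the simple cosine-error case in Python (the indicated length and misalignment angle are assumed examples): if the measuring axis is inclined at a small angle theta to the dimension being measured, the indicated reading L overstates the true dimension L cos(theta), so the cosine error is L(1 - cos(theta)).

    import math

    L_indicated = 100.0    # mm, reading shown by the instrument (assumed)
    theta_deg = 1.0        # degrees of misalignment (assumed)

    theta = math.radians(theta_deg)
    true_length = L_indicated * math.cos(theta)
    cosine_error = L_indicated * (1 - math.cos(theta))

    print(true_length, cosine_error)   # error is only ~0.015 mm for a 1 degree tilt,
                                       # but it grows rapidly with the angle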