Approximation and Errors

Significant figures:

The term significant figures refers to the number of meaningful digits (0 through 9 inclusive) in the coefficient of an expression written in scientific notation. The number of significant figures in an expression indicates the confidence or precision with which an engineer or scientist states a quantity.

The table below shows several examples of numbers written in standard decimal notation (first column) and in scientific notation (second column). The third column shows the number of significant figures in the corresponding expression in the second column.

Decimal expression    Scientific notation      Sig. figs.
1,222,000.00          1.222 x 10^6              4
                      1.22200000 x 10^6         9
0.00003450000         3.45 x 10^-5              3
                      3.450000 x 10^-5          7
-9,876,543,210        -9.87654 x 10^9           6
                      -9.876543210 x 10^9      10
-0.0000000100         -1 x 10^-8                1
                      -1.00 x 10^-8             3
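
Because every digit written in the coefficient is significant, counting significant figures in a scientific-notation expression amounts to counting the digits of the coefficient. The short Python sketch below (the helper name count_sig_figs is just illustrative) checks the rows of the table this way:

```python
def count_sig_figs(coefficient: str) -> int:
    """Count significant figures in the coefficient of a scientific-notation
    expression. Every digit in the coefficient is significant, including
    trailing zeros, so we count digit characters and ignore the sign and
    the decimal point."""
    return sum(ch.isdigit() for ch in coefficient)

# (coefficient, expected significant figures) taken from the table above
rows = [("1.222", 4), ("1.22200000", 9), ("3.45", 3), ("3.450000", 7),
        ("-9.87654", 6), ("-9.876543210", 10), ("-1", 1), ("-1.00", 3)]

for coeff, expected in rows:
    assert count_sig_figs(coeff) == expected
print("All rows of the table check out.")
```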
After a calculation is carried out, the result is rounded off to an appropriate number of significant figures. In any calculation, the number of significant figures in the result must be equal to, or less than, the number of significant figures in the least precise expression or element. Consider the following product:
(2.56 x 10^67) x (-8.33 x 10^-54)
To obtain the product of these two numbers, the coefficients are multiplied, and the powers of 10 are added. This produces the following result:
2.56 x (-8.33) x 10^(67 + (-54))
= 2.56 x (-8.33) x 10^(67 - 54)
= -21.3248 x 10^13

The proper form of common scientific notation requires that the absolute value of the coefficient be at least 1 and less than 10. Thus, the coefficient in the above expression should be divided by 10 and the power of 10 increased by one, giving:
-2.13248 x 10^14
Because both multiplicands in the original product are specified to only three significant figures, a scientist or engineer will round off the final expression to three significant figures as well, yielding:
-2.13 x 10^14
as the product.
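
As a rough check, the same arithmetic can be reproduced in a few lines of Python. The helper round_sig below is a hypothetical utility for rounding a value to a given number of significant figures; it is a minimal sketch, not a substitute for tracking significant figures by hand:

```python
from math import floor, log10

def round_sig(x: float, sig: int) -> float:
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    exponent = floor(log10(abs(x)))       # power of 10 of the leading digit
    return round(x, sig - 1 - exponent)

a = 2.56e67       # first factor from the worked example
b = -8.33e-54     # second factor

product = a * b                    # roughly -2.13248e14 before rounding
rounded = round_sig(product, 3)    # keep three significant figures

print(f"{product:.6e}")   # -2.132480e+14
print(f"{rounded:.2e}")   # -2.13e+14
```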

Accuracy and Precision:

Definition of Accuracy

There are two common definitions of the term accuracy. In math, science, and engineering, accuracy refers to how close a measurement is to the true value.
The ISO (International Organization for Standardization) applies a stricter definition, in which accuracy refers to a measurement with both true and consistent results. By the ISO definition, an accurate measurement has no systematic error and no random error. Essentially, the ISO advises that the term accurate be used only when a measurement is both correct and repeatable, that is, both accurate in the everyday sense and precise.

Definition of Precision

Precision is how consistent results are when measurements are repeated. Precise values differ from one another only because of random error, which is a form of observational error.

Examples of Accuracy and Precision

You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy. If he doesn't make many baskets, but always strikes the same portion of the rim, he has a high degree of precision. A player whose free throws always go through the basket in exactly the same way has a high degree of both accuracy and precision.
Take experimental measurements for another example of precision and accuracy. If you measure the mass of a 50.0-gram standard sample and get values of 47.5, 47.6, 47.5, and 47.7 grams, your scale is precise, but not very accurate. If your scale gives you values of 49.8, 50.5, 51.0, and 49.6 grams, it is more accurate than the first scale, but not as precise. The more precise scale would be better to use in the lab, provided you made an adjustment for its error.
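
The difference between the two scales can be made concrete with a few lines of Python (a minimal sketch using the measurement values from the paragraph above): the bias of the mean estimates accuracy, while the standard deviation of the readings estimates precision.

```python
from statistics import mean, stdev

true_mass = 50.0                      # grams, the standard sample

scale_a = [47.5, 47.6, 47.5, 47.7]    # precise but not accurate
scale_b = [49.8, 50.5, 51.0, 49.6]    # more accurate but less precise

for name, readings in [("Scale A", scale_a), ("Scale B", scale_b)]:
    bias = mean(readings) - true_mass   # systematic error -> accuracy
    spread = stdev(readings)            # random scatter  -> precision
    print(f"{name}: mean error = {bias:+.2f} g, spread = {spread:.2f} g")

# Scale A shows a large bias (about -2.4 g) but tiny spread (about 0.1 g);
# Scale B shows a small bias (about +0.2 g) but larger spread (about 0.6 g).
```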

Mnemonic To Memorize the Difference

An easy way to remember the difference between accuracy and precision is:
  • ACcurate is Correct. (or Close to real value)
  • PRecise is Repeating. (or Repeatable)

Accuracy, Precision, and Calibration

Do you think it is better to use an instrument that records accurate measurements or one that records precise measurements? If you weigh yourself on a scale three times and each time the number is different, yet close to your true weight, the scale is accurate.
Yet, it might be better to use a scale that is precise, even if it is not accurate. In this case, all of the measurements would be very close to each other and "off" from the true value by about the same amount. This is a common issue with scales, which often have a "tare" button to zero them.
While scales and balances may allow you to tare or make an adjustment to make measurements both accurate and precise, many instruments require calibration. A good example is a thermometer. Thermometers often read more reliably within a certain range and give increasingly inaccurate (but not necessarily imprecise) values outside of that range. To calibrate an instrument, record how far off its measurements are from known or true values. Keep a record of the calibration to ensure proper readings. Many pieces of equipment require periodic calibration to ensure accurate and precise readings.
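
As an illustration of that record-keeping step, here is a minimal sketch of a one-point (offset) calibration in Python. The reference temperatures and thermometer readings are made-up values, and the corrected helper is hypothetical; a real instrument may need a multi-point or non-linear correction:

```python
from statistics import mean

reference = [0.0, 25.0, 50.0, 75.0, 100.0]   # known true values (degrees C)
readings  = [1.2, 26.1, 51.3, 76.2, 101.1]   # what the thermometer reports

# Record how far off the instrument is, on average, from the known values,
# then subtract that offset from future readings.
offset = mean(r - t for r, t in zip(readings, reference))

def corrected(reading: float) -> float:
    """Apply the recorded calibration offset to a raw reading."""
    return reading - offset

print(f"offset = {offset:.2f} degrees")            # 1.18
print(f"corrected(51.3) = {corrected(51.3):.2f}")  # 50.12
```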

Different Types of Errors 
