When you make a measurement, it is important to know how accurate it is. Use relative error to express accuracy as a relative quantity.
Step 1: Understand relative error Understand that relative error is an indication of how accurate a measurement is relative to the size of what is being measured.
Step 2: Make a measurement Make a measurement. You might, for example, decide to measure the length of an object or its weight.
Step 3: Determine accuracy Determine how accurate your measurement is. For example, if you use a meter stick to measure an object's length, you might reasonably estimate the accuracy of the measurement to be plus or minus 1 millimeter. Accuracy can be converted to absolute error.
Step 4: Calculate absolute error Express the accuracy in the same units as the measurement; the result is the absolute error.
TIP: For example, if an object determined to be 1 meter long is measured to within an accuracy of plus or minus 1 millimeter, the absolute error is 1 millimeter times 1 meter divided by 1,000 millimeters, or .001 meter.
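The unit conversion in the tip above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original article; the names (MM_PER_M, accuracy_mm) are invented for the example.

```python
# Step 4 sketch: convert an accuracy of plus or minus 1 millimeter
# into absolute error in meters, matching the measurement's units.
MM_PER_M = 1000  # millimeters in one meter

accuracy_mm = 1  # accuracy: plus or minus 1 millimeter
absolute_error_m = accuracy_mm / MM_PER_M  # 1 mm expressed in meters

print(absolute_error_m)  # 0.001
```

Once the accuracy and the measurement share units, the division in Step 5 is straightforward.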
Step 5: Calculate relative error Calculate the relative error by dividing the absolute error by the measured value of the object. In the example given, the relative error is .001 meter divided by 1 meter or .001. It's that easy!
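The full calculation from the example can be sketched as a small helper function. This is an illustrative sketch, assuming the values from the meter-stick example; the function name is invented here.

```python
def relative_error(absolute_error: float, measured_value: float) -> float:
    """Return the relative error: absolute error divided by the measured value."""
    return absolute_error / measured_value

# Values from the example: a 1-meter object measured to within 1 millimeter.
measured_m = 1.0
abs_error_m = 0.001  # 1 millimeter expressed in meters

print(relative_error(abs_error_m, measured_m))  # 0.001
```

Because relative error is a ratio of two quantities in the same units, the result is dimensionless, which is what lets you compare the accuracy of measurements of very different sizes.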
FACT: The biblical cubit was the length of a man's forearm or the distance from the tip of the elbow to the end of his middle finger.