- Instrument error
Instrument error refers to the combined accuracy and precision of a measuring instrument, or to the difference between the actual value and the value indicated by the instrument (the error). Measuring instruments are usually calibrated at some regular interval against a standard. The most rigorous standard is one maintained by a standards organization such as NIST in the United States, or the ISO in European countries.

In physics, however, precision, accuracy, and error are computed from the instrument and the measurement data. Precision is one half of the granularity of the instrument's measurement capability, and is limited to the number of significant digits of the coarsest instrument or constant in a sequence of measurements and computations. Error is plus or minus the granularity of the instrument's measurement capability, and error magnitudes add together when several measurements are combined to calculate a quantity. When a calculation from a measurement is reported to a specific number of significant digits, any rounding must be done properly.

Accuracy may be determined by making multiple measurements of the same quantity with the same instrument and then computing a statistic of the results, or it might mean, for example, that a five-pound weight is measured on a scale and the difference between five pounds and the measured weight is taken as the accuracy. The second definition ties accuracy to calibration, while the first does not.
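As a minimal sketch of the precision and error rules above, the following assumes a hypothetical ruler whose finest graduation is 1 mm; the instrument, the readings, and the variable names are illustrative assumptions, not part of the definition.

```python
# Sketch of the precision and error rules described above, assuming a
# hypothetical ruler whose finest graduation (granularity) is 1 mm.

GRANULARITY_MM = 1.0               # smallest division the instrument resolves

precision_mm = GRANULARITY_MM / 2  # precision: half the granularity
error_mm = GRANULARITY_MM          # error: +/- one granularity

# Two lengths measured with the same ruler (values are illustrative).
length_a_mm = 123.0
length_b_mm = 47.0

# When measurements are combined, the error magnitudes add together.
total_mm = length_a_mm + length_b_mm
total_error_mm = error_mm + error_mm

print(f"Length A: {length_a_mm} +/- {error_mm} mm")
print(f"Length B: {length_b_mm} +/- {error_mm} mm")
print(f"Sum:      {total_mm} +/- {total_error_mm} mm (precision {precision_mm} mm)")
```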
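The two notions of accuracy can likewise be sketched; the reference value, the repeated readings, and the use of the mean as the summarizing statistic are assumptions made for illustration.

```python
from statistics import mean

# Hypothetical readings of a known five-pound reference weight on one scale.
reference_lb = 5.0
readings_lb = [5.02, 4.98, 5.01, 5.03, 4.99]

# First definition: accuracy from repeated measurements of the same thing,
# summarized with a statistic (here the mean, chosen as an assumption).
mean_reading = mean(readings_lb)

# Second definition: accuracy as the difference between the known value and
# the measured value, which ties accuracy to calibration against a standard.
accuracy_offset = mean_reading - reference_lb

print(f"Mean of repeated readings: {mean_reading:.3f} lb")
print(f"Offset from the 5 lb reference: {accuracy_offset:+.3f} lb")
```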