# Accuracy and precision


Accuracy and precision are often used interchangeably, but they have distinct engineering definitions. Indeed, without additional terminology, both terms are only qualitative.

In common use, however, the distinction may be easier to grasp by imagining several people, at varying levels of sobriety, playing darts at a bar. The first person steps up, throws, and puts four darts into the dead center of the target, all so close together that a card could not slide between them. She is both accurate and precise.

The second person throws the first dart, and the room goes dark. He continues to throw, and when the bartender turns on a flashlight, it reveals that the darts missed the target completely but riddled the light switch, causing the circuit breaker to trip when the wires shorted together. He was precise but not accurate.

The third dart-thrower throws, putting one dart above the target, transfixing a potted fern with another, sliding one down the neck of a beer bottle lifted by a now-terrified patron, and finishing the performance by shattering a bottle of very expensive brandy. She is neither accurate nor precise.

The fourth member of the team hits the target, placing the four darts at the extreme top, left, bottom, and right edges of the target's center, each dart just barely making the maximum score. He is accurate, but not precise.

## Accuracy

According to the U.S. National Institute of Standards and Technology, the term accuracy is qualitative. The proper term is accuracy of measurement, defined as the "closeness of the agreement between the result of a measurement and the value of the measurand". The measurement is the output of the instrument, and the measurand is its true value. [1]

Again, in the formal sense, the use of a numerical value for accuracy is deprecated; the more appropriate term is a measure of uncertainty. It is correct to speak of a standard uncertainty of 3.4 units, but not of "an accuracy of 3.4 units."

### Repeatability (of results of measurements)

Repeatability is the closeness of the agreement between the results of successive measurements of the same measurand carried out under the same conditions of measurement. The repeatability conditions include the procedure with which the measurement is done, the person or device controlling the measurement, the measuring instrument, the place at which the measurements are made, and the measurements being made over a short period of time.

To quantify repeatability, use standard measures of statistical dispersion, such as the standard deviation of the repeated results.
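As a minimal sketch of this quantification, the following uses Python's standard `statistics` module on a set of hypothetical repeated readings; the values are invented for illustration, not drawn from any real instrument.

```python
import statistics

# Ten hypothetical repeated readings of the same measurand, taken
# under identical conditions (same procedure, operator, instrument,
# place, and a short time span), in arbitrary units.
readings = [10.02, 9.98, 10.01, 10.00, 9.99, 10.03, 9.97, 10.01, 10.00, 9.99]

mean = statistics.mean(readings)     # central value of the results
spread = statistics.stdev(readings)  # sample standard deviation: dispersion

print(f"mean = {mean:.3f}, standard deviation = {spread:.3f}")
# prints: mean = 10.000, standard deviation = 0.018
```

A smaller standard deviation indicates better repeatability; it says nothing about accuracy, since the true value could lie far from the mean.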

### Reproducibility (of results of measurements)

Reproducibility is subtly different from the tightly controlled conditions of repeatability. Here the value of interest is the closeness of agreement between measurements of the same measurand carried out under changed conditions of measurement. Hypothetically, one might measure temperature with a thermometer using the height of a column of liquid, with a thermistor or other temperature-sensitive electronic component, or by the curvature of a bimetallic strip. If any of the instrument displays are analog rather than digital, the same instrument should also be read by different observers.

### Error (of measurement)

Error of measurement is the value of the measurement minus the true measurand value.

This principle can be extended to determining the error of the device, when a reference standard of known value is measured under repeated and consistent conditions.
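The extension to device error can be sketched as follows: a reference standard of known value is measured repeatedly, each error is the measured value minus the true value, and the mean error estimates the instrument's systematic bias. All values here are hypothetical.

```python
import statistics

true_value = 50.000  # certified value of a hypothetical reference standard
measurements = [50.12, 50.09, 50.11, 50.10, 50.08]  # repeated readings

# Error of each measurement: measured value minus true measurand value.
errors = [m - true_value for m in measurements]

# The mean error estimates the instrument's systematic bias.
bias = statistics.mean(errors)

print(f"estimated instrument bias = {bias:+.3f}")
# prints: estimated instrument bias = +0.100
```

A consistent positive or negative bias like this one can then be corrected for in subsequent measurements, which is the point of calibrating against a known standard.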

## Precision

While the use of "precision" is deprecated by the ISO International Vocabulary of Basic and General Terms in Metrology, one international definition is "the closeness of agreement between independent test results obtained under stipulated conditions." This definition encompasses repeatability, as "precision under repeatability conditions," and reproducibility as "precision under reproducibility conditions." Commonly, precision simply means repeatability.[2]

False precision is the refinement of a measurement to a point known to be impossible with the instrumentation available; it often creeps in when computers are not limited in the number of decimal places they display. A digital wristwatch reading 3:37 may well be precise. Unless it was synchronized with a time reference (e.g., GPS), if its display showed 3:37:09.1234, everything after the minutes may be false precision.
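One common rule of thumb for avoiding false precision is to display a result only to the decimal place justified by its uncertainty. The sketch below applies that rule; the value and uncertainty are hypothetical, and this is a simplification rather than a formal rounding procedure.

```python
import math

value = 3.14159265   # raw computed result, with many unjustified digits
uncertainty = 0.05   # the instrument is only good to about +/- 0.05

# Keep only the decimal places down to the uncertainty's leading digit.
# log10(0.05) is about -1.3, so this yields 2 decimal places.
decimals = -int(math.floor(math.log10(uncertainty)))

print(f"{value:.{decimals}f} +/- {uncertainty:.{decimals}f}")
# prints: 3.14 +/- 0.05
```

Reporting the raw value 3.14159265 here would be false precision: the extra digits convey information the measurement never contained.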

## Usage in specific fields

### Military

With respect to precision-guided munitions, a military convention has emerged in which an "accurate" weapon hits within 10 meters of its desired impact point, while a "precision" weapon hits within 3 meters.[3]

## References

1. Taylor, Barry N. & Chris E. Kuyatt (1994), D: Clarification and Additional Guidance, Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results, U.S. National Institute of Standards and Technology, NIST Technical Note 1297
2. International Organization for Standardization (1993), Statistics - Vocabulary and symbols - Part 1: Probability and general statistical terms, ISO 3534-1:1993
3. Sine, Jack (Spring 2006), "Defining the “Precision Weapon” in Effects-Based Terms", Air & Space Power Journal