These are not exactly how I would describe them -
Accuracy - the degree to which the measured value matches the true value.
Precision - the resolution of the measurement, i.e. the number of
significant digits; precision must be greater than or equal to accuracy.
Repeatability - the degree to which successive measurements of the exact
same object return the same value.
So, my understanding of precision seems to differ from yours.
Ok, here is what NIST (National Institute of Standards and Technology)
has to say about it, and I would tend to put more faith in their
interpretation than in other Usenet sources.
D.1.1.1 accuracy of measurement [VIM 3.5] closeness of the agreement
between the result of a measurement and the value of the measurand
"Accuracy" is a qualitative concept.
The term "precision" should not be used for "accuracy."
TN 1297 Comments:
1. The phrase "a true value of the measurand" (or sometimes simply "a
true value"), which is used in the VIM definition of this and other
terms, has been replaced here and elsewhere with the phrase "the value
of the measurand." This has been done to reflect the view of the Guide,
which we share, that "a true value of a measurand" is simply the value
of the measurand. (See subclause D.3.5 of the Guide for further discussion.)
2. Because "accuracy" is a qualitative concept, one should not use it
quantitatively, that is, associate numbers with it; numbers should be
associated with measures of uncertainty instead. Thus one may write
"the standard uncertainty is 2 µΩ" but not "the accuracy is 2 µΩ."
4. The VIM does not give a definition for "precision" because of the
many definitions that exist for this word. For a discussion of
precision, see subsection D.1.2.
D.1.1.2 repeatability (of results of measurements) [VIM 3.6] closeness
of the agreement between the results of successive measurements of the
same measurand carried out under the same conditions of measurement.
D.1.1.3 reproducibility (of results of measurements) [VIM 3.7]
closeness of the agreement between the results of measurements of the
same measurand carried out under changed conditions of measurement
D.1.2 As indicated in subsection D.1.1.1, TN 1297 comment 4, the VIM
does not give a definition for the word "precision." However, ISO
3534-1 [D.2] defines precision to mean "the closeness of agreement
between independent test results obtained under stipulated conditions."
Further, it views the concept of precision as encompassing both
repeatability and reproducibility (see subsections D.1.1.2 and D.1.1.3)
since it defines repeatability as "precision under repeatability
conditions," and reproducibility as "precision under reproducibility
conditions." Nevertheless, precision is often taken to mean simply
==================================================================
Note the very last sentence.
Well, let's take a look at what some other sources have to say.
If you repeat a measurement several times on the same parameter over
the period of measurement, you may get a series of readings that differ
from each other. The cause may be small differences in how you use the
instrument each time. The differences could also be due to random
changes in the instrument, and they could be due to small changes in
the parameter you are measuring. Whatever the cause, you would be
inclined to take the average of the readings as the best value you can
quote or use. You can get an idea of the variability from the range of
values obtained, i.e. the difference between the largest and smallest
reading, but a better measure is likely to be the variance of the
readings. Variance, var, is a statistical measure obtained by calculating
the average of the squared deviations of the readings from their mean:

    var = sum((x_i - mean)^2) / (n - 1)

for n readings x_1 ... x_n (the n - 1 divisor is the usual sample form).
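Here is a quick sketch of those statistics in Python (the reading values
are made up, just for illustration):

    import statistics

    # Hypothetical repeated readings of the same measurand (made-up values)
    readings = [9.98, 10.02, 10.01, 9.99, 10.00, 10.03]

    mean = statistics.mean(readings)        # best single value to quote
    spread = max(readings) - min(readings)  # range: crude measure of variability
    var = statistics.variance(readings)     # sample variance, n - 1 divisor

    print(f"mean = {mean:.4f}, range = {spread:.4f}, variance = {var:.6f}")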
Accurate means "capable of providing a correct reading or
measurement." In physical science it means 'correct'. A measurement is
accurate if it correctly reflects the size of the thing being measured.
Precise means "exact, as in performance, execution, or amount." In
physical science it means "repeatable, reliable, getting the same
measurement each time."
Accuracy refers to the agreement between a measurement and the true or
correct value. If a clock strikes twelve when the sun is exactly
overhead, the clock is said to be accurate. The measurement of the
clock (twelve) and the phenomenon it is meant to measure (the sun
located at zenith) are in agreement. Accuracy cannot be discussed
meaningfully unless the true value is known or is knowable. (Note: The
true value of a measurement can never be known. Read more about this.)
Accuracy refers to the agreement of the measurement and the true value
and does not tell you about the quality of the instrument. The
instrument may be of high quality and still disagree with the true
value. In the example above it was assumed that the purpose of the
clock is to measure the location of the sun as it appears to move
across the sky. However, in our system of time zones the sun is
directly overhead at twelve o'clock only if you are at the center of
the time zone. If you are at the eastern edge of the time zone the sun
is directly overhead around 11:30, while at the western edge the sun is
directly overhead at around 12:30. So at either edge the twelve o'clock
reading does not agree with the phenomenon of the sun being at the local
zenith, and we might complain that the clock is not accurate. Here the
accuracy of the clock reading is affected by our system of time zones
rather than by any defect of the clock.
In the case of time zones, however, clocks measure something slightly
more abstract than the location of the sun. We define the clock at the
center of the time zone to be correct if it matches the sun; we then
define all the other clocks in that time zone to be correct if they
match the central clock. Thus a clock at the eastern edge of a time
zone that reads 11:30 when the sun is overhead would still be accurate
since it agrees with the central clock. A clock that read 12:00 would
not be accurate at that time. The idea to get used to here is that
accuracy only refers to the agreement between the measured value and
the expected value and that this may or may not say something about the
quality of the measuring instrument. A stopped clock is accurate at
least once each day.
Precision refers to the repeatability of measurement. It does not
require us to know the correct or true value. If each day for several
years a clock reads exactly 10:17 AM when the sun is at the zenith,
this clock is very precise. Since there are more than thirty million
seconds in a year this device is more precise than one part in one
million! That is a very fine clock indeed! You should take note here
that we do not need to consider the complications of edges of time
zones to decide that this is a good clock. The true meaning of noon is
not important because we only care that the clock is giving a
consistent reading.
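A rough back-of-envelope check of that figure (the 30-second spread is
my own assumption, not something stated in the post):

    # Rough check of the "one part in a million" precision claim.
    # Assumption (not in the original post): the clock's reading repeats
    # to within about 30 seconds over a whole year.
    seconds_per_year = 365.25 * 24 * 3600     # about 3.16e7 seconds
    assumed_spread = 30.0                     # assumed repeatability, in seconds
    print(assumed_spread / seconds_per_year)  # ~9.5e-7, i.e. better than 1e-6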
The precision of an instrument reflects the number of significant
digits in a reading;
The accuracy of an instrument reflects how close the reading is to the
'true' value measured.
======================================================================
This last one above seems to support your perspective.
Well, it seems that there is no universal agreement, and further, that the
meaning differs depending on whether you are referring to measurements or
to something else. You can say the time is precisely 3:00 - that has a
resolution of one minute, a precision of one minute - but if the actual
time is 6:35, its accuracy is only 3.5 hours. This is illustrated by the
classic problem of excess decimal places.
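A tiny sketch of that 3:00-versus-6:35 point, with the times expressed
in hours:

    # Fine resolution (one minute) does not imply good accuracy.
    reading = 3.0                  # the stated time, in hours
    actual = 6 + 35 / 60           # the actual time, 6:35, in hours
    resolution = 1 / 60            # one-minute resolution, in hours
    error = abs(actual - reading)  # about 3.58 hours off
    print(f"resolution = {resolution:.4f} h, error = {error:.2f} h")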
Accuracy refers to the closeness between measurements
and their expectations ("true" values). The farther a measurement is
from its expected value, the less accurate it is.
Precision pertains to the closeness to one another of a set of
repeated observations of a random variable. Thus, if such observations
are closely clustered together, then they are said
to have been obtained with high precision.
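That distinction is easy to demonstrate with a small simulation (a
sketch; the offset and scatter values are made up): an instrument can be
precise yet inaccurate if its readings cluster tightly around the wrong
value.

    import random
    import statistics

    random.seed(1)
    true_value = 10.0  # assumed "true" value of the measurand

    # Made-up instrument: tiny random scatter (high precision) but a
    # constant +0.5 offset (poor accuracy).
    readings = [true_value + 0.5 + random.gauss(0, 0.01) for _ in range(100)]

    bias = statistics.mean(readings) - true_value  # accuracy: distance from truth
    spread = statistics.stdev(readings)            # precision: cluster tightness
    print(f"bias = {bias:.3f}, spread = {spread:.3f}")  # ~0.500 vs ~0.010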
Well, from subsequent comments, that may be good English, but to us
Yobbo high school graduates, it requires too much thought to understand.
From my (admittedly limited) engineering trade lessons, this is what I
was taught:
Accuracy - the ability to measure or layout to the tolerances needed
for the job.
Precision - the ability to work to those measurements and tolerances.
Maybe my memory is fading, but I seem to recall, back in the good old days
of slide rules, that we used the word "precision" to mean "resolution" as
you define it. I tend to think of resolution as something lenses have, not
micrometers, but I can be comfortable with it meaning "precision".
Unfortunately, I have seen these two terms misused even in advertising
for otherwise good products. I think firms for which these terms are
important ought to have technical people approve their ads, but I
guess that is too much to hope for in today's business world.
And here I thought that the target illustration in BottleBob's first
post was clear and self-explanatory!
I'd used that analogy in my work measurement class and the students
seemed to grasp the idea.
Silly ol' me!