On Thu, 11 Jan 2007 17:53:50 -0800, the renowned "Stupendous Man" wrote:
Resolution is how closely you can read it (resolve the reading). In
the case of a digital micrometer, a resolution of 0.00015" means that
the display shows increments of one and a half tenths of a thou.
Typically that means you can detect changes of that amount
("repeatability"), at least with everything else constant
(temperature, battery condition, etc.), but it is not guaranteed by that spec alone.
OTOH, accuracy means that (under the specified conditions) you should
be able to glomp it onto an accurate gage block and it should read
within half a thou of the correct reading (usually give or take the
resolution, per the fine print, and typically with the micrometer freshly zeroed).
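To put rough numbers on that, here is a little Python sketch. The 0.00015" step and 0.0005" accuracy figures are the ones under discussion; the measured size is made up:

    # Resolution: the display only shows integer multiples of the step.
    # Accuracy: the true size can still be off by up to this much.
    RESOLUTION = 0.00015   # display step, inches (the spec discussed above)
    ACCURACY   = 0.0005    # max error against a gage block, inches

    def displayed(true_size):
        # Quantize to the nearest displayable step.
        return round(true_size / RESOLUTION) * RESOLUTION

    reading = displayed(0.63080)           # a made-up part size
    print(f"display: {reading:.5f}")       # 0.63075 -- the nearest step
    print(f"true size within {reading - ACCURACY:.5f} .. {reading + ACCURACY:.5f}")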
"it's the network..." "The Journey is the reward"
email@example.com Info for manufacturers: http://www.trexon.com
No, it measures differences in size (I think). You put in a known size
close to what you need (like gage blocks), note the position of the dials,
then adjust the jaws to fit the part you are measuring, and see
how much you had to move.
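If it helps, the arithmetic behind that comparative method is just this (a Python sketch; the names and example numbers are mine):

    def part_size(master_size, dial_at_master, dial_at_part):
        # Only the *difference* from the known master is read off the dials.
        return master_size + (dial_at_part - dial_at_master)

    # e.g. a 2.0000" gage block; dial reads 0.0000 on it, +0.0012 on the part:
    print(f"{part_size(2.0000, 0.0000, 0.0012):.4f}")   # 2.0012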
This thing weighs 110 lbs.
I'd hate to use it. Just the heat from your hands while handling the frame will
knock it so far out of kilter that the measurements would be useless. It takes a
lot of experience and a lot of patience to get accuracy from an instrument like that.
That's an analog amplifier for the linear gage. The micrometer makes a nice
paperweight without it in working order. Actually, it's still useful
without it functioning, but for repetitive high-accuracy measurements you
need it. You would set a master in the micrometer, then zero out the gage.
Now you can quickly take +/- comparative measurements of workpieces.
With a functioning linear gage you can take comparative readings to
whatever resolution/accuracy/repeatability the linear gage offers -- usually 20
millionths, though more accurate ones are available for retrofit.
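Something like this, in Python (the 20-millionths step is from above; the raw readings are invented):

    GAGE_STEP = 0.000020   # 20 millionths of an inch

    def deviation(gage_reading, zero_on_master):
        # Report +/- from the master, quantized to the gage's step.
        dev = gage_reading - zero_on_master
        return round(dev / GAGE_STEP) * GAGE_STEP

    zero = 0.000137   # whatever the gage happened to show on the master
    for raw in (0.000137, 0.000179, 0.000088):
        print(f"{deviation(raw, zero):+.6f}")   # +0.000000, +0.000040, -0.000040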
At those resolutions the flatness and parallelism of the anvils is an
issue. You would need to check those with an optical flat and a
monochromatic light source.
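FWIW, the fringe arithmetic: each band of curvature you see under the flat represents half a wavelength of the light source. A quick sketch, assuming a helium lamp (swap in your source's wavelength if it differs):

    HELIUM_WAVELENGTH_IN = 0.0000232   # helium line, ~587.6 nm, in inches

    def flatness_error(bands_of_curvature):
        # One band = half a wavelength of height variation.
        return bands_of_curvature * HELIUM_WAVELENGTH_IN / 2

    print(f"{flatness_error(2):.7f} in")   # 0.0000232 in, about 23 millionths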
Then of course you need a very expensive set of master gage blocks with
which to set it. Figure on those costing nearly the equivalent of the
micrometer itself if you want to cover the 12" range. Plus they
degrade every time you use them, no matter how well you take care of them.
That reads motion of the anvil at whatever resolution is
selected by a switch on it -- or at a fixed sensitivity. The
principle is that you set it to a standard, zero the scale, and then
it will show you how far over or under the desired size you are. I
don't know the resolution on that one -- but it is probably good enough
to display a microinch of difference.
P.S. My news server seems to be getting better, but I'm still losing
some articles -- so if you really need to communicate with me,
send me e-mail. :-)
You have three main factors in metrology:
(There are more, of course.)
Resolution is how many graduations your "measuring stick" has.
Accuracy is how precise the graduations of your "measuring stick" are.
Repeatability is the spread of readings you get when you repeatedly
measure the same thing.
A pictorial example, with a little simulation after it:
Take a rubber band and draw lines on it every inch by eyeballing.
The resolution will be one inch.
The accuracy will be as good as your eyeballing was when you drew the lines.
The repeatability will be very bad, because the band stretches more or less,
depending on the force used when measuring.
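You can fake that rubber-band ruler in a few lines of Python to see the three factors separately (all the numbers here are invented):

    import random

    RESOLUTION = 1.0   # one-inch graduations
    BIAS       = 0.3   # the eyeballed marks are ~0.3" off (accuracy)
    STRETCH    = 0.5   # up to +/-0.5" of stretch per use (repeatability)

    def measure(true_length):
        stretched = true_length + BIAS + random.uniform(-STRETCH, STRETCH)
        return round(stretched / RESOLUTION) * RESOLUTION

    print([measure(10.0) for _ in range(5)])   # e.g. [10.0, 11.0, 10.0, 11.0, 10.0]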
Ahhhh... the old "precision vs. accuracy" thing...
Many years ago, I had a cheap vernier caliper that could measure to the
nearest .01 in. That's the precision part. But whoever made the stupid
thing obviously had no concept of the English system... If you measured a
1" piece, it would claim it was about .9 inches. That's the accuracy part.
When you say an instrument has, in your case, .00015 resolution (precision),
it means that the reading you get is some integer multiple of .00015. In
other words, the reading changes in .00015 steps. So, let's say you measure
something and it reads: .63075. An accuracy of .0005 means that what you
are measuring is actually somewhere between .63025 and .63125.
When a device has a high degree of resolution, it can often give you a false
sense of accuracy.
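Here's that bad caliper in sketch form (Python; the ~10% scale error is just picked to match the 1" -> .9" story):

    STEP        = 0.01   # resolves to the nearest .01" (precision)
    SCALE_ERROR = 0.9    # reads about 10% short (assumed, for the story)

    def bad_caliper(true_size):
        # Precise -- fine, repeatable steps -- but not accurate.
        return round(true_size * SCALE_ERROR / STEP) * STEP

    print(f"{bad_caliper(1.000):.2f}")   # 0.90, every single time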
Thanks guys. That clears it up.
When I worked as a machinist in a shipyard, I once measured 34-inch propeller
shafts at the 9 bearings. That was interesting. It took us 2 days to get
them all uncovered, then we jacked up the shaft and removed the lower bearing
with a winch, then miked and re-assembled. It took the better part of a week
to prove that after 30 years of service they were fine. The guy and mobile
machine that bored the new plastic "barrel" tailbushing after the seal was