Sounds like you have to turn yourself into a cal lab. Have two indicators. Measure with one and see what the other says. If it matches, try the extremes of the scale and come back to the central region. If the TBD version isn't matching, slightly flex the pointer to see where it does; the length of the next version is then that of the first plus the air gap. Test, test, test.
Probably the best way is by trial measurement on something, comparing the reading with another indicator you know is accurate. Do it this way. Put any size lever on your Lufkin. Chuck up something round in a 4-jaw chuck on a lathe and deliberately dial it in so it has about 0.025 inch TIR according to the known-good indicator. Now measure with the Lufkin. Say you get 0.040 inch. The trial lever length is too short, as the indicated Lufkin reading is too big. By simple ratioing, the true length should be (0.040/0.025) x (trial lever length).
If the Lufkin reading is less than the true TIR, the lever is too long and you need to shorten it. Let's say you get a 0.015 reading on the Lufkin. By the same ratio, the true length should be (0.015/0.025) x (trial lever length).
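A quick sketch of that ratio method in Python, if anyone wants to check the arithmetic (the function name and the 1" trial length are just for illustration):

```python
def corrected_lever_length(trial_length, indicated, true_tir):
    """Estimate the correct lever length from a trial measurement.
    The reading scales inversely with lever length, so a too-short
    lever over-reads and a too-long lever under-reads:
        correct_length = trial_length * (indicated / true_tir)
    """
    return trial_length * (indicated / true_tir)

# True TIR 0.025" (set with the known-good indicator), Lufkin reads 0.040":
# the trial lever was too short; the correct length is 1.6x the trial length.
print(round(corrected_lever_length(1.0, 0.040, 0.025), 3))  # 1.6

# Under-reading case: Lufkin reads 0.015", so the lever needs shortening.
print(round(corrected_lever_length(1.0, 0.015, 0.025), 3))  # 0.6
```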
When doing the measurements, the more TIR on the test part the better, as it will minimize reading errors, etc.
Trying to make a new lever from scratch would be a real challenge, I think. The threads are DINKY on those levers. You might want to consider calling Lufkin and asking if they can supply the correct size. Good luck, Gunner.
Have you tried a Lufkin catalog or web site? If it's anything like my B&S, the part number on the indicator (I hope there is one) should be specific to the length of stem it came with. Knowing that, you should be able to determine which one it needs. If I'm really lazy, I just dial
1-800-734-9099 and ask for Mike. I tell him what I've got, and he looks it up in his vast library of resources. Perhaps he can do the same for you. If he can find it, he can also probably get it for you, delivered right to your door. Visa and Mastercard gladly accepted.
Agreed. For comparison use it doesn't matter. I try to avoid taking readings from the DTI face if at all possible. Work to zero and take the reading from the height gauge/machine readout. If you must read from the dial keep the stylus arm as straight to the indicator body as you can to avoid cosine error. More angle=more error.
Don't you know that the best way to get lots of posts in answer to a question is to post a wrong answer? :-)
I didn't weigh in -- but only because I saw that several others had already followed up, and a quick check showed that they had already entered the correction. (One advantage of being late catching up on the newsgroup. :-)
Note that some makers design so the proper reading is given at a specific angle -- not necessarily 90 or 0 degrees to the work surface. An angle of 60 degrees will give a 2:1 motion modification -- and will hold the pivot point clear of the work surface (if you are dealing with a large radius), so some makers design for it to be correct at such an angle. All the more reason for you to chase up an old Lufkin catalog and find out both what length and what angle you need to use.
Or you could set it up on a stable surface, then slide a piece of shim stock under it to deflect it a known amount. This will show you how many marks it goes for, say, 0.001" of pointer motion -- and you can try different angles to find one which works well. (You could probably even find a combination of angles and pointer length to give your readings in Metric units, should you so desire.)
Yeah, but not too often. So it's noteworthy. And just shows you that we all read your posts!
Anyway, I'm sorry I did write, because I had 367 incoming rcm posts to read, and Gunner's and your replies were early on, and I wrote then. I see after I got through more than about 50 of them that you did get quite a few "comments" on what happened.
Brian Lawson, Bothwell, Ontario. XXXXXXXXXXXXXXXXXXX
Hee hee. I joke about this, but it's true. You could win the nobel prize and nobody'd hear about it, but walk around for five minutes with my pants zipper down, and the pa system here cranks up with "now hear this!!"
================================================== please reply to: JRR(zero) at yktvmv (dot) vnet (dot) ibm (dot) com ==================================================
This is true. Interapid test indicators seem to be one brand that is most accurate when the contact point is at 12 degrees rather than parallel to the work surface.
An excerpt from the above site: =============================================== Test indicators can also be calibrated on a surface plate using certified gage blocks. The indicator is securely fastened to a stand and the contact point is brought in contact with a gage block of a given size. The contact point must be parallel with the surface of the block for most manufacturers. Interapid test indicators are an exception and should be at a 12-degree angle, approximately. ================================================
Here is a cosine error chart from the same site with a correction factor to be used for various indicator tip angles. (If it doesn't copy correctly just go to the site)
================================================ About the cosine error: for test indicators excluding Interapid models. If the contact point can not be kept parallel to the work surface then you will have to make a mathematical adjustment to the dial reading.
contact point angle    correction factor
       10°                  0.98
       20°                  0.94
       30°                  0.87
       40°                  0.77
       50°                  0.64
       60°                  0.50
From this chart you will notice that a contact point held at a 60-degree angle results in one-half the dial reading. Once you determine the angle, simply multiply the dial reading by the corresponding correction factor. For example, an indicator reading of .0085" at an angle of 30 degrees is equivalent to .0085" x .87 = .0074" ===============================================
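Since the correction factor is just the cosine of the contact-point angle, the whole chart collapses to one line of Python (a sketch of the chart's rule, not any manufacturer's formula):

```python
import math

def corrected_reading(dial_reading, tip_angle_deg):
    """Cosine-error correction for a lever-type test indicator:
    true movement = dial reading x cos(contact point angle),
    for indicators meant to be used with the point parallel to the work."""
    return dial_reading * math.cos(math.radians(tip_angle_deg))

# The excerpt's example: .0085" read with the tip at 30 degrees
print(round(corrected_reading(0.0085, 30), 4))  # 0.0074

# And the 60-degree row of the chart: factor 0.50, i.e. half the dial reading
print(round(corrected_reading(0.010, 60), 4))  # 0.005
```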
Some indicators used to be made with involute "balls" (sort of pear shaped tips) that automatically corrected for minor cosine error. But I haven't seen any in years.
Using shim stock to try and calibrate your indicator is probably not such a hot idea. Especially with a .0001 indicator. Shim stock not only varies in thickness but can be bent, or burred. That's what Jo blocks and Height Masters are for. If you don't have a Height Master or Jo blocks then in a pinch you could use pin gages. Mic them first to see what they actually measure.
O.K. I knew that I had encountered at least one such brand.
[ ... ]
[ ... ]
That is simply the cosine of the angle. Something which can be looked up in tables, or with a scientific calculator, had at the press of a button -- to finer steps than are given above. (Though getting that close a measurement of the angle of the tip is a bit problematical. :-)
[ ... ]
I suspect that they wore faster than the spherical ones, and were more expensive to replace.
But -- shim stocks (or feeler gauges) might be present in a shop which doesn't have a set of Jo blocks (or equivalent), though a cheap Chinese set would be good enough for the purpose.
Ideally, the shim or feeler gauge would be placed between a ball contact and the ball on the indicator, to eliminate the problem with bent or burred examples.
And all things considered (including the cosine errors), you really don't want to be depending on such tools for truly accurate measurements, anyway. A plunger style would be better, as it has no cosine error if set up square to the surface being measured.
Shim stock would be good enough to get an order-of-magnitude indication of the sensitivity of the indicator -- for such things as telling whether you have the right length feeler arm on the indicator.
Again -- I doubt that you would be using that style of indicator for precision measurements -- unless the access to the point to be measured precludes anything else.
I would agree to the use of Jo blocks for checking a plunger style indicator -- especially a tenths-reading one (or more sensitive, if you are really lucky.)
Sorry to burst your gunnerinios, Gunner; but Tom's not correct. The point length does matter. What an indicator actually measures is angles. The angular deflection of the pointer touching the work is magnified by the gears into a larger angle when the needle moves around the dial. The linear distance (tenths, thousandths, or whatever) that you read is just a convenient conversion from degrees, which is accomplished by making the marks on the face of the indicator match the angles produced by the pointer and gears.
Here's a real simple example: Think of an indicator on a surface gauge, with its pointer arranged to be perfectly horizontal, while touching the top of a stack of gauge blocks. The pointer is exactly 1" long, from the point where its ball touches the gauge blocks to the center of the little axle it rotates on. To move the pointer one degree, you'd need to change the stack of gauge blocks by 17.5 thousandths.
0.017452 inches, actually. That's the sine of one degree. If the gears in the indicator multiply the angle by, say, a factor of ten, then 17.5 thousandths on the gauge blocks, which produces one degree of motion on the pointer, will become ten degrees of needle motion on the face of the indicator. In order for the indicator to work properly, the marks on its face have to be arranged so that 10 degrees equals 17.5 thousandths when you read it. Or, the .001" marks on the indicator need to be exactly .5714 degrees apart.
If this same indicator had a 1/2 inch pointer, and you changed the gauge blocks by 17.5 thousandths, that would create an angular motion in the pointer of 2.006 degrees, instead of one degree. When the gears multiplied it by ten, this would become 20.06 degrees of needle motion on the indicator face. If the marks on the face are still .5714 degrees apart, then 20.06 degrees would look like about 35 thousandths, even though the gauge blocks had only changed by 17.5. Bad news, of course.
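The geometry in those two paragraphs is easy to check with a short script. The 10:1 gear factor and the 1-inch design length are the example's assumptions, not real Lufkin numbers:

```python
import math

GEAR_FACTOR = 10       # pointer angle to needle angle multiplication (example value)
MARK_SPACING = 0.5714  # dial-face degrees per .001" mark, laid out for a 1" pointer

def dial_thousandths(block_rise, pointer_length):
    """Needle reading, in .001" dial marks, for a given gauge-block change
    (inches) with a pointer of the given length (inches)."""
    pointer_deg = math.degrees(math.asin(block_rise / pointer_length))
    needle_deg = GEAR_FACTOR * pointer_deg
    return needle_deg / MARK_SPACING

# Correct 1" pointer: 17.5 thousandths on the blocks reads ~17.5 on the dial.
print(round(dial_thousandths(0.0175, 1.0), 1))  # 17.5

# Half-length pointer: the same block change reads roughly double.
print(round(dial_thousandths(0.0175, 0.5), 1))  # 35.1
```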
For some kinds of measurements, you don't really care about any of the marks on the indicator face except the zero. Concentricity can be one of these. Whether the indicator reads accurately as it moves isn't really an issue. All you care about is that it repeats to one particular position. In that case, pointer length doesn't matter as much; but it still affects (multiplies or divides) the sensitivity of the indicator. Unless your readings really ARE zero, you won't be able to tell precisely what they mean. With too short a pointer, the needle will move too much in response to even the slightest provocation. If the pointer's too long, the needle won't move enough to tell you whether you're really as close to zero as you want to be.
The other factor, which makes all of this a little less perfect than it might appear, is cosine error. As I'm sure most folks here are aware, it's often pretty tough to set up your indicator as described above, with the pointer exactly parallel to the surface plate, or tangent to the diameter you're measuring, or whatever. That affects the accuracy of measurement substantially. And, the effect grows trigonometrically as you try to read larger and larger values on the indicator dial. It might not mean much when you're looking at a difference on the dial between zero and .001; but it can mean a lot when the needle needs to move from zero to .010 or beyond.
You can calibrate your indicator, no matter what the pointer length is, as long as you plan to use it only for concentricity, or for measurements that involve only small movements of the needle.
Mount the indicator on a surface gauge. Set it up so the pointer is as close to horizontal as you can get it, while zeroed on a stack of good gauge blocks. Then carefully increase and decrease the height of the gauge blocks by increments. Say, plus a thousandth, then minus a thousandth. Then plus two, then minus two, etc. Write down the results of each measurement, including the gauge block changes AND the actual needle readings you get from the indicator dial. These will produce a kind of "conversion table" for the indicator with any particular pointer. If each .001 change on the blocks gets you a .0015 change on the indicator dial, for example, then you can use the indicator accurately (over small ranges of motion) just by converting with a factor of 1.5.
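That conversion-table bookkeeping can be sketched in a few lines of Python. The logged numbers below are made up to match the 1.5-factor example above:

```python
def conversion_factor(block_changes, dial_readings):
    """Average dial change per unit of true (gauge block) change,
    from pairs of logged measurements."""
    ratios = [d / b for b, d in zip(block_changes, dial_readings)]
    return sum(ratios) / len(ratios)

# Hypothetical log: each .001" block change moved the dial .0015"
blocks = [0.001, -0.001, 0.002, -0.002]
dial   = [0.0015, -0.0015, 0.0030, -0.0030]

k = conversion_factor(blocks, dial)
print(round(k, 6))           # 1.5
print(round(0.0045 / k, 6))  # 0.003 -- the true movement behind a .0045" reading
```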