I recall a thread some time ago claiming that, in normal usage, US decimal measurement was more "accurate" and led to tighter tolerances under normal machining usage.
The subject came up on another group, and I couldn't find the thread on Google, so I'll ask here.
As I recall, the example was an engine being made by US and Euro manufacturers (aircraft?), and the US engine was "tighter" than the Euro one. As I recall, perhaps incorrectly, this was due to rounding, or to the smaller increments normally used in the US decimal system.
I'm prepared to be wrong on this one.
"Pax Americana is a philosophy. Hardly an empire. Making sure other people play nice and dont kill each other (and us) off in job lots is hardly empire building, particularly when you give them self determination under "play nice" rules.
Think of it as having your older brother knock the shit out of you for torturing the cat." Gunner
Gunner wrote in news: email@example.com:
I work in both metric and imperial (inch) on a regular basis and I would have to say there is no difference in the accuracy or repeatability that can be achieved.
I've heard similar to what you're saying and I can see where that assumption might come from. A 0-1" micrometer is probably the most used precision measuring device. With a conventional micrometer it is possible to read .0001" off of the thimble. Reading .001mm off of a conventional micrometer is not practical as the increment is too small. Most metric mikes that are the equivalent of a "tenths" mike use .002mm as the smallest increment.
Even at .002mm it can be hard to read, especially on a skinny-thimbled mike like a Starrett. In the end, if you really need to measure to .0001" or .002mm, a conventional micrometer is not even close to being the best way to measure. But you can see where someone might say the .001mm (1 micron) is too small of an increment to be practical. They would be wrong, but you can see where the assumption comes from.
Nowadays the micrometer being used is likely to be digital anyway and will generally offer the same accuracy in either inch or metric mode.
In terms of tolerance, is there really any significant difference between a +.000/-.0003" and a plus nothing minus 8 micron tolerance? If the 15 millionth of an inch difference was significant enough the tolerance could always be specified as plus nothing minus 7.6 microns.
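The arithmetic behind that comparison is easy to check (the .0003" and 8 micron figures are from the post above; the 25.4 conversion is exact by definition):

```python
MM_PER_INCH = 25.4  # exact: 1 inch is defined as 25.4 mm

tol_inch = 0.0003                            # the -.0003" tolerance band
tol_micron = tol_inch * MM_PER_INCH * 1000   # same band in microns
print(tol_micron)                            # 7.62 microns

# difference between a 7.62 and an 8 micron band, back in inches
diff_inch = (8 - tol_micron) / 1000 / MM_PER_INCH
print(round(diff_inch * 1e6, 1))             # ~15 millionths of an inch
```

Which is where the "plus nothing minus 7.6 microns" figure comes from.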
In a CNC shop environment there is no significant advantage of one system over the other in terms of accuracy.
As far as which system is better in general, there are trade-offs to each system. The imperial system has been around longer and evolved with civilization. So it's based on a human scale for the most part. The metric system was developed based on the need to have common transferable units in science. So the scale isn't always convenient but the system is uniform and simple. IOW, you can easily figure out how many microns are in a kilometer. Try figuring out how many ten thousandths of an inch are in a mile.
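The two conversions contrasted above work out like this (simple arithmetic, nothing assumed beyond the standard definitions of the units):

```python
# Metric: pure powers of ten
microns_per_km = 1_000 * 1_000_000        # 1000 m per km, 1e6 microns per m
print(microns_per_km)                     # 1000000000

# Imperial: mixed conversion factors
tenths_per_mile = 5280 * 12 * 10_000      # ft per mile, in per ft, tenths per in
print(tenths_per_mile)                    # 633600000
```

One is a count of zeros; the other needs a calculator.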
I like the system of fits that is employed in the metric system. If you have a shaft that needs a certain fit into a hole, there are simple standards for that in the metric system. On a drawing you simply give the dimension and the fit, for example 5mm h8 or 45mm h8. The actual tolerance is larger for the larger diameter, as it should be. And it's all standardized.
The downside to metric is the lousy screw thread system it uses. More often than not the "perfect" solution doesn't exist, so a designer ends up making some sort of compromise, as opposed to the inch system, which has far more options that are "standards".
As far as scale goes, when dealing with a small dimension, metric is easier as you are usually dealing with whole numbers rather than decimal amounts. Metric gets out of hand in a hurry as the parts become large. So neither system makes the math easier all the way around.
In terms of quality of product produced, be it a jet engine or a car, that has more to do with the abilities and discipline of the respective companies.
And of course the price. There are no free lunches.
I was arguing (well, more of a discussion) with Gunner, who said that imperial measures were more accurate than metric, to which I replied that it didn't matter, as a standard of measurement is a standard of measurement no matter what you use.
However, I think you need to use one or the other, and I think this is where he's getting his "imperial tolerances are closer" bit from. You can't make a shaft in imperial measurement and a bearing in metric and still expect the tolerances to be the same as if both components were made using the same standard of measurement, be it metric or imperial. I think this is what Gunner was referring to with his rounding errors.
You're right about metric threads though; they're nothing more than a compromise. In imperial (well, at least here in the UK) we have dozens of different threads for different applications. Metric has, well, just metric really.
A further point on tolerances: some customers haven't a bleedin' clue. I used to cut beams and tubing for steel-structured buildings. Some of these pieces could be as much as 12 metres long, and they wanted them cut to +/- 2mm. They'd all be bang on in the heated workshop, but by the time the customer received them they'd been sat on the back of a truck in the snow for a week and could be anything up to half an inch short due to thermal contraction.
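A rough check of the shrinkage on that beam, assuming mild steel (the expansion coefficient and the 40 degree temperature swing are textbook assumptions of mine, not from the post):

```python
alpha = 12e-6        # linear expansion coefficient of steel, per degC (typical)
length_mm = 12_000   # 12 metre beam
delta_t = 40         # e.g. ~20 C in the heated shop down to -20 C in the snow

shrink_mm = alpha * length_mm * delta_t
print(round(shrink_mm, 2))   # 5.76 mm of contraction
```

Several millimetres either way, so a +/- 2mm tolerance on a 12 metre beam is meaningless unless you also specify the temperature it's measured at.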
You'd better be. Tell me, what is this thing that you call "US decimal measurement"? By that do you mean using powers of 10 as a multiplier on U.S. customary units? Because if you do, we already have a name for that: it is called United States customary units (customary system, customary units, or customary weights and measures), and decimal positional notation has been around for more than 2,000 years in the Hindu numeral system. Apart from that, the U.S. measurement system and the conversion factors involved are anything but decimal. I would really like to see how one system of units is more "accurate" than the other without some grave error or idiocy being involved.
> the system is uniform and simple. IOW, you can easily figure out how
33.3 centimeters. No calculator required.
Quick - you build a tank 1 yard by 1 yard by 1 yard. How many gallons does it hold?
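Working that one out takes more than mental arithmetic (the 231 cubic inch US gallon and the 36 inch yard are standard definitions):

```python
cubic_inches = 36 ** 3          # 1 yard = 36 inches, so 1 yd^3 in cubic inches
gallons = cubic_inches / 231    # US gallon is defined as 231 cubic inches
print(cubic_inches)             # 46656
print(round(gallons, 1))        # 202.0
```

A 1 metre cube, by contrast, holds 1000 litres by inspection.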
Give me the distance between two points. You can't because I didn't tell you whether I wanted the answer in decimal inches, fractional inches, furlongs, miles (statute or nautical), hands, etc.
Give me the volume of an object. You can't because I didn't tell you whether I wanted the answer in cubic inches, cubic feet, gallons (standard or imperial), barrels (several standard sizes), kegs (also several standard sizes), bushels, etc.
Same with weight.
As soon as I get the flux capacitor fixed in my DeLorean I'm going back 200 years to beat the hell out of every member of Congress who says he is going to vote to keep us on the English system of measurements.
Ah, but try to find the .33 mark on your tape measure.
I do some recreational woodworking, and one thing that bugs me about metric is that it's difficult to break something into odd units. Sure, 10 is evenly divisible by 5 or 2, but a foot can be split up rather easily into 1/2, 1/3, 1/4, and 1/6 units. Again, as Dan said, it's part of the "human scale" that makes quick and dirty construction a bit easier.
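The divisibility point is easy to see side by side (the 12-vs-10 comparison below is my illustration of the same idea):

```python
# Which small divisors split a foot (12 inches) vs a base-10 unit evenly?
print([d for d in range(2, 7) if 12 % d == 0])   # [2, 3, 4, 6]
print([d for d in range(2, 7) if 10 % d == 0])   # [2, 5]
```

Thirds and quarters fall on whole inches; a third of a decimetre never lands on a whole centimetre.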
In the 70s I was working on an early approach to flat panel TV. The US was in the throes of a big public push to go metric. We knew it would be years before this laboratory curiosity hit the market and, firmly convinced metric would kill off "English" by then, we decided to design it in metric from the get-go.
That lasted until the first drawings hit the machine shop. This was before most machine tools had CNC ability. It seems the thread pitch of the screws on the lathes/mills/whatnot was in English units - or some argument very much like that. We were told they could make what we wanted, but it would be a good deal of math and extra time for them to get there. And these guys were darn sharp.
Could it be, in your case, the design and the machine tools matched, while the metric version was made with approximations ? Just my $.02 USD.
Hey! What's this "US decimal measurement"!? All the US did was to copy Imperial measurement units from the British Empire! Any of these units can be divided by ten or whatever you like. Granted, you've made small alterations to the size of the gallon and a few other things (was that to prove that you could, or just to cause trouble & confusion?).
I'm sure Gunner's enjoying winding us all up, but I'll rise to the bait anyway. Ian
Apologies to misc.survivalism (I won't ask what that is) and others, I was aiming only at rec.crafts.metalworking. Please don't shoot me down for cross-posting; I didn't notice that I'd inherited the OP's inclusion of other groups in his search for the original thread. Regards Ian
Can't really comment on the differences between the two manufacturers in relation to metric/imperial measuring systems. All being equal, they should both have been working within a given set of tolerances, converted or not, and the product should have met those tolerances. If there were major discrepancies between the two shops, perhaps the tolerances were too loose in the first place, or perhaps one shop worked to the tolerance and the other held tighter than specified tolerances as a matter of course.
All else being considered, the smallest unit typically used on the shop floor in metric should be the hundredth of a millimetre, of which 2.54 equal the one thousandth of an inch typically used when measuring imperial (1 inch officially being defined in the US as exactly 25.4mm, IIRC).
If a typical measurement were to be given with a plus or minus 1 unit tolerance, metric should come out closer to a given dimension than imperial.
This of course completely ignores the various protocols for tolerancing based on dimensions given to a certain number of decimal places and the like, as well as the opportunities for conversion errors.
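Putting that +/- 1 unit comparison in common units makes the point (again using the exact 25.4 mm/inch definition):

```python
MM_PER_INCH = 25.4

metric_band_mm = 2 * 0.01                  # +/- 0.01 mm total band
imperial_band_mm = 2 * 0.001 * MM_PER_INCH # +/- 0.001" total band, in mm
print(metric_band_mm)    # 0.02 mm
print(imperial_band_mm)  # 0.0508 mm
```

So a one-unit metric tolerance band is about 2.5 times tighter than a one-unit imperial band at typical shop-floor resolution, which is the sense in which "plus or minus one unit" favours metric.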