I have a 1 mΩ current shunt rated for 50 A, and I have two ways of measuring its output.
I could use a µA meter with 100 Ω resistance to get a 10 µA/A reading, or a 10 MΩ input voltmeter to get a 1 mV/A reading.
In theory either works well, but which is the preferred method in a real-life application, if it matters at all?
Ammeter specifications:
500 µA F.S., 0.01 µA resolution (equivalent to 1 mA of load current)
0.25% + 20 dgt
Voltmeter specifications:
50 mV F.S., 1 µV resolution (also 1 mA of load current, but the last digit is too jumpy to be of real use)
0.1% + 20 dgt
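The resolution figures in parentheses follow from the shunt value and each meter's step size. A quick sketch of that arithmetic (all values taken from the question, exact resistances assumed):

```python
SHUNT_R = 1e-3            # 1 mOhm shunt

# 100 ohm microammeter across the shunt: 1 A of load -> 1 mV -> 10 uA through the meter
amm_sens = SHUNT_R / 100.0          # meter amps per load amp (10 uA/A)
amm_res = 0.01e-6 / amm_sens        # 0.01 uA steps -> load-current resolution
print(amm_res)                      # 0.001 A, i.e. 1 mA

# 10 Mohm voltmeter across the shunt: 1 A of load -> 1 mV directly
vm_sens = SHUNT_R                   # volts per load amp (1 mV/A)
vm_res = 1e-6 / vm_sens             # 1 uV steps -> load-current resolution
print(vm_res)                       # 0.001 A, i.e. 1 mA
```

So on paper both instruments resolve the same 1 mA of load current.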
----------

Either will work. B Foelsch suggests the voltmeter. Assuming the resistances are exact (not true):

Ammeter: at 50 A, the current in the shunt is 49.9995 A and the current in the meter is 499.995 µA. The error is negligible.

Voltmeter: at 50 A, the current in the shunt is 50.0 A as near as damn it, and the current in the meter is 0.005 µA, near as damn it.

B Foelsch is right, but on the other hand, if you want a continuously available reading, you could use the ammeter and save the voltmeter for moving around taking other measurements. Either way the error is negligible. Take your pick.
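As a sanity check, the loading figures above can be reproduced with a short script (component values from the question; exact resistances assumed):

```python
R_SHUNT = 1e-3   # 1 mOhm shunt
I_LOAD = 50.0    # full rated load current

# Ammeter case: 100 ohm microammeter in parallel with the shunt (current divider)
R_AMM = 100.0
i_meter = I_LOAD * R_SHUNT / (R_SHUNT + R_AMM)   # current diverted through the meter
i_shunt = I_LOAD - i_meter                       # current left in the shunt
print(f"ammeter: meter {i_meter * 1e6:.3f} uA, shunt {i_shunt:.4f} A")

# Voltmeter case: 10 Mohm input resistance across the shunt
R_VM = 10e6
r_par = R_SHUNT * R_VM / (R_SHUNT + R_VM)        # shunt || voltmeter
v_shunt = I_LOAD * r_par                         # voltage the voltmeter sees
i_vm = v_shunt / R_VM                            # current drawn by the voltmeter
print(f"voltmeter: reads {v_shunt * 1e3:.6f} mV, draws {i_vm * 1e6:.4f} uA")
```

The ammeter diverts 499.995 µA (leaving 49.9995 A in the shunt), and the voltmeter draws only 0.005 µA, matching the figures above.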
----------

Phats is wrong. Chances are that your ammeter is actually built as a 50 mV F.S. voltmeter (input resistance of several megohms) with an internal shunt of about 100 Ω.