Why specifically 4-20 mA?

Why on earth do we specifically use 4-20 mA? I mean, they could have used any range they wanted, e.g. 3-15 mA or 8-16 mA. I know it's a standard, and I know 20 mA isn't enough energy to create a spark to ignite hydrogen, but then, as in my example, neither would 15 mA or 16 mA. And if the 4 mA is for a live zero, then why not 3 mA or 8 mA? Any help anyone can give would be great. Thanks
Reply to
CoRpSe877

CoRpSe877 wrote:

Hello,

at first there was 0-20 mA: 100 mA is too much, 10 mA is too small, so the round value of 20 mA was chosen. For the live zero, they chose 20 %, i.e. 1/5 of 20 mA, which gives 4 mA.

Bye

Reply to
Uwe Hercksen

And what is the advantage of 4-20 mA over 0-10 V?

Reply to
Bob Watkinson

Think of the effect of connection cable resistance.

As long as the driving voltage can go high enough, a current output sensor will still give an accurate output, irrespective of the length and hence the resistance of the connecting wires. The same current flows through the load, irrespective of the voltage dropped in the wires.

A voltage output sensor would suffer voltage drops down the line, and those drops would affect the output and vary depending on line length and resistance.
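The contrast above can be sketched with some back-of-the-envelope numbers. All values here are illustrative assumptions (cable resistance, load resistor, input impedance), not from any particular installation:

```python
# Sketch: why current transmission tolerates cable resistance
# while voltage transmission does not. Numbers are assumptions.

CABLE_R = 25.0   # ohms, e.g. a long run of thin copper (assumed)
LOAD_R = 250.0   # ohms, a typical 4-20 mA receiver sense resistor

# Current mode: the transmitter forces 12 mA through the loop; the
# same 12 mA flows through the load regardless of what the cable
# drops, so the receiver reads exactly what was sent.
i_sent = 0.012                           # A
v_load = i_sent * LOAD_R                 # volts across the sense resistor
reading_mA = v_load / LOAD_R * 1000      # recovered signal
print(reading_mA)                        # 12.0 mA, unaffected by CABLE_R

# Voltage mode: a 5 V signal divides between the cable and the
# receiver's input impedance, so the receiver sees less than was sent,
# and the shortfall changes with cable length.
v_sent = 5.0
INPUT_R = 10_000.0   # ohms, receiver input impedance (assumed)
v_received = v_sent * INPUT_R / (INPUT_R + CABLE_R)
error_pct = (v_sent - v_received) / v_sent * 100
print(round(error_pct, 3))   # about 0.25 % error, varying with cable length
```

The current-mode reading is exact by construction (as long as the driving voltage has enough headroom), while the voltage-mode error scales with whatever resistance the cable happens to have that day.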

There are other reasons why one may be preferred over the other, to do with source and load impedances, noise and transient response and other factors.

Reply to
Palindr☻me

Really good explanation. And..:

Line length can be a few hundred meters, so voltage drops can be really big, with lots of noise.

4 mA is used for zero so you can detect a wire or instrument failure.
Reply to
Mladen

1) Noise immunity. A current loop is low impedance, and it is therefore very resistant to electrical noise. Excellent for long runs in industrial environments.

2) Open-circuit detection. Anything less than 4 mA is an illegal condition, and 0 mA means there is an open loop.
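The live-zero scaling and open-circuit check can be sketched in a few lines. The 3.6 mA fault threshold below is one common convention (roughly in the spirit of NAMUR NE43); treat it as an assumption, not a universal rule:

```python
def loop_to_percent(i_mA: float) -> float:
    """Convert a 4-20 mA loop reading to 0-100 % of span.

    A reading well below 4 mA is treated as a broken wire or dead
    transmitter; that is the whole point of the live zero.
    """
    if i_mA < 3.6:   # low-fault threshold, an assumed convention
        raise ValueError(f"loop fault: {i_mA} mA (open circuit?)")
    return (i_mA - 4.0) / 16.0 * 100.0

print(loop_to_percent(4.0))    # 0.0   -> process at zero, loop healthy
print(loop_to_percent(12.0))   # 50.0
print(loop_to_percent(20.0))   # 100.0
```

With a 0-20 mA range, a reading of 0 mA is ambiguous: it could mean "process at zero" or "wire cut". With 4-20 mA, 0 mA can only mean a fault.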

Ben Miller

Reply to
Ben Miller

No one is answering the OP's question. Instead, answers are simply listing the advantages of the current loop, which was not the question.

Reply to
w_tom

I've been told that at the time this standard originated, 4 mA was determined to be enough to power the instruments they were wanting to power from the loop.

Reply to
JC

Good stuff. Another reason is that some transmitters can be made self-powered from the 4-20 mA loop. A power supply in the instrument rack energizes the circuit, and the remote transmitter presents a minimum resistance to create a voltage drop. That voltage drop is used to power the entire transmitter.

So the remote transmitter needs only the two current-loop wires, no other power source.
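A quick loop-budget calculation shows how this works out in practice. All the numbers below (supply voltage, transmitter minimum voltage, sense resistor) are illustrative assumptions, not from any datasheet:

```python
# Back-of-the-envelope budget for a 2-wire (loop-powered) transmitter.
# Numbers are illustrative assumptions.

V_SUPPLY = 24.0   # V, power supply in the instrument rack
V_TX_MIN = 10.0   # V, minimum the transmitter needs at its terminals
SENSE_R = 250.0   # ohms, receiver's sense resistor

# Worst case is full scale, 20 mA, when the resistive drops are largest.
I_MAX = 0.020     # A

# Voltage left over for the cable after the transmitter and the
# sense resistor take their share:
v_spare = V_SUPPLY - V_TX_MIN - I_MAX * SENSE_R
max_cable_r = v_spare / I_MAX
print(max_cable_r)   # 450.0 ohms of cable resistance the loop can tolerate
```

This also hints at why the zero is 4 mA rather than 0 mA: even at the bottom of the range, the transmitter's electronics still have 4 mA flowing through them to run on.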

daestrom

Reply to
daestrom

Why did Fahrenheit define his temperature scale the way he did? What does it matter? Many devices use 2 mA, 3 mA, 21 mA, etc. for diagnostic purposes. I'm not sure of the origin, though. A lot of old scales like that are defined arbitrarily. hap

Reply to
hapmyster

He set 100 to his body temperature. He set zero to the coldest solution he could get using ice and salt.

Of course, he was off. But that is how technology begins. The first transistor was not the marvel you have access to now.

Al

Reply to
Al
