Heat buildup in wiring

Hello:

Just wanted to get some general opinions on the following, which was being discussed:

The statement:

30 amps will develop 2 1/4 times more heat than 20 amps in a conductor

The data to backup the claim:

Most of us know that the power developed is the product of voltage and current, and that voltage equals current times resistance. Hence:

P = V·I and V = I·R

Substituting: P = (I·R)·I, so P = I²·R

The resistance of the conductor hasn't changed, but the current has changed from 20A to 30A, and 30 = 20 × 1.5. So:

P = (1.5·I)²·R = 1.5² · I²·R

1.5² = 2.25, so P = 2.25 · I²·R, i.e. 2.25 times the heat developed at 20A.
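
A quick numeric check of the algebra, as a small Python sketch (the resistance value is arbitrary; the ratio doesn't depend on it):

    # Heat developed in a conductor: P = I^2 * R
    R = 0.01             # ohms -- arbitrary example value for a run of wire
    P20 = 20**2 * R      # watts dissipated at 20 A
    P30 = 30**2 * R      # watts dissipated at 30 A
    print(P30 / P20)     # -> 2.25, independent of R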

To me this is a wee bit simplistic; it doesn't take into consideration the size of the conductor, ambient temperature, length of run, etc.

Reply to
Terry

It is correct as to the heat developed (generated) by the higher current. If you want the temperature produced, that does depend on how the heat is dissipated from the conductor, which includes the factors you mention as well as others.

Reply to
VWWall

It may seem simplistic, but that's the answer.

Now what isn't so simple is just how hot the conductor gets when carrying 30A vs 20A. Maybe that's what you're thinking of?

With 2.25 times the heat generated, you should get 2.25 times the temperature rise above ambient if all other factors are the same. So if you had a 20C rise in conductor temperature over ambient at 20A, you should expect a 45C rise at 30A.
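
A rough sketch of that scaling, assuming (as above) that the steady-state rise over ambient is proportional to the heat generated; real dissipation isn't perfectly linear, so this is illustrative only:

    # Assumed: temperature rise above ambient scales with dissipated power.
    rise_20A = 20.0                    # deg C over ambient at 20 A (example)
    scale = (30.0 / 20.0) ** 2         # power ratio = 2.25
    rise_30A = rise_20A * scale        # -> 45.0 deg C over ambient at 30 A
    print(rise_30A)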

But it's actually a bit worse than that, because at 30A, with the conductor running hotter, its resistance is somewhat higher. So the power dissipated is slightly more than 2.25 times that at 20A.
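
A rough sketch of that second-order effect, iterating resistance and temperature to a self-consistent value. Copper's temperature coefficient of resistance is about 0.00393 per deg C near room temperature; the thermal resistance here is a made-up illustrative number:

    alpha = 0.00393      # copper tempco of resistance, per deg C (approximate)
    R_ref = 0.01         # ohms at ambient -- example value
    theta = 0.5          # deg C of rise per watt shed -- made-up value
    I = 30.0
    rise = 0.0
    for _ in range(20):                   # fixed-point iteration, converges fast
        R = R_ref * (1 + alpha * rise)    # resistance at the current temperature
        rise = theta * I**2 * R           # rise produced by that resistance
    print(rise)          # slightly more than theta * I**2 * R_ref = 4.5 deg C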

If the temperature rise at 20A is larger because of insulation, conduit, being embedded in a wall, etc., then the rise at 30A will still be 2.25 times as much, but that can be too much. For example, if the temperature rise at 20A was 40C, then carrying 30A would make the rise 90C. And that's probably too high for the equipment and insulation.

And of course if the ambient temperature is abnormally high, adding the same rise on top of it (either 45C in the first case or 90C in the second) can still push the resulting temperature over some limit. That's why cabling is de-rated for high ambient temperature installations.
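
A sketch of that check, comparing total conductor temperature against an insulation limit; the 90C figure is a common insulation rating, used here only as an example:

    insulation_limit = 90.0      # deg C -- a common insulation rating (example)
    ambient = 50.0               # deg C -- an abnormally hot location
    rise_30A = 45.0              # deg C, from the 2.25x scaling above
    total = ambient + rise_30A   # 95 deg C
    print("over limit" if total > insulation_limit else "OK")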

daestrom

Reply to
daestrom

Simplistic or not, this description is quite accurate. The biggest error will arise if R is not constant. The largest variation in R is likely to come from a change in temperature of the conductor.

Simple does not mean wrong.

Bill

Reply to
Salmon Egg

Same effect with heating at contact and connection resistances, too. That's why really good connections are essential in high-current circuits. When heating does start, it often degrades the quality of the connection, raising the resistance. That generates more heat, the two effects fuel each other, and you get a runaway, with the connection grossly overheating and failing.
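
A toy model of that feedback loop in Python; every number here is invented purely to show the runaway mechanism, not to be physically realistic:

    # Contact resistance degrades as it heats, heating raises dissipation,
    # dissipation raises temperature: positive feedback.
    I = 30.0          # amps through the connection
    theta = 2.0       # deg C of rise per watt (poor heat sinking at a contact)
    R0 = 0.005        # ohms, initial contact resistance
    k = 0.15          # fractional resistance increase per deg C (degradation)
    rise, R = 0.0, R0
    for step in range(15):
        rise = theta * I**2 * R       # heating from the present resistance
        R = R0 * (1 + k * rise)       # contact degraded by that temperature
    print(rise)       # grows without bound here: feedback gain > 1, runaway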

And also (in the UK at least, and probably most places) there is derating for cable grouping: running multiple conductors together, where mutual heating effects result in a higher temperature rise.

Reply to
Andrew Gabriel

The assumption is that the same conductor is used at 30A as at 20A. You could do it, but in most jurisdictions the codes take the change in heating into account and dictate appropriate wire sizes.
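
For illustration, a minimal lookup in the spirit of such a table. The ampacities below approximate a common code table (e.g. the NEC 60C copper column), but real work should use the actual code, including its derating rules:

    # Minimum copper conductor size (AWG) by circuit ampacity -- illustrative.
    min_awg = {15: 14, 20: 12, 30: 10, 40: 8, 55: 6}

    def wire_for(amps):
        for rating in sorted(min_awg):
            if amps <= rating:
                return min_awg[rating]
        raise ValueError("beyond this toy table")

    print(wire_for(20))   # -> 12 AWG
    print(wire_for(30))   # -> 10 AWG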

Reply to
dhky


Descriptions of reality are simplistic, starting before Ohm's law. As you increase the current through a conductor, the conductor's resistance USUALLY increases. Eventually, the conductor melts. Ask any fuse what happens to Ohm's law when it blows.

Bill

Reply to
Salmon Egg
