I'm a system engineer responsible for specifying power quantities for
my systems. There are a lot of ways that are accepted in my industry
to do this that I know to be incorrect, so I'd like to hear from
someone with more specific expertise.

The systems comprise a combination of 120 V resistive loads and 208 V resistive and inductive loads. The devices with inductive loads actually operate at various voltages, but their power supplies accept 208 VAC. Most of the power available is 208/120 VAC at 60 Hz, usually in 400 A increments.

What a lot of people do is add up the load P, divide by the operating voltage, and divide by 3. So if we had ten 500 W loads at 120 V and five at 208 V, it would look something like:

((5000/110) + (2500/203))/3 = 20 A needed.
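For what it's worth, here is that arithmetic spelled out as a quick check (a sketch only; the 110 V and 203 V figures are the under-load voltages used above):

```python
# Naive estimate: sum each load group's current at its (sagged)
# operating voltage, then split the total across 3 phases.
watts_120 = 10 * 500  # ten 500 W loads on the 120 V legs -> 5000 W
watts_208 = 5 * 500   # five 500 W loads on 208 V      -> 2500 W

amps = (watts_120 / 110 + watts_208 / 203) / 3
print(round(amps, 1))  # ≈ 19.3 A, commonly rounded up to 20 A
```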

I can't imagine that's correct. What I have done, based on advice and reading, is divide the total load P by 1.73, a relevant power factor, and the supply voltage, like this:

(5000 + 2500)/(1.73 × 0.9 × 203) = 24 A needed.
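In code, that P / (1.73 × PF × V) calculation looks like this (a sketch; the 0.9 power factor and 203 V under-load voltage are the example values assumed above):

```python
# Three-phase estimate: I = P / (sqrt(3) * PF * V_line-to-line)
total_watts = 5000 + 2500   # all loads lumped together
power_factor = 0.9          # assumed aggregate power factor
v_line = 203                # line-to-line voltage under load

amps = total_watts / (1.73 * power_factor * v_line)
print(round(amps))  # ≈ 24 A
```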

But I can imagine that being wrong too. What is really the correct way to do this?

Thanks in advance.
