I am new to this group. Just wanted to ask a possibly stupid question :) Someone once told me that it would be cheaper, in terms of the monthly electric bill, to run the same appliance at 240V instead of 120V. For instance, if I have the same 100W light running from a 240V (split-phase) source, it would supposedly be cheaper and my meter would count less than if I ran the light from 120V. Does this sound true? I live in the United States, BTW; I don't know if that makes a difference in terms of voltages.
I am going to run a dedicated power line for my servers. They can run on 240V or 120V; the power supply doesn't care. So I was wondering whether it would be cheaper to run a 240V line.
Your load must be rated for the voltage applied. If a light bulb is rated for 110V and you put 220V across it, it will draw roughly twice the current it is rated for and burn out.
If you have a dual-voltage device and wire it for the higher voltage, it will draw half the amperage it draws when wired for and run at the lower voltage, but the wattage will be the same (volts x amps). That's all in the way the wires in the motor or transformer are sized, wound, and arranged; it's not magic.
The two notions present opposite scenarios, however. A light bulb rated for 110V run on 220V will draw more amperage than at 110V, not less, unlike a motor rewired for a different voltage. That's because the motor is an impedance load while the light bulb is a resistance load; it has essentially no impedance beyond its resistance.
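To make the resistance-load case concrete, here is a minimal sketch of the ideal-resistor arithmetic for a hypothetical 100W, 120V bulb run at double voltage. (A real filament's resistance rises as it heats, so the actual current is somewhat less than the math below; the direction of the effect is the same.)

```python
def resistive_load(rated_watts, rated_volts, applied_volts):
    """Current and power for a fixed resistance sized from its rating."""
    r = rated_volts ** 2 / rated_watts   # R = V^2 / P
    i = applied_volts / r                # Ohm's law: I = V / R
    p = applied_volts ** 2 / r           # P = V^2 / R
    return i, p

i_rated, p_rated = resistive_load(100, 120, 120)
i_over, p_over = resistive_load(100, 120, 240)
print(i_rated, p_rated)   # ~0.833 A, 100 W at rated voltage
print(i_over, p_over)     # ~1.667 A, 400 W: double current, quadruple power
```

Doubling the voltage on a fixed resistance doubles the current and quadruples the power, which is why the overdriven bulb burns out.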
You may want to sort those terms out on Google.
There are other issues beyond the scope of this remark, related to power factor correction. That doesn't concern most smaller applications, but it is an issue.
If you are asking about what the meter reads, no difference.
There is a small savings from lower resistive loss when using the higher voltage, because the current is less. Let's say you have 1800 watts of servers. At 120V they draw 15 amps. If they are at the end of a 50-foot run of #12 wire (100' round trip), the wire resistance is about 0.187 ohms. I squared R means there is about 42 watts of heating in the wire. If you use 240V they'll draw 7.5A, so the wire heating is 10.5 watts, a savings of 31.5 watts. (If you try to save a little money by using #14 on the 240V circuit, the heating is 16.7 watts, still a 25W saving.) At 10 cents/kWh, with the servers running 24/7, that 31.5-watt savings saves about $2.27 per month. (The servers themselves will cost almost $130 per month.) A double-pole breaker (for 240V) is also a little more expensive than a single-pole one.
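The I²R comparison above can be sketched in a few lines; the wire resistance figure (about 0.187 ohm for the 100' round trip of #12) is the one used in the post, and the 30-day month is an assumption.

```python
def wire_loss_w(load_w, volts, wire_ohms):
    """I^2 * R heating in the supply conductors for a given load."""
    amps = load_w / volts
    return amps ** 2 * wire_ohms

loss_120 = wire_loss_w(1800, 120, 0.187)   # ~42 W
loss_240 = wire_loss_w(1800, 240, 0.187)   # ~10.5 W
savings_w = loss_120 - loss_240            # ~31.5 W
monthly_usd = savings_w / 1000 * 24 * 30 * 0.10   # $0.10/kWh, 30-day month
print(round(loss_120, 1), round(loss_240, 1), round(monthly_usd, 2))
```

Halving the current quarters the wire loss, since the loss goes as the square of the current.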
You slipped your disco on this one. In this case it's 1/2 the current: voltage up, amps down.
Never mind, the neutral issue. Not like you to make a mistake like this.
Running the servers at the highest voltage available will usually mean smaller wire, depending on the distance and the load. The wire may be cheaper, but the plugs, cords, and everything else may bite you in the ass. If this were the only 240V electronic appliance in the home, I would be hesitant to install 240V. Several dedicated 120V circuits would be a lot easier to explain to the real estate folks when you sell.
You ain't gonna save a nickel on the power bill either way.
The power line at 240V will have to carry only half the current a 120V line would need. This means you might be able to use smaller-gauge, cheaper wire. In practice, such circuits tend to be either 15A (14 AWG) or 20A (12 AWG). We'd need to know the physical details of the installation, future expansion plans, and present circuit availability to price out the two feeds. Labor costs would probably outweigh the cost of materials either way. Some server power supplies are somewhat more efficient running on 240V, and this might result in a slight power saving.
Using a higher voltage means using less current for the same wattage at the load. This can result in some savings in heat loss in the conductors. Of course, the loads would have to be replaced or rewired for the higher voltage, and the savings probably are not significant for an existing installation after considering the total cost of renovation. Fortunately, though, NM(B) cable is rated at 600 volts and can be used at 120 or 240 volts. There is an interesting article, put out by the copper organization, on how much can be saved by increasing the size of conductors. The same concept could be applied to increasing the voltage to lower the amperes. The article is of interest. It is titled One Size Up Means Big Savings and is at:
REF: "Installing wire only one size larger than has been required by the National Electrical Code increases energy efficiency with dramatic paybacks. This simple technique can yield quick paybacks while increasing the flexibility of the installation. By increasing the wire size, reduced power losses offset the cost of the wire and produce savings on energy costs."
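As a rough illustration of the one-size-up idea, here is a sketch comparing I²R loss on a hypothetical 50-foot run (100' of conductor) carrying 16A continuous, wired in #12 versus #10. The resistance figures are approximate DC values for copper at 20°C (about 1.59 and 1.00 ohm per 1000 ft) and are my assumption, not numbers from the article.

```python
def run_loss_w(amps, ohms_per_kft, conductor_ft):
    """I^2 * R loss for a run, given resistance per 1000 ft of conductor."""
    r = ohms_per_kft * conductor_ft / 1000.0
    return amps ** 2 * r

loss_12 = run_loss_w(16, 1.59, 100)   # ~40.7 W in #12
loss_10 = run_loss_w(16, 1.00, 100)   # ~25.6 W in #10
print(round(loss_12 - loss_10, 1), "W saved by going one size up")
```

On a continuously loaded circuit, that steady difference is what pays back the extra copper over time.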
I have slipped discos before, especially with redheads who drink... then run around on me. The last one brought one of the Indy 500 racers home one night. That was Kimbra... whatta babe. Drove me nuts, that woman... had her Victoria's Secret undies hanging to dry all over the house... the entire place smelled like... well, you know.
Let's run the math and see.
Setting the resistance at 8 ohms (an arbitrary value; it depends on the size of the resistance load, as you know) and the voltage at 110V for one run and 220V for the other, Ohm's law gives 13.75 amps at 110V and double that, 27.5 amps, at 220V. So the higher voltage drives more amps through a fixed resistance, not fewer.
Ohm's law, mashed together with the watts formula: watts = volts x amps.
You held the watts constant, by assumption. On that basis, when volts go up, amps have to come down according to the watts formula, as you knew. That's the dual-voltage motor case, not the fixed-resistance case.
But with the overcooked light bulb, high voltage applied beyond the design range, the wattage goes up (as I mentioned intuitively somewhere else in this screed): more electrons are shoved across the element, and the amperage goes up.
A 110V-rated bulb with 220 volts applied glows much brighter. The wattage doesn't just double, it quadruples: P = V²/R, and V has doubled. It won't last, though.
This is where I had myself confused. If the wattage had only doubled, the nice man's formula (watts = volts x amps) would show the amperage unchanged across the same 8-ohm element at either voltage. But both formulas together give double the amps and quadruple the watts: W = V x I = (2V) x (2I) = 4W.
Someone with more time can double-check this, I'm sure.
His calculator shows the amperage is up also, double actually, which matches my observation over the years for most resistance loads (as opposed to transformer or motor loads): the higher voltage pushes more electrons across the light bulb element, so you get more wattage from the bulb at the very least. Double the amps follows straight from Ohm's law, I = V/R, with R held fixed. The real-world caveat is that a tungsten filament's resistance rises as it heats, so the actual current ends up somewhat less than double; that's the non-linear "choking" of the flow I was groping at. There are also frequency issues not considered in that set of formulas.
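Working the 8-ohm example through directly: for a fixed resistance, doubling the voltage doubles the current (Ohm's law) and quadruples the power (P = V²/R). A minimal sketch:

```python
R = 8.0  # ohms, the arbitrary value chosen above

def amps_and_watts(volts, r):
    """Current and power for a fixed resistance at a given voltage."""
    i = volts / r        # Ohm's law: I = V / R
    return i, volts * i  # P = V * I, equivalently V^2 / R

for v in (110.0, 220.0):
    i, p = amps_and_watts(v, R)
    print(f"{v:.0f} V: {i:.2f} A, {p:.1f} W")
```

This prints 13.75 A / 1512.5 W at 110V and 27.50 A / 6050.0 W at 220V: twice the amps, four times the watts.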
I don't have time to take a close look just now, but I will one of these days.
In motors, though, with the option of wiring them for either voltage, we get the result you mention, half the amps. Why? Because the wattage of the motor must remain the same or the little sucker will burn up, so it's wound at the factory with either option.
In the case of our light bulb with its 8-ohm element designed to run on 110V, it will work great on 110V. But with more force applied, and the element not sized for the higher voltage, more current will be forced through the skinny little element, toasting it, hurting yer eyes, and running GE light bulb sales up.
What creates this common confusion is not just the motor-load vs. straight-resistance-load issue, but Ohm's law being taken out of context and combined with the watts formula, watts = volts x amps.
You were in effect asserting that the watts would remain constant, and of course with the added voltage across a fixed resistance the watts cannot remain constant; both volts and amps rise.
The glitch was parsed for us by the nice man who gave us that easy web site that calculates all four quantities (volts, amps, ohms, watts) from any two given, combining Ohm's law and the watts formula rather than using either one alone.
I addressed the issue intuitively, though: more force on the light bulb element (higher voltage) shoves more electrons across it (more amps).
Why doesn't this happen in a motor, or at least not to the same degree? Impedance, one of several different aspects of "back EMF" (there are others): the energized motor winding spins a rotor that then imparts a counter-flow back into the winding. That back-EMF calculation is not shown by Ohm's law alone; it lives in the other aspects of motor design, magnetic flux formulas, etc., all of which we think we understand, but in actuality we have no clue. We are still trying to fathom Maxwell's equations on that range of issues from the 1860s.
Junk food and chemical farming have destroyed our brains, and apparently we have ceased to advance. We have no unified field theory on any of that to date; we operate only on the empirically derived information we have, workable at lower levels but not complete.
We, you and I, are the drooling cavemen of the year.
One size up is not practical for 20A circuits, which are required more often now. Real savings might be had by going to "loop" (ring) wiring as used in Great Britain.
Long ago, I signed off on plans for an electrical contractor. Most of his work was residential, and he used 12-gauge wire even for 15A circuits. He claimed that not having to carry both 12 and 14 gauge wire offset the extra cost of the larger wire. Copper prices soon went high enough that he had to go back to using both sizes.
I think No. 12 on 20-ampere 120-volt circuits will be around for a long time. If a practical superconductor is ever invented, and if practical nuclear fusion is ever accomplished, we should see a major change in the industry. But until then, I think we are stuck with the same old copper used in the same old configuration as has been the case for the last 70 years or more. Sizing up is not practiced, nor is increasing the voltage from 120 volts to 240 volts. Also, 15-ampere circuits are rare in most residential wiring that I have seen and done.
Thank you guys for the help. I was just thinking there was some magic loophole in the meter that would let me save money by running 240V. I realize that regardless of the voltage I will still draw about the same number of watts. Even with less resistive loss it's really not worth it, especially because I would need to run a second 120V line for the 120V stuff. In the end it's OK though; I'll just run the 120V dedicated line.
I was wondering because I actually run about 800W of servers 24/7, and it adds up at the end of the month.
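For a rough sense of how an 800W continuous load adds up, here is a back-of-envelope sketch; the $0.10/kWh rate and 30-day month are assumptions, so substitute your own tariff.

```python
def monthly_cost_usd(watts, rate_per_kwh=0.10, hours=24 * 30):
    """Energy cost of a constant load over a 30-day month."""
    kwh = watts / 1000.0 * hours
    return kwh * rate_per_kwh

print(f"${monthly_cost_usd(800):.2f}")  # $57.60 at 10 cents/kWh
```

At 576 kWh a month, the load itself dwarfs any wiring-loss savings from switching to 240V.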
A few years ago some "arc flash" links were posted to the newsgroup, having to do with the utility companies' practice of doubling up on transformers as area loads increased, putting greater arc-flash potential at the user's service location, and discussing at length the relative differences in hazard close to the panel, as compared to at a distant fixture wired with #12 or #14 wire. There is apparently new code now for how service panels are to be marked and personnel allowed access on larger commercial installations.
Those articles discussed the wide range of factors involved in circuit-interrupt devices, the time delays involved, and the exponential increase in energy from a dead short as interrupt times increase; all that to say that fuse and interrupt protection is very limited in many scenarios, as you probably know.
Accordingly, those articles (professional journals, I think) recommended minimizing wire size, especially close to the service entry. Would that help with 00 wire? Not enough to fry you any slower. But on 110V receptacles close to the service, say 2' away on #10 wire, the arc-flash potential would be significant compared to 100' away on 12-gauge wire.