What Happened to 220?


Sorry 'bout that electrifying mistake. Here is the post again:

Hello. Back in the old days there was only 110 (or was it 120?) and 220 for all practical purposes (in the home). Now there is this new-to-me 240. What is this 240, and what happened to 220 (for that matter, what happened to 110)? I am purchasing some electric baseboard heaters that are "220", so what happens now? Do I have the house wired for 220 for the heaters, or will 240 do it? Here are the heaters (HYDRO-SIL's awful, slow website):

formatting link


Reply to
Opinion Seeker

Umm, "220" went bye-bye about 50 or more years ago, when the nominal voltage supplied to homes was raised from 110/220 V to 120/240 V, possibly in steps (I've seen 115 V and 117 V listed as the standard voltage). For some odd reason nearly everyone, including professional electricians, refers to the 240 V two-hot supply as "220". 240 V is what you should see on the spec plate of every non-ancient "220" device. (You might occasionally see 208 V/240 V, which is also OK, though 208 V-only devices may be overstressed at 240 V, or won't work at all because they require three-phase.)
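For anyone wondering where the 208 figure comes from: it is the line-to-line voltage of a three-phase wye service with 120 V to neutral, whereas residential 240 V is two 120 V legs of a split-phase service. A quick back-of-the-envelope check (illustrative Python only; nothing here is from the post):

    import math

    v_ln = 120.0                     # line-to-neutral voltage on each leg
    v_wye = v_ln * math.sqrt(3)      # line-to-line on a three-phase wye service
    v_split = 2 * v_ln               # line-to-line on a split-phase (residential) service
    print(f"three-phase wye, line-to-line: {v_wye:.0f} V")    # ~208 V
    print(f"split-phase, line-to-line:     {v_split:.0f} V")  # 240 V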

Reply to
Michael Moroney

The original 110 V was Thomas Edison's best voltage for safety and for the operational characteristics (resistance) of the carbon filament lamps at the time. The voltage also had to be high enough to overcome the voltage drop from the DC central stations extensively used back then; since there were no transformers, the lamps could be operated only within a mile or two of the central station without excessive dimming. Initially, Edison believed only in DC, until Westinghouse made him see the light (literally). Only AC systems allowed transmission and conversion over extremely long distances.

As modern AC systems evolved, the higher voltage (120 V) was specified as a standard, as it allowed more efficiency (less voltage drop) and allowed more power to be transferred without too much of a compromise on safety.
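To put rough numbers on the voltage-drop point, here is a minimal sketch (Python; the 1500 W load and 0.2 ohm of round-trip wire resistance are assumed values, not from the post):

    # Same load power delivered at different nominal voltages over the same wire run.
    p_load = 1500.0    # load power in watts (assumed)
    r_wire = 0.2       # round-trip resistance of the branch wiring, ohms (assumed)

    for v in (110.0, 120.0, 240.0):
        i = p_load / v           # current needed for the same power
        drop = i * r_wire        # voltage lost in the wiring
        loss = i**2 * r_wire     # power wasted heating the wiring
        print(f"{v:5.0f} V: {i:5.2f} A, drop {drop:4.2f} V, wire loss {loss:5.1f} W")

Higher supply voltage means less current for the same power, so less of the voltage (and power) is lost in the wire.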

During the 1930s, the US 110/220 V system was developed and promoted by the U.S. Rural Electrification Administration (a unit of the Department of Agriculture) as the best way to electrify rural America, and it became the North American standard.

European systems at the dawn of the 20th century were dominated by the standards set in Germany, where the great manufacturers (Siemens, etc.) set the pace, and it was decided that 220 V at 50 Hz would be the way to go. The higher voltage meant that systems could be built with less copper (copper was a critical wartime commodity, and for the most part Germany was either fighting or preparing for the next war), smaller transformers (less iron required at 50 Hz than 60 Hz), and greater efficiency, with a correspondingly greater shock hazard. A shock from a 220 V circuit hurts a lot more than one from a 110 V circuit, although it is easy for either one to kill you.

Beachcomber

Reply to
Beachcomber

I thought it was the reverse of that. Higher frequency gives a lower volt-second integral, therefore less flux and less iron required. I think 400 Hz systems are used in some applications to get size and weight down.
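The usual transformer EMF equation makes that concrete: E_rms = 4.44 * f * N * B_max * A_core, so for a fixed voltage, turns count, and flux density, the required core cross-section falls as 1/f. A rough sketch (Python; the 200 turns and 1.2 T are assumed figures, not from the thread):

    # Required core cross-section for a given RMS voltage: A = E / (4.44 * f * N * B_max)
    def core_area(e_rms, freq, turns, b_max):
        return e_rms / (4.44 * freq * turns * b_max)

    # Assumed winding: 120 V, 200 turns, 1.2 T peak flux density.
    for freq in (50.0, 60.0, 400.0):
        a = core_area(120.0, freq, 200, 1.2)
        print(f"{freq:5.0f} Hz: core area ~ {a * 1e4:.1f} cm^2")

So a 50 Hz design actually needs a bit more iron than a 60 Hz one, and a 400 Hz design needs far less, which is why aircraft use it.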

j
Reply to
operator jay

You're correct. Many US appliances are designed to work at 50/60 Hz, including most military equipment. This requires a slightly larger transformer (iron) core, but with proper windings it makes appliances that can be used in Britain and Europe. I was in Italy during WWII, and the frequency sometimes went as low as 42 Hz! We would have to switch to diesel generators.

400 Hz has long been used in aircraft because of the much lower transformer weight. Much WWII military surplus was equipment designed for 400 Hz.

The standard ATX computer power supply doesn't use a line transformer, so it will work at any frequency.

Reply to
VWWall

Inflation has raised it to 240.

Bill

Reply to
salmonegg

That's certainly not the reason in the UK. The initial local generating stations in cities were normally 100 V (or slightly higher, to allow 100 V to be obtained at the ends of the local cable runs). Most city centres passed bylaws forbidding overhead cables, so the cables were buried under the streets. As the power required rapidly increased, it wasn't easy to string thicker cables as was done in the US, so instead the voltage was doubled.
Reply to
Andrew Gabriel

You are right.

Reply to
Don Kelly

Really? Try running one at 25 Hz. The input filter caps will not remove enough ripple for proper operation at the rated output.
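Rough arithmetic behind that objection: the peak-to-peak ripple on a full-wave rectified bulk capacitor is roughly I / (2 * f * C), so it scales inversely with line frequency. A sketch with assumed numbers (the 0.5 A draw and 470 uF bulk cap are not from the post):

    # Approximate ripple on the rectified bulk capacitor of an off-line supply.
    i_dc = 0.5        # current drawn from the bulk cap, amps (assumed)
    c_bulk = 470e-6   # bulk capacitance, farads (assumed)

    for f_line in (60.0, 50.0, 25.0):
        ripple = i_dc / (2 * f_line * c_bulk)
        print(f"{f_line:4.0f} Hz line: ~{ripple:4.1f} V peak-to-peak ripple")

At 25 Hz the ripple is well over twice what a 60 Hz design sees, so a supply that is happy at 50/60 Hz can fall out of regulation there.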

Reply to
Michael A. Terrell

First, sorry for my English, I'm French...

The voltage was increased from 110 V to 120 V to get better performance, i.e. a lower current for the same power. If the current decreases, the voltage drop in the wires is lower and the load has more power available.

Now, your old wires are able to support 120 V or 240 V, because the minimum insulation rating of the wire is 300 V. Also, as I wrote before, an increase in voltage means a decrease in current. The wire size (diameter) is established by the current passing through the wire. If the current in the wire decreases, your old wire can have more margin than a new one sized for the present current...
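As a quick illustration of that wire-sizing point (Python sketch; the 1800 W load and the 20 A ampacity are assumed example values, not from the post):

    # An existing run re-used for the same load power at a higher voltage.
    p_load = 1800.0    # load power in watts (assumed)
    ampacity = 20.0    # ampacity of the existing wire, amps (assumed)

    for v in (120.0, 240.0):
        i = p_load / v
        print(f"{v:5.0f} V: {i:5.1f} A drawn, {ampacity - i:4.1f} A of headroom on a {ampacity:.0f} A wire")

Halving the current by doubling the voltage leaves the same old wire running much cooler, which is the point about the old wiring having more margin.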

One important thing: do not use aluminum wires, because the screw connections can loosen (slack off) with time and cause many problems.

Regards

Steve

Reply to
Steve & Julie
