Cost of electricity for light dimmer

I have an ordinary room lamp. In the mains lead of the lamp there is a dimmer device. It is continuously variable from very dim to full strength.

If I set the dimmer to give me a dim light, do I then pay less for the electricity than if I used the light bulb at full strength?

Thanks for any info. Z.T.

--------

I don't know if it makes any difference but I am in the UK.

Reply to
Zarbol Tsar

Yes, but you pay more than you would by using a lamp of lower wattage.

For example, with a 100 watt bulb and the dimmer set to deliver 50 watts, the lamp may give out roughly the lumens of a 25 watt bulb.
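In money terms, a rough Python sketch (the 10 p/kWh tariff, the usage pattern, and the square-law light model are illustrative assumptions, not figures from this thread):

    TARIFF_P_PER_KWH = 10.0          # assumed electricity price, pence/kWh
    HOURS = 4 * 30                   # say, four hours a night for a month

    def monthly_cost_pence(watts):
        # Running cost in pence for a steady load of `watts` over HOURS.
        return watts / 1000.0 * HOURS * TARIFF_P_PER_KWH

    full = monthly_cost_pence(100)   # 100 watt bulb, undimmed
    dimmed = monthly_cost_pence(50)  # same bulb, dimmer set to 50 watts
    small = monthly_cost_pence(25)   # a 25 watt bulb, undimmed

    print(f"100 W full:    {full:.0f} p/month, full light")
    print(f"100 W dimmed:  {dimmed:.0f} p/month, ~{0.5**2:.0%} light (square law)")
    print(f"25 W bulb:     {small:.0f} p/month, similar light to the dimmed bulb")

On this estimate the dimmed bulb halves the bill, but the 25 watt bulb gives about the same light for half the dimmed cost.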

Reply to
someone

Do you mean that if the dimmer is set at its midpoint to 50 watts (instead of the full power of 100 watts), then I pay only for 50 watts of consumption?

Reply to
Zarbol Tsar

That is true if your dimmer is truly linear. I imagine they really start a bit above zero, since they need enough power to give some light out of the filament. You are saving money with a dimmed light.

Reply to
Greg

The answer depends on the dimmer. Some dimmers just burn off the excess voltage as heat. Ever notice the huge heat sinks on some commercial dimmers?

I will bet that if you try a 100 watt lamp and measure it at 100% and then again at 50%, you're not going to see much of a difference in the usage (measured ahead of the dimmer).

I once had a museum customer that had huge amounts of track lights on dimmers. They would put in 100-150 watt bulbs and then dim to around 50%. Dimmers toasted every 6 months. The load was only about 80% of the rating of the dimmer. I convinced them to change to 75 watt bulbs and dim 25%. I have not replaced a dimmer in 5 years. I have no idea if there was just a bad batch of dimmers or what. Under constant use, as far as I am concerned, dimmers are not my friend. I use levels of light, which means more fixtures in my home. Just my view from the cheap seats.

Reply to
SQLit

Yes, your consumption goes down so your bill goes down (or should). Unfortunately your light output goes down faster than your bill, because light bulbs can't be efficiently dimmed.

Reply to
Tim Wescott

I don't know where on the dial you would have to set it to get 50 watts, but the point is that you would be paying for the power consumed by a 50 watt bulb while getting only the light of a 25 watt bulb. From a money-saving point of view you are ahead to put in a 25 watt bulb (and pay for only 25 watts of power).

A dimmer is for aesthetic purposes. One little advantage is that bulbs usually last longer when run at low power, but the savings there are still tiny compared to the loss of lumens per watt.

Reply to
someone

You're both wrong - sorry, but that's it. A dimmer is *not* a potentiometer, so there's no energy burned off in it. Usually it has a triac (a simple power electronic device, like two thyristors in an anti-parallel connection).

-- Dimitris Tzortzakakis, Iraklion Crete, Greece; major in electrical engineering, freelance electrician; FH von Iraklion-Kreta, freiberuflicher Elektriker; dimtzort AT otenet DOT gr
"someone" wrote in message news:u4yfd.776372$ snipped-for-privacy@bgtnsc04-news.ops.worldnet.att.net...

Reply to
Tzortzakakis Dimitrios

The heat sink is for the triac.

-- Dimitris Tzortzakakis, Iraklion Crete, Greece; major in electrical engineering, freelance electrician; FH von Iraklion-Kreta, freiberuflicher Elektriker; dimtzort AT otenet DOT gr
"SQLit" wrote in message news:o3xfd.170670$a85.30219@fed1read04...

Reply to
Tzortzakakis Dimitrios

Here is my take on dimmers.

1) Dimmers do NOT burn off power to accomplish dimming. At least not the dimmers that we use in residential switching applications, and not the dimmers that have been available for 30 years or so.

2) There is a very small power loss in the dimmer, but that is NOT what causes the dimming. Rather, the current to the lights is switched on and off, lowering the average (effective) current, and thereby the incandescent bulbs are "dimmed." The small size of the dimmer in the wall box is what makes dimmers run hot. Higher wattage dimmers have the larger visible heat sinks because they can carry higher current and therefore dissipate more (but still small) power. (A short numeric sketch of this switching follows this list.)

3) Bulbs should be chosen to give the MAXIMUM level of lighting desired at a location (or the maximum fixture rating), and then dimmers can be used to allow LOWER levels of lighting when that is desired.

3a) A more efficient way to accomplish dimming is with switches. For example, a three bulb fluorescent fixture works nicely with two wall switches to allow one, two, or three bulbs on. The bulbs run at maximum efficiency whenever they are on. BUT this requires extra wiring and an extra switch, and it does not allow continuous dimming. (For fluorescent lighting, dimming is more difficult, so this method is especially nice.)

4) If the bulb wattage is ALWAYS higher than the desired level of lighting, then a lower bulb wattage should be used, since it gives more light per watt than a dimmed higher wattage bulb. For example, using a 60 watt bulb to get a certain level of LIGHT is more efficient than using a dimmed 100 watt bulb to get that same level of LIGHT. Look at the lumens ratings on the two bulb sizes to understand this: two 60 watt bulbs give about the same LIGHT output (lumens) as one 100 watt bulb.

5) Dimmed bulbs will last longer, since their operating temperature is lower. Also:

6) Using a ROTARY (or slide) on/off dimmer (which brings the light level up from zero) is better for the life of the bulbs, since it minimizes start-up surge current. I know there has been difference of opinion on this, but I am convinced that my opinion is fact. Do not buy the PUSH on/off dimmers; they will shorten bulb life, and may shorten dimmer life also (see #8).

7) Dimmers go bad when the loads they are controlling are higher than their ratings. Also, putting more than one dimmer in a box derates their power capability. For example, the standard 600 watt dimmer (with no extra heat sink) gets a smaller rating when more than one of them is in a ganged box.

8) Dimmers also go bad when a bulb filament sags and causes a high current surge as it burns out. This happens more often with the push on/off dimmers, since they cause a surge of current which mechanically breaks an old filament. Ever heard the "singing" of a bulb on a dimmer circuit? That is a similar mechanical vibration of the bulb filament.

9) A single standard 600 watt dimmer should last a long time on 400 watts or so. I know lots of examples of this.
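A minimal Python sketch of the switching in point 2 (assumptions: a pure sine supply and a fixed-resistance lamp, which a real filament only approximates, since its resistance rises with temperature):

    import math

    # Phase-cut dimming: the triac blocks the start of each half cycle and
    # conducts from the "firing angle" onward; no series resistance burns
    # off the difference as heat.
    def power_fraction(firing_angle_deg):
        # Fraction of full power reaching a fixed resistive load when the
        # triac fires at this angle (0 deg = fully on, 180 deg = fully off).
        a = math.radians(firing_angle_deg)
        return 1 - a / math.pi + math.sin(2 * a) / (2 * math.pi)

    for angle in (0, 45, 90, 135):
        print(f"firing at {angle:3d} deg -> {100 * power_fraction(angle):5.1f}% of full power")

Firing at 90 degrees delivers exactly half power, with the dimmer itself dissipating only its small switching losses - the 1-2% figure quantified later in the thread.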

OK, my time is up. What have I forgotten? --Phil

Reply to
Phil Munro

"Some dimmers just burn off the excess voltage as heat."

Not for the past 40 years they have not - they would be too expensive to produce.

"Ever notice the huge heat sinks on some commercial dimmers?"

I am sure that there will be plenty of respondents explaining the purpose of these (relatively small) heatsinks. Have you any idea of the size of heatsink required to 'burn off', say, 100 watts in air while keeping its surface temperature below hundreds of degrees?

Reply to
R.Lewis

Were you replying to a different post than those listed below?

The illumination output of a standard light bulb varies approximately as the square of the applied power. Cut the power in half and you get about one fourth the lumens. That is only approximate and varies somewhat with the type of filament.

Reply to
someone

This in particular is why you shouldn't use dimmers with lamps that are mounted so that the filament is above the socket. When such lamps burn out it is not uncommon for a broken piece of filament to fall and briefly short across the two thick wires that feed the ends of the filament.

Sylvia.

Reply to
Sylvia Else

You pay less with a given lightbulb being dimmed than with the same lightbulb not being dimmed.

However, if you use a lower wattage lightbulb or fewer lightbulbs you save even more. Incandescent lightbulbs are very significantly less efficient at producing light when dimmed.

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

When dimmed to half the power consumption, you pay for 50 watts while getting less light than a 40 watt lightbulb delivers. This gives you about 21% of the light of non-dimmed operation; due to some economies of scale this is brighter than a 25 watt lightbulb but still dimmer than a 40 watt one. (It takes about seven 25 watt lightbulbs to make as much light as a 100 watt one.) As for the midpoint of the dimmer's travel - it probably corresponds to something other than half power.

Maybe better off with compact fluorescents - get full light with around 25-33% of full power, and equivalent incandescent dimmed to that low a power consumption is around or just somewhat more than a nightlight.

Splurge a little and you can get a system with dimmable compact fluorescents. This requires bulbs, fixtures, special dimming ballasts, and dimmer controls to be compatible with each other - get the system as a whole. Disadvantages:

1) The lower limit is probably "brightish nightlight".
2) The color does not change to a warmer color the way incandescent does when dimmed. Color changes are much less and more irregular.

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

People fear the resistive losses in potentiometer (or rheostat) type dimmers that have been obsoleted by triac-based ones for decades.

However, a remaining issue is that incandescent lamps operate much less efficiently when dimmed. As a rough general rule, efficiency of a given lightbulb at producing visible light varies roughly with the square of power fed into it. (Roughly, only roughly that is.)

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

And, as an additional issue, it's generally a bad idea to use a dimmer with quartz-halogen lamps. When dimmed, the bulbs run a good deal cooler, the halogen-sequestering-and-redeposition of the tungsten doesn't work as well, the tungsten tends to plate out on the inside of the tube and dim the bulb, and the bulb lifetime is greatly decreased.

All in all, as others have said, it makes more economic sense to use smaller bulbs at full power rather than dimming a high-wattage bulb.

Reply to
Dave Platt

In article , Phil Munro wrote in part:

True, but this argues against dimming.

Better would be to point out that a 100 watt bulb dimmed to the brightness of a 60 watt one consumes about 73-74 watts. And that a 100 watt bulb dimmed to consume 60 watts produces about 21% of its full output, which is less than that of a 40 watt lightbulb.
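A quick cross-check of those two figures in Python (the exponent is fitted from the 50-watt data point given earlier in the thread; the lumen figures, roughly 1700 lm for a 100 watt bulb and 840 lm for a 60 watt one, are typical catalog values, not from the thread):

    import math

    # Fit k in: light_fraction ~ power_fraction ** k, using the earlier
    # "dimmed to 50 watts -> about 21% of full light" figure.
    k = math.log(0.21) / math.log(0.50)            # ~2.25

    # Power for a 100 watt bulb to match a 60 watt bulb's light output.
    target_light_fraction = 840 / 1700             # assumed catalog lumens
    power_needed = 100 * target_light_fraction ** (1 / k)
    print(f"k ~ {k:.2f}")
    print(f"power to match a 60 watt bulb: ~{power_needed:.0f} W")   # ~73 W

The result lands right on the 73-74 watt figure quoted above.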

Now for the bonus extra trickery: Using fewer lightbulbs is better than using lower wattage ones, as long as you get adequate light distribution. Higher wattage lightbulbs are slightly more efficient. There is more than one reason, but one is that thicker filaments can be operated at a higher temperature (better for radiating visible light as opposed to infrared) for a given life expectancy.

True, but among different ways of achieving a given light output, lower wattage (or, better still, fewer) undimmed bulbs will cost less than dimmed ones. In most of the USA, the cost of the electricity is so much more than the cost of buying replacement bulbs that it pays to consider energy efficiency.

A few bulbs benefit from "soft starting"; many, probably most, do not. Most (but not all) bulbs suffer zero or negligible fatigue damage to the filament from a "cold start", despite a cold start jolting the filament enough to cause a "ping" sound that is audible at close range. It is true that most incandescents fail at a cold start. However, for most models, the actual damage is caused mainly by operating hours. One thing that is true (for most lightbulbs) is that an aging filament becomes unable to survive a cold start just a little before it becomes unable to survive continuous operation.

There is a usual prelude-to-failure uneven evaporation of the filament. That process causes a "thin spot" that is subject to a temperature overshoot during a cold start. In most lightbulbs, such a "deadly thin spot" is a deterioration mode of the filament that accelerates at a rate worse than exponentially (during operation) once it becomes significant. This means that for most lightbulbs, when they become unable to survive a cold start their operating hours are numbered. And for most (but not all) lightbulbs, cold starts do zero to usually-negligible damage until the filament has aged enough for a cold start to be fatal.

True, but they usually don't blow out immediately unless the overload is very severe. Mild to moderate overload merely shortens their life.

True - the power rating of a dimmer usually assumes that there are no adjacent dimmers adding heat (of just a couple to a few watts - that is significant!).

Current surge from burnout is often worse than the current surge of a cold start. Often when the filament breaks, an arc forms across the gap. The arc can be encouraged by the voltage gradient across the filament to expand and go across the ends of the filament, in which case the filament is no longer limiting current through the arc. This is what causes the "bright blue flash" that sometimes occurs during a burnout, especially a burnout during a cold start (when the filament resistance is less and allows more current to flow through the arc which makes the arc hotter and more conductive). Most lightbulbs have fusible links in one of their internal lead-in wires so that a "burnout arc" does not pop a breaker or blow a fuse. However, this may be inadequate for protection of dimmers.

I believe that is probably true. It gets more uncertain when you have a 600 watt or 540 watt load on a 600 watt dimmer, and it gets worse when you put more than one dimmer in the same box, since each one adds heat to the others (despite the loss in each dimmer being only a few watts).

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

Hardly ever true - more like 1-2% of the line voltage times load current becomes heat.

Usually the heatsinks are for dissipation of heat amounting to only 1-2% of full load power!

Please consider that a 40 watt soldering iron, maybe as little as a 15 watt one, can get the heatsinks much hotter than any normal operation would!

A 100 watt bulb dimmed to half output consumes about 74 watts. With losses in the usual dimmers, this amounts to about 75 watts.
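For scale, the heat the dimmer itself must shed at that 1-2% loss, sketched in Python (1.5% is an assumed midpoint of the range):

    def dimmer_heat_watts(load_watts, loss_fraction=0.015):
        # Heat dissipated in the dimmer itself at an assumed 1.5% loss.
        return load_watts * loss_fraction

    for load in (75, 400, 600):
        print(f"{load:3d} W load -> ~{dimmer_heat_watts(load):.1f} W of heat in the dimmer")

Even a fully loaded 600 watt dimmer sheds only about 9 watts - small, but enough to matter when several dimmers share one ganged box.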

Most likely:

  1. The dimmers were not adequately conservatively designed
  2. Dimmers were placed close to each other or in the same box as each other so that they added heat to each other.

However, lower wattage bulbs dimmed less do indeed give the same light for less power consumption and less dimmer heating than you get with higher wattage bulbs dimmed more.

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

I was reading some promotional stuff on the Lumileds website. They pointed out that white LEDs have the advantage that they can be dimmed and the color temperature doesn't change; in other words, they don't have the disadvantage of incandescents that, as they are dimmed, the amount of light per watt diminishes.

Reply to
Watson A.Name - "Watt Sun, the Dark Remover"
