Resistance of linear halogen bulb when cold

I am trying to work out the wattage of some of my spare 118mm linear halogen bulbs. I only have access to an ordinary multimeter.

So I have measured the resistance of the bulb when cold.

(Q1) Is the resistance likely to change significantly from my cold reading compared to when the bulb is at operating temperature? Is there a rule-of-thumb multiplier for such bulbs?

(Q2) Is the filament material used likely to vary from manufacturer to manufacturer in a way that noticeably affects the relationship between the cold resistance and hot resistance of a bulb? (If you see what I mean.)

FWIW I am in the UK with 230 volt mains, and my cold resistance readings are 12.1 ohms, 13.5 ohms, and 8.1 ohms. Presumably the first two are 300W bulbs and the third is 500W.

Reply to
JS

Well, if you look at your 'cold' figures, you have your own answer. Assuming the first bulb is a nominal 300W, it should draw 300/230 = 1.3A, which gives a required hot resistance of 230/1.3 = 177R. You have measured it at 12.1R, a factor of about 14.6x. It will change, though. There are lots of variants of the 118mm bulbs (100, 150, 200, 250, 300 and 500W that I have seen), so your units may not offer the simple 300/500 choice that you think. Also, some manufacturers spec for 220V operation while others spec for 230V; in some cases this is just a 'paperwork' change, in others it is 'real'. In the past I have seen figures of around 14 to 15x touted as common.
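To put that arithmetic in one place, here is a minimal Python sketch. The 14.5x hot/cold ratio is the rule-of-thumb from this thread, not a manufacturer figure, and the helper name is my own:

    # Rough wattage estimate from a cold resistance reading.
    # HOT_TO_COLD is the thread's rule-of-thumb ratio, not a datasheet value.
    MAINS_V = 230.0
    HOT_TO_COLD = 14.5

    def estimate_watts(r_cold_ohms):
        r_hot = r_cold_ohms * HOT_TO_COLD   # estimated hot resistance
        return MAINS_V ** 2 / r_hot         # P = V^2 / R

    for r in (12.1, 13.5, 8.1):
        print(f"{r:5.1f} ohms cold -> about {estimate_watts(r):3.0f} W")

For the three readings above this prints roughly 300W, 270W and 450W, which fits the 300W guess for the first bulb; the third only reaches a nominal 500W if its ratio is a bit lower than 14.5.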

Yes, there is some variation between manufacturers, but on normal bulbs the biggest variation comes with changes in design. 'Rough service' lamps will have a slightly different factor to the normal lamps from the same manufacturer, and 'long life' bulbs will give a different figure again: these generally run at a slightly lower filament temperature, and have a smaller factor as a result. Your last two bulbs might be long-life variants, against a 'normal' example for the first unit.

Best Wishes

Reply to
Roger Hamlett

Yes, there is a huge change in resistance from cold to hot. The ratio is 16.44:1 between 20 C and 2727 C, but since it is a strong function of the filament operating temperature, any rule of thumb that does not specify the operating temperature will have an error. If my math is correct, a 300-watt 230-volt lamp would have a hot resistance of 176 ohms. If it operated at 2727 C, then the cold resistance should be about 10.73 ohms.

Most filaments are made from essentially pure tungsten, with some added materials to make the wire easier to draw and to strengthen the final filament. The change in resistance is therefore dominated by the properties of tungsten. The largest source of error is in knowing the operating temperature. In fact, the change in resistance of tungsten is so large and so predictable that the ratio of hot to cold resistance is used to determine the operating temperature in many lamps.
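As a rough illustration of that last point: tungsten's resistivity rises approximately as T^1.2 over the working range (the exponent is a common approximation I am supplying, not a figure from this post), so the operating temperature can be backed out of the resistance ratio like this:

    # Estimate filament temperature from the hot/cold resistance ratio,
    # assuming tungsten resistivity scales roughly as T^1.2.
    EXPONENT = 1.2

    def filament_temp_k(r_cold, r_hot, t_cold_k=293.0):
        return t_cold_k * (r_hot / r_cold) ** (1.0 / EXPONENT)

    # 300 W / 230 V lamp: hot resistance is 230^2 / 300, about 176 ohms
    t_k = filament_temp_k(12.1, 230.0 ** 2 / 300.0)
    print(f"estimated operating temperature: {t_k:.0f} K ({t_k - 273:.0f} C)")

With the 12.1-ohm cold reading this lands around 2730 K, a plausible halogen filament temperature, and the same power law approximately reproduces the 16.44:1 ratio quoted above for 2727 C.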

Reply to
Victor Roberts

No you aren't! That's a common misconception, but UK mains voltage is 240V and will not be changed in the foreseeable future. 230V is the NOMINAL European standard, but the standard includes tolerances that encompass the national standards of all the member countries. The point of the standard is that a European appliance should be safely usable anywhere in the EU, not that it will necessarily work properly.

You can buy 230V lamps, but if used on UK 240V they will burn a bit brighter and have a lifetime degraded to only 55% of their design life. Hence you should make sure that you are being supplied with the correct product for Great Britain. Buying from a reputable supplier is not necessarily a guarantee of this - not so long ago I was supplied a batch of 230V halogen theatre lamps by one of the major UK theatrical suppliers.
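For what it's worth, the classical incandescent re-rating laws reproduce that 55% figure fairly well. A minimal sketch, assuming the approximate textbook exponents (life ~ V^-13, flux ~ V^3.4, power ~ V^1.55; the exact exponents vary by source and lamp type, and are my assumption, not David's):

    # Effect of running a lamp off its rated voltage, using approximate
    # textbook re-rating exponents for incandescent lamps.
    def rerate(v_actual, v_rated):
        ratio = v_actual / v_rated
        return {
            "life": ratio ** -13,    # fraction of rated life
            "flux": ratio ** 3.4,    # fraction of rated lumens
            "power": ratio ** 1.55,  # fraction of rated watts
        }

    print(rerate(240.0, 230.0))  # a 230 V lamp on 240 V UK mains

This gives roughly 57% of rated life and about 16% more light for a 230V lamp on 240V mains, and about 1.7x rated life for the reverse case asked about below.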

David

Reply to
David Lee

Is that reversible? Would it make sense to order, say, projector ('beamer') lamps in the UK, rated for 240V, and get a lifetime of 1.5 times that of the "European" lamps when run on 230V AC?

Thanks for your reply, Ben


Reply to
Ben

Yes, the lower the voltage the longer the life, with qualifications...

  1. Compare design voltage to actual voltage, not just what you think you're getting.
  2. Color will shift dramatically. There is now research showing that limited color in light restricts visibility far more than a light meter shows.
  3. Efficiency goes out the window.
  4. If you dim too far, halogen lamps suffer. That's a long story...
  5. Greater vibration resistance, due to a cooler filament.

There are more I'm sure, but those are what come to mind.

Added bulb life has long been listed as an advantage of using dimmers. Note that most dimmers impose a small voltage drop even when set to maximum.

Reply to
RickR

Ben schrieb:

Yes it is. But you will lose a lot of the luminous flux you need. There is a nice diagram that shows the influence of voltage on luminous flux, power, current and lifetime. You have to go to:

formatting link
In the glossary list, scroll down to "incandescent lamp". In the pop-up window that opens, have a look at the second diagram, "Operating characteristics of incandescent lamps".

The problem with dimming incandescent lamps is that you reduce the power only a little but lose a lot of light output; the luminous efficacy, which is already very low, decreases much faster than the power does.
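A quick sketch of that trade-off, using the same assumed exponents as the re-rating sketch above (flux ~ V^3.4, power ~ V^1.55):

    # Light output falls much faster than power as voltage is reduced,
    # so luminous efficacy (light per watt) drops when dimming.
    for frac in (1.0, 0.9, 0.8, 0.7):
        flux = frac ** 3.4
        power = frac ** 1.55
        print(f"{frac:.0%} voltage: {power:.0%} power, {flux:.0%} light, "
              f"{flux / power:.0%} efficacy")

At 90% voltage, for example, the lamp still draws about 85% of its power but gives only about 70% of its light.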

Regards

Willi

Reply to
Bremecker
