flash bulb firing current characterization?

Hi all,

Does anyone out there have no-fire/all-fire characterization data for flash bulbs, specifically AG1/AG1b?

I am trying to improve my launch controller design to be "flash bulb safe" but solid data on the above is proving elusive.

I found one mention on the web that "5 mA" is flash bulb safe, but when I did my own test it took around 220 mA to fire, so I suspect the 5 mA is way too conservative. OTOH, I don't know what the manufacturing variabilities are. I'd rather not burn up my stock of bulbs to find out if the data is already available somewhere.

TIA

Reply to
bit eimer

I doubt if anyone has measured a no-fire current on flash bulbs for a couple of reasons:

1) It consumes a lot of flash bulbs.

2) It requires a level of math skill that most do not possess. Hell, I still have no idea how to compute an MLE. :-)

The usual method of determining a no-fire or all-fire current is to assume that the variation between items has some random distribution (usually normal) and then try to estimate the mean and variance. The no-fire current level will then be the mean minus three times the standard deviation.

I suspect that the variance for flash bulbs will be pretty high, which will result in a very low no-fire current. Also keep in mind that this is a statistical result: some items will function at lower currents than the no-fire level, so add a safety margin.
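As a toy example of that arithmetic (the firing currents below are invented placeholders, not measurements; Python just for illustration):

```python
# Toy example of the mean-minus-three-sigma arithmetic described above.
# The firing currents are invented placeholders, not real measurements.
import statistics

firing_mA = [220, 195, 240, 210, 185, 230]  # hypothetical one-shot results
mu = statistics.mean(firing_mA)
sigma = statistics.stdev(firing_mA)         # sample standard deviation
no_fire = mu - 3 * sigma

print(f"mean = {mu:.0f} mA, sigma = {sigma:.0f} mA, no-fire = {no_fire:.0f} mA")
# A real design would apply a further guard band below this figure.
```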

If you really must know more, try:

formatting link
As near as I can tell, all of the various methods (Bruceton, Neyer) use the same Maximum Likelihood Estimate to compute the statistics. The variations are in how the exposure levels are determined.
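For the curious, the core of that MLE is small enough to sketch. This assumes scipy is available and uses made-up go/no-go data purely to show the shape of the computation:

```python
# Sketch of the Maximum Likelihood Estimate behind Bruceton/Neyer-style
# sensitivity tests: fit a normal response curve to go/no-go results.
# Requires numpy and scipy; the data points are made up for illustration.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

currents = np.array([150.0, 170.0, 190.0, 210.0, 230.0, 250.0])  # mA
fired    = np.array([0,     0,     1,     0,     1,     1])      # 1 = fired

def neg_log_likelihood(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    p = norm.cdf(currents, loc=mu, scale=sigma)  # P(fire) at each level
    p = np.clip(p, 1e-9, 1 - 1e-9)               # keep log() finite
    return -np.sum(fired * np.log(p) + (1 - fired) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[200.0, 20.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x
print(f"mu = {mu_hat:.0f} mA, sigma = {sigma_hat:.0f} mA, "
      f"no-fire estimate = {mu_hat - 3 * sigma_hat:.0f} mA")
```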


Reply to
David Schultz
1) I've heard that many flashbulbs fire reliably on 50 mA

2) Is there really any drawback to using a very low current? Why not be very conservative? Some systems use less than 1 mA.

Reply to
David

For continuity checks, my system is already flash bulb safe (0 mA). Am looking to see if I can measure actual resistance; to do that I need non-zero current. The more I use, the more precisely I can determine the resistance. But to safely maximize the current, I need to find the max no-fire current and then provide a guard band.

Am thinking of building a little test jig: start at 1 mA for 1 sec; 3 sec pause; increase by 1 mA; 3 sec pause; etc., until the bulb goes. Might take 20 mins to test a bulb, but at least it would be automated. I've got Sylvania AG1 and AG1b, GE AG1 and AG1b, and Westinghouse AG1b. Does anyone know of other AG1/AG1b manufacturers that I should try to get and test?
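Roughly, the loop would look something like this, with set_current_mA() and bulb_fired() standing in for whatever current source and fire detection the jig ends up using:

```python
# Rough sketch of the stepped-current jig: 1 mA for 1 s, 3 s pause,
# step up 1 mA, repeat until the bulb fires.
import time

def set_current_mA(level):   # stub: drive the programmable current source
    pass

def bulb_fired():            # stub: detect loss of continuity (bulb went open)
    return False

def ramp_test(max_mA=250):
    for level in range(1, max_mA + 1):
        set_current_mA(level)      # apply the test current...
        time.sleep(1.0)            # ...for one second
        set_current_mA(0)
        if bulb_fired():           # open circuit means it flashed
            return level           # record the level that fired it
        time.sleep(3.0)            # pause before the next step
    return None                    # bulb survived the whole ramp
```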

Reply to
bit eimer

0 mA test. Good trick. An impossible trick.

Perhaps you meant less than 1 mA?

This test protocol will likely give inaccurate results.

Exposing the flash bulb to a current that does not cause it to fire can/will cause changes to the bulb. These changes will alter the actual firing current.

This is why so much thought has been put into sensitivity tests. You expose each item once and then throw it away even if it didn't function.

Reply to
David Schultz

OK, true it is not 0.000 mA. The input leakage current spec is +/- 1 uA, so that would be 0.001 mA.

Reply to
bit eimer

I suspected as much after reading one of the papers in the link you provided. OTOH, until another source of the data appears, it looks like I'll be left to my own devices (or should I say DUTs :^)

And since I don't intend to sacrifice 100's of flash bulbs in the pursuit of perfect data, this was what I came up with. Alternatively, I might try to first roughly determine the distribution with the method I described, predict a reasonable max-no-fire current from that, and then test a few units against that prediction.

Can anyone think of a degradation mechanism that would cause a flash bulb to "build up resistance" to trigger current as a result of exposure to low levels of current? E.g., assume a virgin flash bulb would have triggered at 25 mA, but the test sequence starts at 1 mA and slowly ramps up, with the end result that the bulb flashes at 50 mA instead of the expected 25 mA.

If the degradation actually causes the bulb to flash earlier (i.e., at a lower current level), that just builds more margin into the max no-fire current level.

Reply to
bit eimer

To add a bit more to the 'equation', I strongly suspect that even within the same lot, bulbs will fire at +/- 10% of some nominal value. Lot to lot, it's probably more in the +/- 20% range. And given the age of the items, you could even be looking at +/- 40% (or more!) variations.

I'd chalk this one up to 'the law of diminishing returns', and use a step method to determine a 'rough' value for measuring the current. Then, I'd use about one-tenth of that as my 'safe' value (and even then, as others have indicated, I'd use the lowest value 'necessary'). It's not quite true that the more current, the more accurate your measurement; it may be true of your instruments, but it's not an across-the-board truth.

And as indicated, such information is frankly pointless -- it would only make sense if you were the original manufacturer trying to really characterize the bulbs for production purposes. For someone trying to use them many years later, the variation bulb-to-bulb will be so great as to make any highly accurate techniques pointless.

File this one under "required number of significant digits"...

David Erbas-White

Reply to
David Erbas-White

Thanks for the thoughts. You could well be correct and I may indeed give up on the idea of building in igniter-resistance measurement. That feature in itself may not have enough benefit to warrant the effort of characterization, let alone design.

OTOH, it sure seems like there are plenty of references to how sensitive flash bulbs are, that they require very little current, yet I still haven't found any documentation of what those levels really are. Some launch controller providers claim to be flash bulb safe, and I don't really doubt them, but it sure doesn't seem that they had any real basis for making that claim. It would be nice to at least get a handle on it.

Reply to
bit eimer

Well, as you pointed out earlier, it is a whole lot easier to determine continuity than it is to get an accurate reading of the resistance. It's not hard at all for an altimeter/timer to be flash-bulb safe; they use very low current to measure continuity.

If you really want to measure resistance, I'd use a low current, but use an op-amp or something to amplify the sense voltage so that you can get a good reading through an accurate, high-resolution A/D. I don't know why you want to test a bunch of flashbulbs. Even if you determine some range of firing current, you still don't know the relationship between the resistance and the likelihood of firing or the firing current. That relationship might be just as variable as the firing current itself, or even more so.
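For a rough feel of the numbers (all component values here are assumptions for illustration, not a recommendation):

```python
# Back-of-the-envelope numbers for the approach described above:
# force a small known current, amplify the sense voltage, read an ADC.
I_TEST = 1e-3          # 1 mA test current, well below any firing level
GAIN   = 1000          # op-amp gain (assumed)
VREF   = 2.5           # ADC reference voltage (assumed)
BITS   = 12            # ADC resolution (assumed)

def resistance_from_counts(counts):
    """Convert an ADC reading back to igniter resistance in ohms."""
    v_adc   = counts * VREF / (2 ** BITS)   # amplified sense voltage
    v_sense = v_adc / GAIN                  # voltage across the igniter
    return v_sense / I_TEST                 # Ohm's law

# Example: a 2-ohm bulb drops 2 mV at 1 mA; after x1000 gain that is
# 2 V at the ADC, i.e. about 3277 counts on a 12-bit, 2.5 V converter.
print(resistance_from_counts(3277))         # ~2.0 ohms
```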

Reply to
David

Hi David,

Here's what I do for the trick:

eliminate the test.

anytime I put current into an igniter, I intend it to fire!

why put current into something designed to fire on current when you don't want it to fire?

I know a few club systems built this way: no testing, just firing.

actually I see fewer misfires than on most systems with testing built in ;-)

Reply to
AlMax

No test is not equivalent to a 0 mA test.

"If you must compare apples and oranges, d>

Reply to
David Schultz

You miss my point?

I was wondering about your opinion on whether using no continuity test at all is the safest approach.

Reply to
AlMax

Would disagree when clustering, assuming one can do independent continuity verification for each igniter.

Reply to
bit eimer

Hi Bit.

well, a continuity test on the pad will read OK if only one igniter is working, since they are in parallel, so it does not help much.

test each e-match, dipped igniter, or even solar igniter with a Fluke BEFORE putting it in the motor.

(Don't try this in your motel rooms in the corn belt guys)

make sure your GSE can put out at least double the current your cluster setup needs, and you'll be fine

Reply to
AlMax

Even with no test, you still have leads back to the pad that can pick up RF and supply low levels of signal to the igniter.

Bob Kaplow NAR #18L

Reply to
Bob Kaplow

Hi AlMax,

I guess I wasn't quite as clear as I tried to be. What I meant with "assuming one can do independent continuity verification for each igniter" was that while the launch controller fires all the cluster igniters simultaneously (in parallel), they are not wired together in a way that lets one "good" igniter make them all appear good. It's like having multiple normal launch controllers, one for each igniter, each testing its own igniter's continuity, but set up so that pushing one launch button activates all the controllers at the same time.
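In pseudocode, with channel_has_continuity() standing in for the per-channel low-current sense circuit, the idea is something like:

```python
# Sketch of channel-independent continuity checking for a cluster:
# every igniter gets its own sense line, so one good igniter can't
# mask a bad one.

def channel_has_continuity(channel):   # stub: read the sense hardware here
    return True

def cluster_ready(num_channels):
    """Arm only if every igniter channel independently shows continuity."""
    bad = [ch for ch in range(num_channels) if not channel_has_continuity(ch)]
    if bad:
        print(f"No continuity on channel(s) {bad}; do not launch.")
        return False
    return True  # one launch button can now fire all channels together
```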

Reply to
bit eimer

Unless it's an RF system to begin with. :^)

Reply to
bit eimer

With flashbulbs, I completely agree with this. Why use a continuity test with flashbulbs? They either work or they don't. The main reason they don't work (95%+ of the time) is that they are no longer air-tight and the contents (magnesium, oxygen) have been contaminated. If that is the case, the contaminated bulb won't work, period. A bad flashbulb will measure about the same resistance as a good one.

The way to get around this is redundancy: use two flashbulbs in your firing circuit. Flashbulbs still aren't that hard to obtain and are inexpensive.

When I fire flashbulbs with my launcher, I hold down the "launch" button, do the countdown, and then insert the safety key to launch the rocket. This way, my launcher won't accidentally fire the flashbulb.

If you want specs:

AG-1 bulb resistance: 1.75 ohms

I always have a hard time firing flashbulbs with one "AA" battery (1.5 volts).

Two "AA" batteries (3 volts) fire flashbulbs very reliably.

Basically, the firing voltage supplies the energy required to get the Mg+O2 reaction started. Flashbulbs don't fire from the heat that the firing current causes inside the bulb. The firing voltage actually creates a spark inside the flashbulb that starts the Mg+O2 reaction. This is why a flashbulb cannot be regarded as a resistive load like other igniters.

I think there is a website that states that flashbulbs under test consume 2+ amps when firing. This is really misleading, as people might think that a flashbulb requires that much current to fire, which is far from the truth.
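A quick Ohm's law check with the numbers above shows where a figure like that comes from:

```python
# Quick Ohm's law check on the "2+ amps when firing" figure, using the
# resistance and battery numbers quoted above.
V_SUPPLY = 3.0   # two AA cells in series
R_BULB = 1.75    # AG-1 resistance, ohms

print(V_SUPPLY / R_BULB)  # ~1.7 A: what a stiff 3 V source pushes through
# That is the current a healthy supply delivers, not a minimum the bulb
# demands; per the spark mechanism above, far less can still fire it.
```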

As a side note, I love advertisements from companies that claim their launchers have fail-safe continuity checkers, in order to "avoid EMBARRASSING mis-fires at the pad"!

#1) If somebody gets that embarrassed with a pad misfire, then they probably should go to a shrink for some therapy.

#2) As a lot of rocketeers are somewhat overweight (including myself), a rocket mis-firing on the pad is probably beneficial, so the person can get a little more much-needed exercise!

Daniel

Reply to
dafalb2001

Hi Daniel,

Appreciate your response, but (as is typical of rmr threads) my original reason for starting this thread seems to have been lost in the flurry. :^)

It is/was not my goal to specifically test continuity of flash bulbs. Rather, I was thinking of modifying my launch controller design to be "flash bulb safe", not because I plan to use flash bulbs, but who knows? It is already designed to do channel-independent continuity checking, which, BTW, is not to prevent embarrassing mis-fires but to prevent potentially dangerous partial ignitions of a cluster.

So, I was simply trying to quantify the current required to fire a flash bulb, which nobody has yet been able to provide. More comments below.

Reply to
bit eimer
