I got my tortoise switches all working great. I even have a double crossover that operates from a single toggle. Now, I would like to make some dwarf yard signals to show how the switches are thrown.
I have wired colored LEDs to the switch and they seem to do OK.
Wondering about the proper resistor value for the lights. Currently I am using 1/2 watt 220 ohm resistors and it all seems to work fine. I want to be sure the LEDs don't burn out too soon - they're a little too bright.
Increase the resistor value to decrease the voltage across the LED, and thus the brightness. If you have one, a potentiometer works great for testing: dial in the brightness you want, measure its resistance after you disconnect it, and replace it with a fixed resistor of that value.
9 volts - 2 volts for the LED = 7 volts.
7 volts / 220 ohms = 32 mA through the LED = (for normal LEDs) a very short lifetime.
550 Ohms should be fine, that gives you 12-13 mA/LED and more than sufficient light.
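The arithmetic above can be sketched as a small helper (the 9 V supply and 2 V forward drop are the figures quoted in this thread; the function name is mine):

```python
def led_current_ma(supply_v, forward_v, resistor_ohms):
    """Current through the LED in mA: (Vsupply - Vforward) / R."""
    return (supply_v - forward_v) / resistor_ohms * 1000

# 9 V supply, 2 V LED drop, 220 ohm resistor: about 32 mA - too hot
print(round(led_current_ma(9, 2, 220)))     # -> 32
# 550 ohm brings it down to roughly 12-13 mA
print(round(led_current_ma(9, 2, 550), 1))  # -> 12.7
```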
No, it is the current through an LED that determines its brightness; the voltage across it is relatively constant, being a diode.
Typically, an LED will have a forward voltage drop of about one volt, and as a rule of thumb I would light it with 10 mA of current. You are likely safe from burning it out anywhere between 5 and 20 mA, though. You can experiment to see what brightness you get from a given LED at a given current.
To do this, take the voltage you are supplying and subtract 1. Take that value and divide it by 0.01 (multiply it by 100). That will give you the resistance needed to limit the current from that voltage supply to 10 mA.
For example, if the supply is 6 volts, then take 5 and multiply by 100 to get 500 ohms. The nearest resistor would be 470 ohms (or 510, if you have 5% tolerance resistors).
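The same rule of thumb in code, assuming the 1 V drop and 10 mA target from above (the helper name is made up here):

```python
def resistor_for_10ma(supply_v, forward_v=1.0):
    """Rule-of-thumb resistor for ~10 mA: (Vsupply - Vf) / 0.01 A,
    i.e. subtract the forward drop and multiply by 100."""
    return (supply_v - forward_v) * 100

print(resistor_for_10ma(6))   # -> 500 (nearest standard part: 470 or 510 ohms)
print(resistor_for_10ma(30))  # -> 2900 (2k7 is close)
```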
The only other thing to be aware of is the power rating of the resistor. It must be sized to be able to dissipate the power it will draw when the LED is lit. This is the square of the current multiplied by its resistance. In the case above: 0.01 * 0.01 * 470 gives you 0.047 watts, so a 1/4 watt resistor is more than ample.
Power the same LED off of a 30 volt supply and you get the following:
Resistor = (voltage - 1) * 100 = 2900 ohms (2k7 is close).
Power = 0.01075 * 0.01075 * 2700 = 0.311 watts.
So a 1/4 watt resistor is not good enough; a 1/2 watt is needed.
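The power check can be sketched the same way. Note this version uses the actual current through the chosen resistor rather than the nominal 10 mA, so the 6 V / 470 ohm case comes out slightly above the 0.047 W figure quoted earlier; either way it is well under 1/4 watt, and the conclusion for the 30 V case is the same:

```python
def resistor_power_w(supply_v, forward_v, resistor_ohms):
    """Resistor dissipation P = I^2 * R, with I = (Vsupply - Vf) / R."""
    i = (supply_v - forward_v) / resistor_ohms
    return i * i * resistor_ohms

# 6 V supply, 1 V drop, 470 ohm: well under 1/4 W
print(round(resistor_power_w(6, 1, 470), 3))    # -> 0.053
# 30 V supply, 1 V drop, 2700 ohm: about 0.31 W, needs a 1/2 W part
print(round(resistor_power_w(30, 1, 2700), 3))  # -> 0.311
```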
Voltage drop and current handling ability will vary by LED type.
The linked page has a pretty good breakdown of the various types/colors and their ratings.
And when I said I generally use 5 - 12VDC with resistors in the 330 - 470 ohm range for LEDs, maybe I should have clarified a bit. I use 330 ohm with 5 - 8VDC, 470 ohm for 9 - 12VDC. I'm using older LEDs from a bulk pack I bought years ago that have a max current rating of 50 mA. None have burned up on me yet.
I was giving a "rule of thumb" example, so for the most part, whether one assumes a 1 or 2 volt drop makes little difference as far as pushing the current limit of the LED or the power limit of the resistor.
If one wants to operate that close to the limits of the devices, then more accurate calculations using more accurate values for the parts in question will be needed. Generally, though, rule-of-thumb procedures have enough margin built in.