"E> "KTØT" wrote in message
"E> news:QcRQb.109029$ snipped-for-privacy@twister.rdc-kc.rr.com...
"E> > : I tried your program with a 12volt input, 3.5 LED voltage, and 3ma. The
"E> > : result was 2833 ohms, which can't possibly be right. The same LED came
"E> > : a 470 ohm resistor...
"E> >
"E> > I don't think you would get much light out of a LED operating at 3 ma. The
"E> > calculation is correct BTW for 3 ma. If I check the current for the 470
"E> ohm
"E> > resistor in the same circuit it is a more realistic 18 ma. Now you may see
"E> > some light.
Not if the LED really is a low-power device!
"E> >
"E> > I'm using Rob Paisley's calculator at:
"E> >
formatting link
"E>
"E> OK - I tried this page and entered 12 volt input, 8.5 volt drop and 18ma
3.5 is what you want!
"E> current. I get 194 ohms and almost 3/4 watt. Still doesn't seem right. I
"E> picked up a supply of 1/4 and 1/8 watt resistors varying between 470 and 740
"E> ohms. Nothing here would seem to work...
The form at
formatting link
might be a
little confusing to people who are not up on electric circuit design
and the terminology used. The 'Voltage Drop Across LED' is the LED's
rated voltage, not the voltage drop across the resistor! It is common
to think of the resistor as a voltage dropping resistor, since that is
what it is really doing. Actually, both the resistor and the LED drop
voltage -- both are voltage dropping devices! Now I know you are
totally confused! The idea is that each of them drops a *part* of the
12 volt supply. The trick is to get them to drop the proper share.
How much voltage a resistor will drop is a function of how much current
is flowing. This is where Ohm's Law comes in. A resistor will drop a
voltage (V) equal to the product of its resistance (R) and the current
(I) flowing through it:
V = IR (1)
LEDs don't have the same flexibility. They will drop a specific voltage
across their junctions, irrespective of the current. Extra current just
makes the semiconductor hotter. Too much current and the semiconductor
toasts. Ouch! So we need to adjust the resistor to 'balance' the load
at a specific current. The first step is to determine the shares. We
know the supply voltage (12) and the drop across the LED (3.5). The
rest is dropped by the resistor: 12 - 3.5 = 8.5 volts.
We also know the desired current (the current rating of the LED), which
is said to be 3 mA (.003 A). Rearranging (1) above to solve for R gives:
V
R = - (2a)
I
8.5
R = ---- (2b)
.003
R = 2833.3333 (2c)
So we want a 2833.3333 ohm resistor (or thereabouts).
The power (P), in watts, dissipated by the resistor (or any load) is:
P = IV (3a)
P = .003 * 8.5 (3b)
P = .0255 (3c)
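The arithmetic above can be sketched as a few lines of Python (the
values are the ones from this thread; the variable names are my own):

```python
# Resistor value and power dissipation for a series LED resistor,
# using the figures from the post: 12 V supply, 3.5 V LED, 3 mA.
V_SUPPLY = 12.0   # supply voltage (V)
V_LED = 3.5       # rated voltage drop across the LED (V)
I_LED = 0.003     # desired LED current (A)

v_resistor = V_SUPPLY - V_LED   # share dropped by the resistor: 8.5 V
r = v_resistor / I_LED          # Ohm's Law rearranged: R = V / I
p = I_LED * v_resistor          # power dissipated: P = I * V

print(round(r, 1))   # about 2833.3 ohms
print(round(p, 4))   # about 0.0255 watts
```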
So, what you want for a 3.5 V, 3 mA LED on a 12 volt supply is a
2833 ohm resistor. The closest stock 5% resistor will be 3K
(orange-black-red-gold). 1/8 (.125) Watt is more than enough, since the
power dissipated is small. Using a *larger* value resistor will
*reduce* the current slightly. It is generally better to run the LED at
a slightly *lower* current than at a slightly *higher* current. If we
'back solve' for current:
V
I = - (4a) (rearranged from (1))
R
8.5
I = ---- (4b)
3000
I = .0028 A (2.8 mA) (4c)
With a 3K resistor the LED will be running at 2.8 mA, which is within
its tolerance and close to its max. It will be ever so slightly dimmer
than its maximum brightness. Better than ever so briefly brighter than
its maximum brightness :-(. And a 3K 1/8W 5% carbon resistor is ever so
much cheaper than several 1% resistors...
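The back-solve step works out the same way in a short sketch (again
using the 8.5 V drop and the 3K stock value from the post):

```python
# Back-solving the actual LED current once the nearest stock
# resistor value (3000 ohms) has been chosen.
V_RESISTOR = 8.5   # volts dropped across the resistor (12 - 3.5)
R_STOCK = 3000.0   # nearest stock 5% resistor value (ohms)

i = V_RESISTOR / R_STOCK   # Ohm's Law rearranged: I = V / R

print(round(i * 1000, 2))  # current in mA, about 2.83 mA
```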
Spec. sheets for LEDs (and for most *passive* devices) rate their
devices in 'Volts' -- this really means 'Voltage Drop Across device'.
Since only 'active' devices (like batteries and power supplies) can
produce a voltage, the voltage rating for a passive device always means
the '[Maximum or Rated] Voltage Drop Across' it, or else refers to the
dielectric strength -- the measure of how many volts of electrical
pressure it takes to 'jump the gap' (typically this relates to
capacitors). This is universally understood by electric circuit
designers. The voltage for any 'load' type device is its voltage drop.
Some devices don't have a specific voltage drop (resistors are like
this); other devices have a maximum or an operating voltage drop (LEDs
are like this).
If you used too small a resistor *once*, you could have toasted the LED
-- using the right one later will be no help. Unlike an incandescent
lamp, an LED will burn out in milliseconds -- far too brief for you to
see the flash of burning semiconductor... *Always* try the largest
value resistor first and only start using lower valued ones if the
larger values fail to give the desired results. Also, LEDs are
polarized. Swapping + and - can make a difference in whether or not the
LED lights up. Wiring it backwards with any random resistor value will
produce no light at all.
"E>
"E>
"E>
\/
Robert Heller ||InterNet: snipped-for-privacy@cs.umass.edu
formatting link
|| snipped-for-privacy@deepsoft.com
formatting link
/\FidoNet: 1:321/153