I got a bunch of red LEDs from FrozenCPU and have been having problems with them. The information I got from FrozenCPU was that these LEDs use a maximum of 3V, and when used alone on a 12V rail I needed a 150 ohm resistor. Using V = IR, I figured I could run 4 LEDs on 12V without a resistor. I hooked up four of the LEDs in series and they turned on for about 30 seconds and then died. I thought it was a solder/connection problem, but I tried again today with four totally different LEDs and the same thing happened. (Of note, one of the four was a brighter green LED.) As far as I can tell, there must be too much voltage going across each LED, burning it out. I don't think it would be current, because they'd only pull as much as they needed from the PSU (correct me if I'm wrong). Basically, I'm trying to figure out whether my assumption about overvoltage is correct before I go and buy a bunch more LEDs (or whether I did something else wrong). Thanks, Astaroth
You cannot use LEDs without a current regulator (a resistor or an active current limiter). I'd rather not explain why... it's a long explanation, and the stock one I use is in Swedish...
However, I will explain it. As you correctly figured out, putting too much voltage across a diode is a bad thing. Too much current, though, kills just as readily as too much voltage, if not more so. When you put the 4 diodes in series directly across the power supply, there was essentially nothing to limit the current through them, just the resistance of the diodes themselves, which is minute. It's not that your diodes are "bad"; they were simply operating outside their normal conditions. If you want to verify this, get a multimeter, set it to measure current, and place it in series with the LEDs. I'd say you'd get a number probably in the 50-100mA range.
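If it helps, here's a rough back-of-the-envelope sketch in Python. I'm assuming a ~2.2V nominal drop per LED (not a figure from FrozenCPU's specs) and treating the leftover series resistance as a small fixed value; real LEDs have an exponential I-V curve, so this linear estimate actually understates how fast the current climbs:

```python
# Rough sketch: current through 4 LEDs in series on a 12V rail, limited
# only by whatever small series resistance is left in the circuit.
# Assumes ~2.2V nominal forward drop per red LED (an assumption, not
# vendor data); real LEDs have an exponential I-V curve, so the runaway
# is even worse than this linear estimate suggests.

V_SUPPLY = 12.0   # PSU rail voltage
V_LED = 2.2       # assumed nominal forward drop per LED
N_LEDS = 4

headroom = V_SUPPLY - N_LEDS * V_LED  # voltage left over after the LEDs

for r_series in (150.0, 10.0, 1.0):   # total series resistance in ohms
    current_ma = headroom / r_series * 1000
    print(f"{r_series:6.1f} ohm -> {current_ma:7.1f} mA")

# 150.0 ohm ->    21.3 mA  (safe)
#  10.0 ohm ->   320.0 mA  (way over spec)
#   1.0 ohm ->  3200.0 mA  (LEDs die in seconds)
```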
Because I don't need to limit the voltage (4 in series should be 12v), would I use the smallest resistor I can find? Like a 10 ohm 1/4 or 1/2 watt resistor? I would only need one resistor in the circuit to limit the current, correct? Thanks, Astaroth
The resistor has to drop at least some voltage. If the LEDs drop 2.5V each (the minimum voltage stated for 5mm reds on FrozenCPU's site) and we assume a 15mA target current, the equation is:

(supply voltage - sum of LED voltages) / desired current = resistor value
(12 - 4 x 2.5) / 0.015 = 133.33 ohms

The nearest standard E12 resistor value is 150 ohms. The power dissipated in the resistor is fairly low ((12 - 4 x 2.5) x 0.015 = 30mW), so a 1/4 watt resistor will do fine.
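Here's the same arithmetic as a small Python sketch, using the 2.5V drop and 15mA target assumed above (the E12 rounding helper is just for illustration):

```python
import math

# Series-resistor math for LEDs on a fixed rail. The numbers match the
# worked example above: 12V supply, four LEDs at 2.5V each, 15mA target.

E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def series_resistor(v_supply, led_drops, i_amps):
    """Return (ideal resistance in ohms, resistor dissipation in watts)."""
    headroom = v_supply - sum(led_drops)
    return headroom / i_amps, headroom * i_amps

def next_e12_up(r_ohms):
    """Round up to the nearest E12 preferred value."""
    decade = 10 ** math.floor(math.log10(r_ohms))
    for v in E12:
        if v * decade >= r_ohms:
            return v * decade
    return E12[0] * decade * 10

r_ideal, p_watts = series_resistor(12.0, [2.5] * 4, 0.015)
print(f"ideal: {r_ideal:.1f} ohm, use {next_e12_up(r_ideal):.0f} ohm, "
      f"dissipating {p_watts * 1000:.0f} mW")
# ideal: 133.3 ohm, use 150 ohm, dissipating 30 mW
```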
There's basically no such thing as a 3V red LED. Chances are the typical forward voltage of yours at around 20-25mA is about 2.2V; use that in your calculations. Then measure the actual voltage drop across the LEDs, and if it's much different you can adjust the resistor to suit. But it won't be much different. Use the 150R resistor hitman012 first suggested, with 4 LEDs in series on 12V. If I'm right you'll get 21mA; if he's right, a bit less. Either way the LEDs will be fine. 120R will give 27mA, which is as high as I'd go with LEDs of unknown quality. 82R is a bit risky IMO, since it could end up putting 40mA through the LEDs. As for resistor power, it isn't based on the full 12V, just 12V minus the combined LED voltages, 4V tops, so a 1/4W will do.
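Same numbers in a quick Python loop, assuming the 2.2V drop per LED I mentioned (measure your own and adjust):

```python
# Currents for the candidate resistors, assuming a 2.2V drop per LED.
V_SUPPLY, V_LED, N_LEDS = 12.0, 2.2, 4

for r_ohms in (150, 120, 82):
    i_ma = (V_SUPPLY - N_LEDS * V_LED) / r_ohms * 1000
    print(f"{r_ohms} ohm -> {i_ma:.0f} mA")

# 150 ohm -> 21 mA, 120 ohm -> 27 mA, 82 ohm -> 39 mA
```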
Thanks guys, 150 ohms worked like a charm! I'm about to run out of the house again, so I'll be going over the math later to make sure I can do this myself next time. Astaroth
Electronics math strains my brain, so I just go here and take the easy way: http://www.casemodgod.com/led_calculator.htm