I know this is probably a stupid question, but I'd appreciate any clarification anyone could give me. Let's say I have a 10K potentiometer (using only two legs, as a variable resistor) hooked up to the anode of an LED whose cathode goes to ground, like so. Let's say the supply is 5v and the LED has a voltage drop of 2v. Am I correct to assume that the pot, at its lowest resistance, will need to drop 5v and not 3v? It seems like that would be the case, but again I only have the slightest notion of what I'm talking about. From the various books I've been reading it seems like it doesn't matter where the resistance is in the circuit, but it seems like it would here. Thanks, -Lee
A. No, at most it will drop 3v; the LED will take care of the other 2v. This isn't really all that important, though. The important thing is current, which is what kills LEDs (if you had a voltage source that exactly matched the diode's forward voltage drop, it'd still fry, as there's nothing to limit the amount of current going through it). B. It doesn't matter whether the pot is in series with the LED's anode or cathode, just as long as it is there.
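To make the current point concrete, here's a quick Python sketch of the series-resistor sum, using the 5v supply and 2v LED drop from the question (the function name is just for illustration):

```python
# Values assumed from the question in this thread.
SUPPLY_V = 5.0   # supply voltage
LED_VF = 2.0     # LED forward voltage drop

def led_current(series_resistance_ohms):
    """Current through the LED by Ohm's law: the resistor sees
    (supply - forward drop) volts across it."""
    return (SUPPLY_V - LED_VF) / series_resistance_ohms

# 150 ohms limits the LED to 20mA.
print(led_current(150))  # 0.02 A = 20 mA
```

Note that as the series resistance goes to zero the current shoots up without bound, which is exactly why a bare LED across a supply fries.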
Well, in the actual circuit there is already resistance to limit the current to the LEDs. What I'm trying to do is control the overall brightness with a pot. I know this isn't optimal, and as soon as I get some more time I will redesign for PWM. I'm trying to figure out how much the pot will have to dissipate when it's turned down to its lowest resistance. I know 1/4W will work, as that's what I have now, but I want to replace them with smaller ones that happen to be 1/10W. Trying to divine whether they will smoke or not. Thanks, -Lee
You need a fixed resistor in series with the LED & pot so that when the pot is turned right down the current is still limited. If you put a 100R resistor in, the LED current will be limited to 20mA (2v/100R). Turn the pot up full and the current will drop to 2V/1100R = 1.8mA. The dimming effect will look best with a log (audio taper) pot.

Now, wattage. It's current that smokes pots, same as LEDs. The pot coating has to carry a bit over 19mA at worst, so with a 1k pot it needs IxIxR = 0.019 x 0.019 x 1000 = 0.36W if you want to use the full track length, so your 1/4W has already been overclocked a bit... Use a bigger fixed resistor if you want to use 1/10W pots. You need to limit the maximum current to 10mA, so a 200R or 220R fixed resistor would be OK; the LED will still be lit at 0.8mA.
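The IxIxR wattage sum above, as a Python sketch (the helper name is mine; values assumed from the post: ~19mA worst-case track current, 1k pot):

```python
def track_power(current_a, track_resistance_ohms):
    """Power the pot track dissipates if the full track
    carries this current: P = I^2 * R."""
    return current_a ** 2 * track_resistance_ohms

# 19 mA through a full 1k track:
print(round(track_power(0.019, 1000), 3))  # ~0.361 W
```

That 0.36W figure is what puts a 1/4W pot over its rating, and a 1/10W one well over.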
Thanks for the help. I must admit that I'm slightly more confused than before, though. Wouldn't you calculate the resistance with the (supply voltage - the voltage drop)? In this case 5V - 2v, so 3v/1000? What I actually have is a 150 ohm resistor to deliver 20mA from the 5v supply for a single LED. I came to this from 3 / 0.020 = 150 ohm. Actually the dimmer is feeding up to 6 strings of LEDs. Each string can have 1 to 6 LEDs in parallel. So taking a strand that has 4 LEDs in it, which would normally draw 80mA, does this mean (0.079 * 0.079 * 1000 = 6.24W?), and if so, I'm probably pretty lucky they haven't smoked. Seeing they haven't, I'm guessing I'm understanding something rather important incorrectly. Actually, the pots I have are 10K ohm. So even if it is only drawing 20mA, wouldn't that mean (0.019 * 0.019 * 10000) = 3.6W??? Any enlightenment would be greatly appreciated. Thanks, -Lee
My bad. Might well be me. This is where theory and practice drift apart. Thinking about it: if you move the pot only 1% of the travel, you've an extra 100R in the circuit, so the current is down from 20mA to around 12mA. At 2% travel it's only 8.6mA. I'm guessing if you moved the pots very, very delicately up to the lowest-resistance end, the coating would overheat there, but get a bit away from that end and you're safe.
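Here's that travel sweep as a Python sketch, assuming the thread's values: 5v supply, 2v LED drop, the 150R fixed resistor already in circuit, and a 10k linear pot (function names are mine):

```python
# Values assumed from this thread.
SUPPLY_V, LED_VF = 5.0, 2.0
FIXED_R, POT_R = 150.0, 10_000.0

def at_travel(fraction):
    """Return (current in A, power in W dissipated in the used
    portion of the pot track) at a given fraction of pot travel."""
    pot_part = POT_R * fraction
    i = (SUPPLY_V - LED_VF) / (FIXED_R + pot_part)
    return i, i * i * pot_part

for pct in (0.0, 0.01, 0.02, 0.10, 1.0):
    i, p = at_travel(pct)
    print(f"{pct:4.0%} travel: {i * 1000:5.1f} mA, {p * 1000:6.1f} mW in pot")
```

This reproduces the 20mA, ~12mA, and ~8.6mA figures above, and shows the dissipation in the used part of the track staying in the tens-of-milliwatts range. One caveat: a simple lumped model like this doesn't capture the localized hot spot right at the wiper when only a sliver of track is in use, which is the overheating case described above.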