Is there any way to find out how much voltage LEDs need without raising the voltage until they blow out? I'd really appreciate any feedback...
Most standard LEDs I've seen are meant to be used at 20mA, so I guess you could hook one up with a resistor to a variable source (or a fixed source with a potentiometer), watch the current through the circuit with a multimeter, and once it reaches 20mA, remove the ammeter and check the voltage across the LED's leads with a voltmeter... This is how I would do it.
The voltage drop across the LED is part of the LED's specification, and it can be assumed to be constant and independent of the current (unless, of course, you connect the LED directly to a voltage source without a resistor). This is called the forward voltage. The reverse voltage is the maximum voltage the LED can withstand when biased in the opposite direction (for example, when you connect the LED the wrong way round, swapping the terminals). You know that since an LED is a diode, it does not allow current to flow in that direction unless the reverse voltage is exceeded. If you want to know the LED's voltage drop, just connect it to a 5V source with a 200 ohm resistor in series. You can try different resistor values, and you will see how the LED voltage stays roughly constant while the current changes. The resistor's voltage drop changes according to the current, so that the LED voltage remains almost the same. You can experiment and tell us your results.
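If you want to predict the current before wiring anything up, Ohm's law on the resistor gives I = (Vsupply - Vled) / R. A minimal sketch, assuming the 5V supply and 200 ohm resistor from the post above and a nominal 2V red LED (the LED value is just illustrative):

```python
# Series resistor current estimate: the resistor drops whatever the LED doesn't.
# Assumed values for illustration; substitute your own supply, resistor and LED.
V_SUPPLY = 5.0   # volts
V_LED    = 2.0   # assumed constant forward voltage (typical red LED)
R_SERIES = 200.0 # ohms

current_a = (V_SUPPLY - V_LED) / R_SERIES
print(f"LED current: {current_a * 1000:.1f} mA")  # -> 15.0 mA
```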
I'd have to disagree with that. The forward voltage drop, as with any kind of diode, will change depending on the amount of current flowing through it...
You're right, of course, but the key words are "it can be assumed to be constant and independent of the current". When you're using stock E12 or E24 resistor values to limit the current, it doesn't make a great deal of difference mathematically whether you assume a constant 3.6V for a blue or white, or 2.1V for most other colours. The graph does show a couple of points to remember: first, when you're testing for forward voltage drop with a multimeter on the "diode test" position, the reading comes up low because the meter's test current is very low; and second, the forward current rises rapidly with a very small change in voltage, so a series resistor is almost always needed.
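To make the stock E12 point concrete, here is a rough sketch that computes the ideal resistor for a target current and then rounds up to the next E12 value (rounding up keeps the current at or below the target). The 5V supply, 2.1V yellow LED and 20mA target are my assumptions, not from the post:

```python
# Pick the next E12 resistor at or above the ideal value, so the LED
# current stays at or below the target. Values are illustrative.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12(r_ideal):
    """Smallest E12 resistor >= r_ideal (r_ideal in ohms, > 0)."""
    decade = 1.0
    while decade * E12[-1] < r_ideal:
        decade *= 10.0
    for base in E12:
        if base * decade >= r_ideal:
            return base * decade

v_supply, v_led, i_target = 5.0, 2.1, 0.020  # assumed: 5V supply, yellow LED, 20mA
r_ideal = (v_supply - v_led) / i_target      # 145 ohms
r_chosen = next_e12(r_ideal)                 # 150 ohms (nearest E12 above)
print(f"ideal {r_ideal:.0f} ohm -> use {r_chosen:.0f} ohm, "
      f"giving {(v_supply - v_led) / r_chosen * 1000:.1f} mA")  # -> 19.3 mA
```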
Maybe I should read the posts more carefully. Like cpemma said, it's safe to assume some standard characteristics for most LEDs: most GaN or InGaN type LEDs (i.e. white, blue, pink, UV, pure green, turquoise...) will be happy at 3.2 - 3.7V, ~20mA. Standard green (~568nm types) and standard yellow LEDs will normally be happy at 2.1V, ~20mA. High-brightness red types at 2V, ~20mA.
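Those rules of thumb fit naturally into a small lookup table. A sketch using the forward voltages from the post above (the colour keys and the 3.4V midpoint for the GaN/InGaN range are my choices):

```python
# Typical forward voltages at ~20mA, per the rules of thumb above.
TYPICAL_VF = {
    "white": 3.4, "blue": 3.4, "pink": 3.4, "uv": 3.4,
    "pure_green": 3.4, "turquoise": 3.4,   # GaN/InGaN types, 3.2-3.7V range
    "green": 2.1, "yellow": 2.1,           # standard ~568nm green, yellow
    "red": 2.0,                            # high-brightness red
}

def series_resistor(colour, v_supply, i_target=0.020):
    """Ideal series resistor in ohms for a typical LED of this colour."""
    return (v_supply - TYPICAL_VF[colour]) / i_target

print(f"blue LED on 5V: {series_resistor('blue', 5.0):.0f} ohm")  # -> 80 ohm
```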
I think I did not express my ideas very clearly, so here we go again... OK, you are right, but... when it comes to designing a circuit you can assume a constant voltage drop across the LED. Take, for example, the figures in the diode characteristic you attached:
- We can expect a usual value for the current (around 10 to 30 mA).
- In the graph, the associated voltage drops for those currents are 3.7 to 3.9 V.
- Hence, we can take the forward voltage of the LED to be 3.8 V (maximum error: ±0.1V). This error is acceptable for an LED design.
I am saying all this for the electronics beginners. Of course, it is important to know the exact behaviour of electronic components; if you know it, then you know when you can simplify things.
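One way to check whether that ±0.1V error really is acceptable: recompute the current at both ends of the forward-voltage range. A sketch using the 3.7 to 3.9 V figures above; the 5V supply and 62 ohm resistor are my own illustrative assumptions:

```python
# How much does a +/-0.1V error in the assumed forward voltage matter?
# Assumed 5V supply and 62 ohm resistor (illustrative, not from the thread).
V_SUPPLY, R = 5.0, 62.0

for v_led in (3.7, 3.8, 3.9):
    i_ma = (V_SUPPLY - v_led) / R * 1000
    print(f"Vf = {v_led} V -> {i_ma:.1f} mA")
# Vf = 3.7 V -> 21.0 mA
# Vf = 3.8 V -> 19.4 mA
# Vf = 3.9 V -> 17.7 mA
```

So the ±0.1V uncertainty moves the current by under 2mA here, which supports the point that the constant-voltage assumption is fine for this kind of design.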