# LEDs (and diodes?) have CONSTANT voltage?

Discussion in 'LEDs and Optoelectronics' started by NuLED, Jun 23, 2013.

1. ### NuLED

294
0
Jan 7, 2012
Unlike resistors, my measurements indicate that diodes have a fixed voltage level across.

I take it this is the Forward Voltage? (And anything from about 0.7 V up to the Forward Voltage will light up the LED, at varying levels of brightness, correct?)

So with this fixed voltage drop, if I have a resistor in series, the resistor seems to take the "rest" of the voltage (input voltage minus whatever drop is used by the diode) which can fluctuate (and the voltage across the LED will not fluctuate, but the voltage across the resistor will).

So, for example, LED takes 2 volts. If I stick in 9 volts, the resistor takes the rest, which is 9 - 2 = 7 volts.

If I put in 12 volts, the resistor takes in 10 volts (12 - 2 = 10).

Is this correct, and is this a phenomenon of all semiconductor materials?

(contrasted against resistors, which divide the voltage according to their Ohm ratios between them).

All in series described above.

2. ### BobK

7,682
1,688
Jan 5, 2010
No. A diode or LED has a current that is exponential in the voltage. Look at the IV curve for an LED. The Vf of an LED is the voltage at which it will pass the rated current. You can always place more voltage across it, but you will probably destroy it. The resistor is sized to drop exactly the difference between the supply voltage and the LED forward voltage. And increasing the input voltage will increase the voltage across and current through the LED.

http://en.wikipedia.org/wiki/File:Diode-IV-Curve.svg

You can see from the curve that the current will increase rapidly with an increase in voltage across the diode. The resistor in the circuit fights this tendency because, as the current increases, it drops more voltage, providing negative feedback against changes in current. But it is far from a constant current (or constant voltage) source.

Bob
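Bob's exponential I-V point can be sketched numerically. This is a toy use of the Shockley diode equation with invented constants (saturation current, ideality factor, thermal voltage), not the curve of any real LED; the saturation current is picked so the 2.0 V current lands near a typical 20 mA rating:

```python
import math

# Shockley diode equation, I = Is * (exp(V / (n * Vt)) - 1).
# All constants are illustrative, not from a datasheet.
IS = 8.5e-20   # saturation current (A), assumed
N = 2.0        # ideality factor, assumed
VT = 0.025     # thermal voltage ~kT/q at room temperature (V)

def diode_current(v):
    """Current (A) through the diode at forward voltage v (V)."""
    return IS * (math.exp(v / (N * VT)) - 1.0)

i_a = diode_current(2.00)
i_b = diode_current(2.10)
print(f"I(2.00 V) = {i_a * 1000:.1f} mA")
print(f"I(2.10 V) = {i_b * 1000:.1f} mA")
print(f"ratio = {i_b / i_a:.1f}x")  # a 0.1 V step multiplies the current ~7x here
```

Run the other way, doubling the current only moves the voltage by n·Vt·ln 2 ≈ 35 mV with these constants, which is exactly the "2.0 V might become 2.1 V" behavior Bob describes below.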

3. ### NuLED

294
0
Jan 7, 2012
Hi Bob - thanks for the explanation and the graph. The graph makes complete sense to me and visually confirms my understanding, but I am puzzled because on the breadboard it seems that I could never get the LED voltage higher than 2 volts (for the particular LEDs I was testing), even when I changed Vcc and the resistors. I didn't measure current; I would assume the current goes higher or lower, and I understand that if the current exceeds the rated max for the LED it would get damaged.

It's just the fact that on my DMM it seems to show the voltage not varying above the 2 volt level even as I vary Vcc and resistance. Maybe I am not going high enough on the Vcc or low enough on the Ohmage - I will try to measure current as well the next time I attempt the experiment and see what numbers I get. Maybe I will try to destroy an LED to see. (Unfortunately I do not have the datasheet for a bunch of LEDs I got but all of them seemed to drop only 2 volts).

(On the graph, it seems to show current does not vary much if the voltage is within the specifications; I wish I had a very finely tunable resistance but I only have some cheap pots. No decade box.)

4. ### BobK

7,682
1,688
Jan 5, 2010
The graph shows that the current varies greatly with small changes in voltage. So if you double the current though the LED from its operating point, the voltage might go from 2.0V to 2.1V, or something like that. This is why we don't try to drive LEDs with a voltage, but instead use a resistor to control the current.

Using a voltage much higher than Vf and a resistor to limit the rated current is actually a poor man's constant current source.

Bob
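Bob's "poor man's constant current source" shows up directly in the numbers. A minimal sketch, assuming a nominal 2.0 V LED with a ±0.1 V unit-to-unit Vf spread (both figures invented for illustration):

```python
# With a series resistor, I = (Vcc - Vf) / R. The larger the headroom
# (Vcc - Vf), the less the current cares about the exact Vf.

def led_current(vcc, vf, r):
    """Current (A) set by the resistor for a given LED forward voltage."""
    return (vcc - vf) / r

# Both circuits target 20 mA at the nominal Vf = 2.0 V.
for vcc, r in [(3.0, 50.0), (12.0, 500.0)]:
    i_lo = led_current(vcc, 2.1, r)   # a high-Vf unit
    i_hi = led_current(vcc, 1.9, r)   # a low-Vf unit
    spread = 100.0 * (i_hi - i_lo) / 0.020
    print(f"Vcc={vcc:4.1f} V, R={r:5.1f} ohm: "
          f"{i_lo * 1000:.1f} to {i_hi * 1000:.1f} mA ({spread:.0f}% spread)")
```

With only 3 V of supply the same Vf spread swings the current by 20%; from 12 V it is a 2% swing, which is why a big voltage plus a big resistor behaves almost like a current source.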

5. ### NuLED

294
0
Jan 7, 2012
OK I found a small pack of LEDs with some specs on it.

Let's see if I understand this correctly.

The specs say 3.0 VDC, 20 mA - 40 mcd

(I don't know what the mcd means).

So E = I x R

If we use 12 V DC as the input voltage

12 = 0.02 amps x R

R = 600 Ohms

So if I have 600 Ohms of total resistance in the 12 V circuit, I should have the right current through everything (including the LED) to not fry it.

But what is the resistance of the LED so that I can subtract that from the 600 Ohms and get the resistor needed to take up the rest of the voltage?

6. ### NuLED

294
0
Jan 7, 2012
(OK, I googled a bit and now know the 40 mcd is millicandela, a brightness rating.)

7. ### duke37

5,364
771
Jan 9, 2011
You were close in your original explanation. As Bob says, the voltage across the LED will change little with a change of current, so it cannot be treated as a resistance.

With 12V supply and 3V or so across the led, the resistor has to drop about 9V.

8. ### NuLED

294
0
Jan 7, 2012
So in that case, 9V giving 20 mA means the resistor needs to be at 450 Ohms.

E = I x R
9 = 0.02 x R
R = 9 / 0.02
= 450

Is that correct? (Input voltage minus LED voltage requirement, then find resistance based on max current of LED specifications).
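The arithmetic above, as a small helper. The E12 preferred-value series used here is standard; everything else is this thread's own numbers:

```python
# Size the series resistor: R = (Vcc - Vf) / I_max, then round UP to the
# next E12 standard value for a safety margin.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def led_resistor(vcc, vf, i_max):
    """Exact resistance (ohms) that limits the LED to i_max amps."""
    return (vcc - vf) / i_max

def next_e12_up(r):
    """Smallest E12 standard resistor value >= r."""
    decade = 1.0
    while decade * E12[-1] < r:
        decade *= 10.0
    for v in E12:
        if v * decade >= r:
            return v * decade
    return E12[0] * decade * 10.0

r = led_resistor(12.0, 3.0, 0.020)
print(r)                # 450.0 ohms exact
print(next_e12_up(r))   # 470.0 ohms, the next standard value up
```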

9. ### BobK

7,682
1,688
Jan 5, 2010
You got it! 470 is a standard value, so you would normally use 470, which also gives you a little safety margin.

Bob

10. ### NuLED

294
0
Jan 7, 2012
Great! Thanks for all your help.

A follow-on question:

For 12V, if I have 4 of these 3V LEDs in series, do I still need resistors? They drop a net of 4 x 3 = 12V across all of them.

11. ### Raven Luni

798
8
Oct 15, 2011
You always need resistors. That 'remaining' voltage, even if close to zero (it can never actually be zero), still has a current associated with it, and that current will still be very large unless there is a resistor to limit it.

12. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

25,496
2,837
Jan 21, 2010

13. ### NuLED

294
0
Jan 7, 2012
OK thanks Steve. I'd totally forgotten about that so thanks for referencing it again. Maybe this time around I will understand it more (I'd read it a long time ago once).

14. ### NuLED

294
0
Jan 7, 2012
Thanks guys - I did more breadboarding and the current does stabilize after a while, going up as the LED heats up, but then plateaus.

I varied the resistance slightly and the current varies accordingly.

15. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

25,496
2,837
Jan 21, 2010
The problem is, with thermal runaway, the current doesn't stabilize.

16. ### gorgon

603
24
Jun 6, 2011
This depends on your supply and the max current for the LEDs. First, if your 12 V is less than the total Vf, the LEDs will not light at all. If the 12 V is a little bit higher than the total Vf, the LEDs will shine brightly, and then it depends on how much current the supply can deliver and how stable its output is. If the current in the LEDs is higher than the max limit, the LEDs will heat up. One of the properties of LEDs is their negative temperature coefficient: the Vf of an LED falls as the LED heats up. The result is the possibility of thermal runaway, ending in a burning LED.

So in short, it's a gamble. It may work out OK many times, and the next time the magic smoke is rising.
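gorgon's runaway mechanism can be sketched as a toy feedback loop. Every constant here is an assumption for illustration (an exponential diode model, a 2 mV/°C tempco, 300 °C/W thermal resistance), not careful LED physics:

```python
import math

# Toy model: I = I0 * exp((V - Vf(T)) / VS), Vf falls as the junction heats,
# and the steady-state temperature follows T = T_AMB + R_TH * P.
I0, VS = 0.020, 0.05       # 20 mA at V == Vf, exponential slope (assumed)
VF25 = 3.0                 # forward voltage at 25 degC (the pack's spec)
TEMPCO = 0.002             # Vf drop per degC (V), assumed
T_AMB, R_TH = 25.0, 300.0  # ambient degC, thermal resistance degC/W (assumed)

def vf(t):
    return VF25 - TEMPCO * (t - T_AMB)

def led_v(vcc, r, t):
    """Operating-point LED voltage: solve (Vcc - V)/R = I(V) by bisection."""
    if r == 0.0:
        return vcc  # direct voltage drive, no resistor
    lo, hi = 0.0, vcc
    for _ in range(60):
        mid = (lo + hi) / 2
        if (vcc - mid) / r > I0 * math.exp((mid - vf(t)) / VS):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def settle(vcc, r, steps=200):
    """Iterate the electrical + thermal loop; return final T, or None on runaway."""
    t = T_AMB
    for _ in range(steps):
        v = led_v(vcc, r, t)
        i = I0 * math.exp((v - vf(t)) / VS)
        t_new = T_AMB + R_TH * v * i
        if t_new > 150.0:
            return None  # past any sane junction temperature: runaway
        if abs(t_new - t) < 0.01:
            return t_new
        t = t_new
    return t

print(settle(12.0, 470.0))  # settles at a modest temperature
print(settle(3.05, 0.0))    # direct 3.05 V drive: None (runaway)
```

With the 470 Ω resistor the loop converges: more heat lowers Vf, but the resistor then absorbs the difference and the current barely moves. Driven straight from a voltage barely above Vf, each degree of heating multiplies the current and the toy model blows past 150 °C in a couple of iterations.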

17. ### BobK

7,682
1,688
Jan 5, 2010
The resistor is what provides the safety if the voltages are a little off or the LEDs overheat (which causes them to draw more current). So, yes, you still need a resistor. But if you put 4 + a resistor and use 12V you will not get sufficient voltage across the LEDs. So you would use 3 in series + a resistor.

Bob
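Bob's series arrangement in numbers (same formula as before; the 3 V / 20 mA figures are the ones from the pack's label):

```python
# Three 3 V LEDs in series leave 12 - 3*3 = 3 V of headroom for the
# resistor; four would leave 0 V, and nothing for the resistor to regulate.
VCC, VF, I_MAX = 12.0, 3.0, 0.020

n_leds = 3
headroom = VCC - n_leds * VF   # volts left across the resistor
r = headroom / I_MAX           # exact resistance for 20 mA
print(f"headroom = {headroom:.1f} V, R = {r:.0f} ohm")
```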

18. ### NuLED

294
0
Jan 7, 2012
OK guys so let me see if I get it here. Thermal runaway bad. OK, got that. But here is the question: if the voltage source is even, say, 0.01 volts higher than the LED max voltage, will thermal runaway happen? (I mean, even with resistors.) And then it is a slippery slope: the Vf keeps getting reduced, so the LED gets even hotter, rinse and repeat. Is this correct? Because due to manufacturing tolerances, LEDs would often (?) be a little bit lower than the stated Vf spec.

?

Another way to say what I am asking is, does thermal runaway happen ANYWAY no matter the voltage? My understanding so far is the heat dissipation is "managed" within range (below thermal runaway levels) as long as the voltage and power rating of the LED are matched properly (with the resistors taming the max voltage expected from the source).

EDIT: To be more precise, maybe I should ask if the CURRENT is even slightly above the printed spec (in my above example, 20 mA) will that cause thermal runaway to start? Or thermal runaway starts ANYWAY regardless.

Last edited: Jun 24, 2013
19. ### BobK

7,682
1,688
Jan 5, 2010
No, thermal runaway will not always happen. I have used (as an emergency light) a high power LED rated at 3.3-3.6V driven directly by a 3V battery, and you will not get in trouble with that. But you will also not get nearly the full capability of the LED. If you are trying to run it at max output with a voltage source you will almost certainly get in trouble, mainly due to the wide variability of the Vf from one unit to another.

This is why high power LEDs are typically driven from a constant current source, which varies the voltage to keep the current at the correct level.

And no, exceeding the current by a small amount is not necessarily going to cause trouble, as long as there is a resistor in the circuit to provide the negative feedback.

Bob

20. ### NuLED

294
0
Jan 7, 2012
Ok, I see, because the resistor's I-V curve is the opposite of the LED's: its resistance stays fixed, so the higher the current (and heat), the more voltage the resistor drops, which pulls the LED's voltage and current back down.