Maker Pro

If the source voltage is the same as LED voltage, is a resistor necessary?

geratheg
Joined: Jul 12, 2014
Messages: 42
If the source voltage is the same as the voltage drop across a single LED, is a resistor in series necessary? If so, why?
 

Gryd3
Joined: Jun 25, 2014
Messages: 4,098
Yes and No.
It is always good practice to put in a current limiting resistor to prevent excessive current flow.
Let's say, in theory, you have a 2.5V source and an LED with a voltage drop of 2.49V. There is 0.01V remaining with zero resistance to impede it, which leads to a ludicrous amount of current.
Now, in reality, everything has resistance: the LED, the wire (or traces), and the source itself. If the supply voltage is equal to or less than the voltage drop of the LED, the resistance of the LED, wire, and source can limit the current to a safe level. You see this done with cheap LED keychains and LED 'throwies'.
This is never good practice, however, as any variation can cause issues... If you use a battery with a smaller internal resistance, the current will go up. If you use a different power source, temperature could alter its characteristics and cause excessive current flow.

Now in most cases, this will shorten the life of the LEDs.

By using a current-limiting resistor, you are dictating what the circuit's resistance should be, and reducing the amount of 'sway' the circuit may see with slightly different supply voltages. This helps keep the current more consistent and lets your LED operate happily ;)

**Special note:
LEDs are driven by current, not by voltage, so you may come across incorrectly labelled 'voltage sources' for LEDs of a specific type or color. These are typically constant-current devices that will always provide, say, 20mA to the LED. Because voltage and current are related through the LED's characteristic, it may appear that a constant 'voltage' is being provided. That voltage is only there because, mathematically, it happens to be what is needed to push the 20mA through the effective resistance of the LEDs.
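
To make that concrete, here's a minimal sketch of the "no resistor" case described above. Every number in it is an illustrative guess, not a data-sheet figure:

```python
# Why the "no resistor" case is fragile: with only ~0.01 V of headroom,
# the current depends entirely on stray resistance we do not control.
# All values are illustrative guesses, not data-sheet figures.

V_SUPPLY = 2.50   # volts, assumed source
V_F = 2.49        # volts, assumed LED forward drop near the operating point

for r_stray in (0.01, 0.1, 1.0, 10.0):   # ohms: wire + battery + LED bulk resistance
    i = (V_SUPPLY - V_F) / r_stray
    print(f"stray resistance {r_stray:5.2f} ohm -> current {i * 1000:7.1f} mA")

# A tenfold change in stray resistance gives a tenfold change in current,
# which is why a known resistor (plus some voltage headroom) is preferred.
```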
 

(*steve*)
¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
To answer the specific question: if you provide the exact voltage at which the LED passes 20mA (say), then no resistance is required.

However, Vf varies between devices and with temperature, so you need a way of adjusting that voltage automatically.

And that method is called a "constant current source".

If you supply a voltage well below the rated Vf, you may be able to get away with it, as self-heating will be minimal; however, so will the light output.
 

gorgon
Joined: Jun 6, 2011
Messages: 603
One thing you should be aware of is that the LED's Vf has a negative temperature coefficient: the higher the temperature, the lower the Vf. If you have no current limiting in your design, you may get a current-runaway situation if, for some reason, the temperature of the LED rises.

If your battery is up to it, the magic smoke may occur.
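
A toy numerical sketch of that runaway, assuming a supply with no series resistor. Every coefficient here is an assumption chosen for illustration, not a real LED or battery figure:

```python
# Thermal runaway sketch: Vf falls as the junction heats, so more current flows,
# which heats the junction further.  No series resistor, only battery resistance.
# All numbers are invented for illustration.

V_BAT = 3.4       # volts, assumed stiff battery
R_INT = 1.0       # ohms, assumed battery + wiring resistance
VF_25 = 3.3       # volts, assumed forward voltage at 25 C
TEMPCO = -0.004   # volts per degree C, assumed negative temperature coefficient
K_HEAT = 200.0    # degrees C per watt, crude junction-heating factor

t_junction = 25.0
for step in range(5):
    vf = VF_25 + TEMPCO * (t_junction - 25.0)
    i = max(V_BAT - vf, 0.0) / R_INT          # only the battery limits the current
    power = vf * i
    print(f"step {step}: Tj = {t_junction:6.1f} C, Vf = {vf:.2f} V, I = {i * 1000:7.1f} mA")
    t_junction = 25.0 + K_HEAT * power        # hotter junction -> lower Vf next pass
```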
 

geratheg
Joined: Jul 12, 2014
Messages: 42
Thanks for all the replies.

This is how I understand it:
An LED rated at 3.6V and 20mA will draw 20mA if supplied with 3.6V. Correct me if I am wrong.
However, as it heats, the current draw may vary. And in the real world, the supply voltage is never exact.

So a resistor basically acts as a "safety" device in that it limits the current draw of an LED to safe levels, so it does not burn out. In addition, it gives the LED flexibility in case of a varying voltage, and thus protects it.

Another thing I realized is that an LED isn't a resistor like a regular light bulb (which I assume is basically a resistor); correct me if I am wrong about light bulbs being resistors.
Instead, an LED is a diode. Diodes show an exponential-looking increase in current for a slight increase in voltage once the voltage passes a certain point.

If I said anything wrong or slightly off in this post, please correct me. If all is right I think my question has been answered.
 

gorgon
Joined: Jun 6, 2011
Messages: 603
Any LED has a defined voltage range, not an exact voltage value. The 20mA is only certain if you feed the LED 20mA; then you may have 3.6V across it. You can't make the reverse assumption. As said before, temperature is the main factor in what Vf is at any moment.

The 20mA in the datasheet is the current at which all the other parameters are defined, but most of them are given as ranges or typical values.

If the LED heats up at a given voltage, as you say, the current will go up until the internal resistance of the battery compensates for the reduction in Vf. The current therefore rises to an undefined level. The LED may then continue to heat, and then you are on the slippery slope, if the battery is good enough to supply the current needed to compensate for the fall in Vf.
 

KrisBlueNZ
Sadly passed away in 2015
Joined: Nov 28, 2011
Messages: 8,393
An LED rated at 3.6V and 20mA will draw 20mA if supplied with 3.6V. Correct me if I am wrong.
LEDs aren't "rated" for a particular voltage. The best way to look at it is that you need to supply them with a certain amount of current (the light output is roughly proportional to this current), and this current will cause a certain amount of voltage (the forward voltage) across the LED.

The forward voltage varies with temperature and other factors, but is not a tightly controlled specification. Data sheets usually specify a typical and maximum forward voltage for a particular current. Here's one that specifies a minimum forward voltage as well. These numbers cover JUST manufacturing variations.

[Image: LED-data-sheet-1.png]

However, as it heats the current draw may vary. And in the real world, exact voltage is not the case.
It's better to say that the voltage varies for a given current. The current is the figure that's important, because the light output is roughly proportional to it. The voltage is more of a side effect. Here's a graph of typical forward voltage vs. current for a typical LED.

[Image: LED-data-sheet-2.png]

So a resistor basically acts as a "safety" device in that it limits the current draw of an LED to safe levels, so it does not burn out. In addition, it gives the LED flexibility in case of a varying voltage, and thus protects it.
Not really. The resistor is there to control the current. Unlike the LED, the resistor has a linear voltage vs. current relationship, so variations in the voltage across it will have a defined, linear effect on the current that flows through it.

Take the LED in the data sheet above. Say we want to run it at 10 mA and our supply voltage is 5V. We start with the typical forward voltage, 1.9V, and calculate the resistor from the voltage that will be dropped across it, which is (5V - 1.9V) = 3.1V. Using Ohm's Law with I = 0.01 amps, we calculate R = V / I = 3.1 / 0.01 = 310 ohms.

Now check out what happens when the forward voltage is at the extremes.

Minimum forward voltage (1.5V) leaves 3.5V across 310 ohms; I = 11.3 mA
Typical forward voltage (1.9V) leaves 3.1V across 310 ohms; I = 10 mA
Maximum forward voltage (2.4V) leaves 2.6V across 310 ohms; I = 8.4 mA

So even with that wide range of forward voltage variation, the operating current of the LED only varies by about 16% at most. THAT is the reason for the resistor.
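
For reference, a short sketch that just re-runs that arithmetic; the 5V supply, 310 ohm resistor, and 1.5/1.9/2.4V figures are the ones quoted above:

```python
# Re-running the arithmetic above: 5 V supply, 310 ohm resistor, and the
# min/typ/max forward voltages quoted from the data sheet.

V_SUPPLY = 5.0
R = 310.0   # ohms, from (5 V - 1.9 V) / 10 mA

for label, vf in (("min", 1.5), ("typ", 1.9), ("max", 2.4)):
    i = (V_SUPPLY - vf) / R
    print(f"Vf {label} = {vf:.1f} V -> I = {i * 1000:.1f} mA")
```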

As Steve pointed out, a current source is a better option; there would be NO variation in current with a current source. But a resistor is adequate, especially if the ratio of the voltage across the resistor to the voltage across the LED is high, i.e. the supply voltage is a lot higher than the LED forward voltage.
Another thing I realized is that an LED isn't a resistor like a regular light bulb (which I assume is basically a resistor), correct me if I am wrong about light bulbs being resistors.
Kind of. The resistance of a light bulb varies somewhat depending on the temperature of its filament. But apart from that, yes a light bulb is resistive; it's not a semiconductor.
 

(*steve*)
¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
And this is a good read if you want to see how to drive LEDs correctly. It has a section at the beginning describing how they differ from light bulbs (and that does a reasonably good job of describing how light bulbs react to changes in voltage too).
 
Last edited:

geratheg
Joined: Jul 12, 2014
Messages: 42
Thanks for the responses.

Steve, is that supposed to be a link? It's unclickable. I'll take a look when it's available.
 

(*steve*)
¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
Fixed. (thanks)
 

geratheg
Joined: Jul 12, 2014
Messages: 42
If you have 5 LEDs in series, and 1 of them blows, would that open the circuit?

Or would the blown resistor just short itself and cause more current to flow through the remaining resistors?
 
Last edited:

geratheg
Joined: Jul 12, 2014
Messages: 42
I learned something new about resistors.

Then I realized, I meant to ask about LEDs! :oops:
 

cjdelphi
Joined: Oct 26, 2011
Messages: 1,166
There's conflicting information here...

Does a resistor act as a safety device? Kind of, where LEDs are concerned. One important fact is that diodes, LEDs, zeners, etc. are all affected by heat; the forward voltage will drop and rise.

The resistor simply limits the current (Ohm's law).
The LED's forward voltage will change with heat.

But voltage and current are related, so, just throwing some example values out...

Gryd, what were you on about with 0.1V out? Coin batteries have internal resistance; coin cells have a high resistance... but I bet a fresh battery can damage the LED, and it will dim faster than its regulated counterpart.

To really regulate it, you need to measure and control its current... but a resistor will do for any low-power LED.

If all you want to do is light an LED, supply exactly 15mA (measure with a multimeter) or 2.x volts... it will be fine until the temperature changes!

With a regulated 2.0V fed to the LED, it may draw, say, 10mA at 20°C... at 40°C it might draw 30mA. If you carefully selected a voltage to sit in the middle, the LED would be bright in summer and dimly lit in winter.

Supplying 5V and using a resistor would give a roughly constant current; with an LED, the current would only vary slightly with temperature.

So yes, for low-current LEDs, use a resistor to limit the current...
 

BobK
Joined: Jan 5, 2010
Messages: 7,682
The resistor actually acts as negative feedback. If the current goes above the design current, the resistor drops more voltage at the higher current, lowering the voltage across the LED, which is how the current is limited.
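
A rough numeric illustration of that feedback, using a simple exponential (Shockley-style) diode model. The saturation current and n·Vt values are made up to put the LED near 1.9V at about 10mA; they are not from any data sheet:

```python
import math

# Sketch of the feedback described above: a resistor in series with an
# exponential diode model settles at one operating point.
# The diode parameters are invented to give roughly 1.9 V at ~10 mA.

V_SUPPLY = 5.0
R = 310.0       # ohms
I_SAT = 3e-19   # amps, assumed saturation current
N_VT = 0.05     # volts, assumed emission coefficient times thermal voltage

# Fixed-point iteration: guess Vf, see what current the resistor then allows,
# ask what Vf the diode needs to carry that current, and repeat.
vf = 2.0
for step in range(6):
    i = (V_SUPPLY - vf) / R                   # current the resistor permits
    vf = N_VT * math.log(i / I_SAT + 1.0)     # forward voltage at that current
    print(f"step {step}: I = {i * 1000:6.2f} mA, Vf = {vf:.4f} V")
```

The current settles within a couple of iterations: any excursion in current changes the resistor drop, which pulls the LED voltage (and hence the current) back.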

Bob
 

Gryd3
Joined: Jun 25, 2014
Messages: 4,098
Gryd, what were you on about with 0.1V out? Coin batteries have internal resistance; coin cells have a high resistance... but I bet a fresh battery can damage the LED, and it will dim faster than its regulated counterpart.
You are absolutely correct.
What I was getting at was the fact that LEDs are current driven... not voltage driven. Assuming there is no internal resistance in the battery, a difference of 0.1V across 0Ω means infinite current. This is impossible, mind you; imperfections and device characteristics prevent it from happening. As the LED heats up, the voltage drop will change, which in turn will allow more current through... and this increase in current will cause the internal resistance of the battery to drop more voltage across itself. This method is only ever recommended for LEDs that you plan to throw away XD
 

kpatz
Joined: Feb 24, 2014
Messages: 334
Another thing to consider with those cheap LED keychains, throwies, etc. is that they're usually powered by a small button or coin battery that has a (relatively) high internal resistance. Therefore, the battery itself acts as the resistor. Also, the battery's voltage is close to the Vf of the LED, so the current is limited more-or-less by both these factors.

For a throwaway device, this is fine, but for anything well designed, a resistor or constant-current circuit is needed to drive the LED and have it last.
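
To put rough numbers on that: the internal-resistance and forward-voltage figures below are assumptions for illustration, not measurements of any particular cell or LED:

```python
# "The battery is the resistor": a coin cell's internal resistance limits the
# LED current in a throwie.  All figures are assumed, illustrative values.

V_BAT = 3.0   # volts, nominal coin cell voltage
V_F = 2.8     # volts, assumed LED forward voltage near the operating point

for r_int in (10.0, 20.0, 40.0):   # ohms, assumed internal resistance (fresh to tired cell)
    i = max(V_BAT - V_F, 0.0) / r_int
    print(f"internal resistance {r_int:4.0f} ohm -> current ~ {i * 1000:4.1f} mA")
```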
 

geratheg
Joined: Jul 12, 2014
Messages: 42
I observed something that I'm curious about. I used two LEDs that have a clear enclosure but shine red and blue. My power source was two 1.5V AA alkaline batteries in series, giving 3V.

I used a 270 ohm resistor.

Using two 270 ohm resistors and wiring the LEDs in parallel worked like a charm.
This worked perfectly:
+ resistor Red -
+ resistor Blue -

However, I know this isn't advised, but I did it for experimental purposes (I want to learn). When I used one 270 ohm resistor in series with the two parallel LEDs, like this:
+ resistor Blue -
.................. Red -

What happened was the Blue LED didn't even light up, but the Red one lit brightly. The Blue one only lit if I took the Red one out. The Blue LED was not blown, either; I know because I checked afterwards.

However, wired the same way except with a plain wire instead of the resistor, both LEDs lit up.

Why is this? Shouldn't the Red and Blue LEDs share the current through the resistor and at least light up dimly?

Edit:
Using a 100 ohm resistor, the Red LED was still brighter, the Blue LED was dimly lit.
Using a 10 ohm resistor, the Red LED and Blue LED appeared to be dimly lit with similar brightness, though they were brighter than the Blue LED was when a 100 ohm resistor was used.
 
Last edited:

(*steve*)
¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
What you have seen is an exaggerated example of why you shouldn't place LEDs in parallel.

Look in the resources section for a tutorial on driving LEDs.

A LED is not like a resistor where the voltage across the device is linearly related to the current flowing through it.

In your case the difference is so extreme that the voltage across the red LED never rises enough to allow any significant current to flow through the blue one. Even with twice the desired current flowing through the red LED, its forward voltage probably only rises from 2 volts to 2.1 volts. This is far below the roughly 3.2 volts needed across a blue LED before significant current starts to flow.

You might be better off placing both LEDs in series, with the two resistors in parallel with each other and that combination in series with the LEDs. If you have sufficient voltage, this will force the LEDs to carry the same current.

Placing them in parallel allows them to see the same voltage, but as I've explained that is not a good thing.
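
Here is a rough numerical sketch of that, using simple exponential diode models. The diode parameters are invented so the red LED sits near 2.0V and the blue near 3.2V at 10mA; only the 3V supply and 270 ohm resistor are taken from the experiment above:

```python
import math

# Why the red LED hogs the current when the two share one resistor.
# Diode parameters are invented; supply and resistor match the experiment.

V_SUPPLY = 3.0
R = 270.0
N_VT = 0.05   # volts, assumed for both LEDs

I_SAT_RED = 0.010 / math.exp(2.0 / N_VT)    # so red drops ~2.0 V at 10 mA
I_SAT_BLUE = 0.010 / math.exp(3.2 / N_VT)   # so blue drops ~3.2 V at 10 mA

def led_current(i_sat, v):
    return i_sat * (math.exp(v / N_VT) - 1.0)

# Bisection on the node voltage shared by both LEDs: the resistor current
# must equal the sum of the two LED currents.
lo, hi = 0.0, V_SUPPLY
for _ in range(60):
    v = (lo + hi) / 2.0
    if (V_SUPPLY - v) / R > led_current(I_SAT_RED, v) + led_current(I_SAT_BLUE, v):
        lo = v
    else:
        hi = v

print(f"node voltage ~ {v:.2f} V")
print(f"red LED  ~ {led_current(I_SAT_RED, v) * 1000:.2f} mA")
print(f"blue LED ~ {led_current(I_SAT_BLUE, v):.1e} A (effectively off)")
```

The node settles just below the red LED's forward voltage, so the red LED takes essentially all the current and the blue LED sees far too little voltage to conduct.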
 

geratheg
Joined: Jul 12, 2014
Messages: 42
I read the entire tutorial about LEDs that was linked in this thread.

Ok, so the Red LED has a lower forward voltage and thus takes essentially all the current, and that's why the Blue LED wasn't lit. Correct?

Additionally, I think I read that one of the best ways to set it up is the first way I mentioned in the previous post: giving every LED its own resistor, with those resistor-LED branches in parallel (as shown in the first "schematic" in the previous post), to achieve the appropriate current for each LED. Is this the preferred and reliable way to power multiple LEDs from the same power source?
 