Maker Pro

Hypothetical Question on Potential and Resistance

chopnhack

Joined: Apr 28, 2014
Messages: 1,576
A thought occurred to me the other day while I was reading some posts and looking over some schematics. I thought I might have understood the function of resistors in the scheme of things, but just like that, it seems to have eluded me...

If we had a 120 V DC potential across two wires and wired in an LED with a series resistor, would a 6k ohm value pass 20 mA of current through the LED? Would it matter that the potential between the two points is far in excess of the typical voltage required for an LED? Or would the resistor take up that much potential, if sized suitably to dissipate the difference?

Bear in mind, this is hypothetical, for theory and understanding; I have no generator or means of producing said value. We could just as easily have said 600 V and substituted a 30k ohm resistor ;-)
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
In this case let's assume the forward voltage of the diode is zero (because it's small in comparison with the voltage we're trying to drop across the resistor).

V = I × R
therefore

I = V ÷ R
= 120 ÷ 6000
= 0.02 A
= 20 mA
So yeah, the maximum current that would pass through that resistor if it were in series with an LED across 120 V DC would be 20 mA.

In actual practice the current would be slightly lower due to the voltage drop across the LED, and then it would vary a little with the exact value of the resistor (remember it might only be within ±5% of the marked value), and of course the value you select might be 6k8 because that's what's easily available.

Your calculations are also correct for 600V and 30kΩ.

You might like to calculate the approximate dissipation in each resistor.

Use:
P = V² ÷ R
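If you want to check your answers, here's a quick Python sketch of both cases (LED drop ignored, as above):

```python
# Ohm's law and resistor dissipation for both hypothetical cases,
# ignoring the LED forward drop as discussed above.
for v_supply, r in [(120, 6_000), (600, 30_000)]:
    i = v_supply / r            # I = V / R
    p = v_supply ** 2 / r       # P = V^2 / R
    print(f"{v_supply} V, {r} ohm: I = {i * 1000:.0f} mA, P = {p:.1f} W")

# 120 V, 6000 ohm: I = 20 mA, P = 2.4 W
# 600 V, 30000 ohm: I = 20 mA, P = 12.0 W
```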
 

chopnhack

Joined: Apr 28, 2014
Messages: 1,576
Thanks for confirming that Steve :)

120 V:
2.4 W - sounds a bit impractical but doable. I guess a large ceramic-type resistor or a wirewound could handle that much power dissipation.

600 V:
12 W - impractical; you'd certainly want to step the voltage down with a transformer first.

Would the potential across the resistor read 120 V/600 V in either case? Or would the LED drop the voltage from 120/600 to its rated requirement, so long as the current passed is within spec?
 

shumifan50

Joined: Jan 16, 2014
Messages: 579
To dissipate the 2.4 W with lower-rated resistors, you could put 2 × 3k resistors in series, which would halve the heat dissipated by each resistor, so you could get away with 1.2 W parts; with 4 × 1.5k you could get close to using 0.5 W resistors.
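As a quick sketch (assuming ideal, equal resistors - total resistance and total power are unchanged, the parts just share the heat):

```python
# Splitting the 6k dropper into n equal series resistors:
# same total resistance, same total power, but each part
# dissipates only total_power / n.
v, r_total = 120, 6_000
p_total = v ** 2 / r_total          # 2.4 W overall
for n in (1, 2, 4):
    print(f"{n} x {r_total // n} ohm: {p_total / n:.2f} W each")

# 1 x 6000 ohm: 2.40 W each
# 2 x 3000 ohm: 1.20 W each
# 4 x 1500 ohm: 0.60 W each
```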
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
If the LED has a forward voltage of Vf and the supply voltage is Vs then the voltage across the resistor is Vs - Vf.

In your case, if you were using a white LED, then the voltage across the resistor would be 120 - 3.4 volts, or thereabouts.
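Or, as a two-line sketch (the 3.4 V is just a typical white-LED figure, not a datasheet value):

```python
v_supply, v_f = 120, 3.4    # supply voltage and assumed white-LED forward drop
print(v_supply - v_f)       # 116.6 V left across the series resistor
```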
 

chopnhack

Joined: Apr 28, 2014
Messages: 1,576
Now with that in mind, would the LED be burnt out by excessive voltage? Is that the right way of thinking about this?
Or is it by controlling the milliamps running through the LED that we keep the device safe?

Another train of thought says it's more complicated and needs to account for the total power, hence the following question:

Would we need to calculate the total milliwatts the device could handle and then select a resistor? I.e., if the LED can handle 20 mA at 3.4 V, or 0.068 W, would I need to match that power at 120 V - 0.068 W at 120 V is about 0.56 mA, which would call for roughly a 214k ohm resistor?
 

BobK

Joined: Jan 5, 2010
Messages: 7,682
The power is split between the resistor and the LED. In your example of 120V and a 6K resistor, the current will be:

(120 - 3.4) / 6000 = 19.4mA.

The total power is 120 * 0.0194 = 2.33W

The power in the LED is 3.4 * 0.0194 = 0.07W

The power in the resistor is (120 - 3.4) * 0.0194 = 2.26W

So they add up to 2.33W.
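Checked as a quick Python sketch (same figures; the tiny differences in the last digit come from rounding the current to 19.4 mA above):

```python
# Power budget for the 120 V / 6k / white-LED case.
v_s, v_f, r = 120, 3.4, 6_000
i = (v_s - v_f) / r        # 0.01943 A, i.e. 19.4 mA
p_total = v_s * i          # 2.33 W drawn from the supply
p_led = v_f * i            # 0.066 W in the LED
p_res = (v_s - v_f) * i    # 2.27 W in the resistor (2.26 W above used the rounded current)
assert abs(p_total - (p_led + p_res)) < 1e-9   # the two shares add up exactly
```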

Bob
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
In some respects you can think of an LED as unable to be damaged by voltage in the forward direction. This is because it limits the forward voltage across itself.

However, it can be destroyed by excessive forward current or power.

Let me give you an example. Imagine you had a 6000 volt power supply that was capable of 10mA. If you connected that to the LED the right way around, the power supply voltage would be dragged down to the LED Vf, and 10mA would flow through the LED. This might damage the power supply, but not the LED.

Practically though, the LED would be in some peril because any capacitance that was charged up to that 6000V would first discharge across the LED. That current could vaporise the LED.

In the reverse direction, voltages as low as 7V could damage the LED. In some circuits you see a reverse biased diode across the LED. This is to protect it against reverse voltages.

In your case, the LED has a relatively fixed voltage across it, the balance necessarily being across the resistor. Because you have an almost constant voltage across a fixed-value resistor, you can calculate the current which will flow through it, and hence through the LED.
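To see how little the exact Vf matters once most of the voltage is across the resistor, here's a quick sketch using the fixed-Vf approximation (the 3.0 V and 3.8 V figures are just an assumed spread; real LEDs have an exponential I-V curve, but near the operating point a fixed drop is close enough):

```python
# Sweep the LED forward voltage and watch the current barely move,
# because most of the 120 V is across the 6k resistor anyway.
v_s, r = 120, 6_000
for v_f in (3.0, 3.4, 3.8):
    i = (v_s - v_f) / r
    print(f"Vf = {v_f} V -> I = {i * 1000:.2f} mA")

# Vf = 3.0 V -> I = 19.50 mA
# Vf = 3.4 V -> I = 19.43 mA
# Vf = 3.8 V -> I = 19.37 mA
```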
 

chopnhack

Joined: Apr 28, 2014
Messages: 1,576
Thanks Bob! It's getting clearer :)
Awesome Steve!! So for LEDs, controlling the current is far more important than the supply voltage - and that's by virtue of the device being a diode. I think I get it! Thank you!
 

BobK

Joined: Jan 5, 2010
Messages: 7,682
Yep, that is correct, and for higher power LEDs, a constant current supply is the best way to go.
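For example, one classic simple constant-current source is an LM317 with a sense resistor between its OUT and ADJ pins; the regulator holds roughly 1.25 V across that resistor, so sizing it is one line of maths (a rough sketch, not a complete design - the 350 mA is just an example figure, and you'd still check the datasheet for dropout and dissipation):

```python
# LM317 as a constant-current source: it holds ~1.25 V across the
# resistor between its OUT and ADJ pins, so I = 1.25 / R.
V_REF = 1.25                       # LM317 reference voltage, volts

def sense_resistor(i_led):
    """Return resistor value (ohms) and its dissipation (watts) for a target current."""
    return V_REF / i_led, V_REF * i_led

r, p = sense_resistor(0.350)       # e.g. a 350 mA power LED
print(f"R = {r:.2f} ohm, dissipating {p:.3f} W")   # R = 3.57 ohm, dissipating 0.438 W
```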

Bob
 