Another Newb LED Question

Discussion in 'LEDs and Optoelectronics' started by Paradoxz, Aug 2, 2011.

  1. Paradoxz

    Paradoxz

    2
    0
    Aug 2, 2011
    I have read 100 tutorials on LEDs and looked over many schematics, but none cover my question.

    Do I need to use a resistor if the forward voltage equals the voltage supplied?

    4x LEDs
    Forward voltage: 3.0 V
    Forward current: 20 mA
    Power source: 12 V

    If I were using just one of these LEDs I could see needing a 470 Ω resistor, but since I am using four LEDs, their total forward voltage equals the supplied voltage. If I do use a resistor, I would not have the required voltage left to power all four LEDs. So I just don't use a resistor, right? But EVERY tutorial says "ALWAYS USE A RESISTOR"... Is there something I am missing?

    *These LEDs will be hooked up in series, NOT in parallel.
     
    Last edited: Aug 2, 2011
  2. duke37

    duke37

    5,361
    767
    Jan 9, 2011
    There is a "sticky" by Steve giving details of how to drive LEDs.
    In short, you need a resistor and sufficient voltage to drive current through it. The reason for the resistor is that the power supply can vary, and the voltage across the LEDs varies with temperature: the higher the temperature, the lower the LED voltage, so the higher the LED current, so the higher the temperature, so the lower the voltage, so the higher the current... so BOOM.
    A couple of volts across the resistor should be enough to stabilise the current with a stable power supply.
     
  3. Paradoxz

    Paradoxz

    2
    0
    Aug 2, 2011
    I appreciate the reply, but it wasn't too specific. What exactly are you saying? Since the current increases with heat, would you recommend a small safety window? Say 1 V? So if I have 4x 3 V 20 mA LEDs running on a 12 V battery, you would maybe add a 50 ohm resistor?
     
  4. TBennettcc

    TBennettcc

    292
    2
    Dec 4, 2010
    Is it absolutely necessary to use all four LEDs in series? Would it be possible to have two sets of two LEDs in series, and then put those in parallel?

    Also, LEDs tend to work better with a constant-current supply, not a constant-voltage supply. (The situation duke is referring to is called 'thermal runaway'. It basically means as the LED gets warmer, it pulls more current, which causes it to get even warmer, etc. It's not a good situation for LEDs, which is why a constant-current supply is recommended. That way, the LEDs are getting the proper current, regardless of the voltage fluctuations.)

    So, did you read Steve's sticky about how to drive LEDs? If not, here's a link:

    https://www.electronicspoint.com/got-question-driving-leds-another-work-progress-t228474.html
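
    As a rough illustration of the two-strings-of-two idea (a sketch only, assuming the 12 V supply and 3.0 V / 20 mA LEDs from the question), each two-LED string drops 6 V, leaving 6 V for its own resistor:

```python
# Sketch: two parallel strings of two LEDs each (assumed values:
# 12 V supply, Vf = 3.0 V per LED, 20 mA target per string).
def series_resistor(v_supply, vf_per_led, n_leds, i_target):
    """Ohm's law for one series string: R = (Vs - n*Vf) / I."""
    headroom = v_supply - n_leds * vf_per_led
    if headroom <= 0:
        raise ValueError("no voltage headroom left for the resistor")
    return headroom / i_target

print(series_resistor(12.0, 3.0, 2, 0.020))  # 300.0 ohms per string
```

    Note that each string needs its own resistor; sharing one resistor between parallel strings lets the string with the lower Vf hog the current.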
     
    Last edited: Aug 3, 2011
  5. (*steve*)

    (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

    25,387
    2,772
    Jan 21, 2010
    The problem with LEDs (as explained in the sticky) is that as they get hotter, the forward voltage drops.

    So if you wire them up to a particular voltage that just happens to let them pass 20 mA, they will get warmer due to that current. Then the voltage drop falls, the current increases, they get hotter, the voltage drop falls further, the current gets larger still, they get really hot, and then they stop working.

    The smaller the difference between the Vf and the source voltage, the smaller the series resistance must be, and the less effect it can have on limiting the current.

    As a rule of thumb, you probably don't want less than about 10% to 20% of your supply voltage dropped across the resistor. You want more if your voltage source is not regulated, especially if you plan on running the LED near its rated maximum current.
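
    That rule of thumb can be turned into a quick calculation (a sketch only, using the 12 V / 3.0 V / 20 mA figures from this thread; the function names are just for illustration):

```python
# Rough numbers for the 10-20% rule of thumb (assumed values:
# 12 V supply, Vf = 3.0 V per LED, 20 mA target current).
def max_leds_in_series(v_supply, vf, reserve_fraction):
    """How many LEDs fit in series while reserving a fraction of
    the supply voltage for the current-limiting resistor."""
    usable = v_supply * (1.0 - reserve_fraction)
    return int(usable // vf)

def series_resistor(v_supply, vf, n_leds, i_target):
    """Resistor that drops the leftover voltage at the target current."""
    return (v_supply - n_leds * vf) / i_target

n = max_leds_in_series(12.0, 3.0, 0.20)       # reserve 20% -> 3 LEDs
print(n, series_resistor(12.0, 3.0, n, 0.020))  # 3 150.0
```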
     
  6. duke37

    duke37

    5,361
    767
    Jan 9, 2011
    OK, to be specific: four LEDs at 3 V each in series equals 12 V.
    Battery equals 12 V: result, no current.
    Battery equals 12.1 V: result, high current; the LEDs get hot and their required voltage drops to 11.9 V, giving very high current. You now have 0.2 V across very little resistance.
    Adding a 50 ohm resistor will protect against overcurrent, but depending on the tolerances of the voltages, you may not get any current at all.

    Solution:
    Use three LEDs needing 9 V with the 12 V battery. You now have three volts to lose, so R = V/I = 3/0.02 = 150 Ω.
    Try a few sums to see what happens if the battery voltage changes from 11.8 V to 12.2 V, or if the LED voltage changes from 8.9 V to 9.1 V. Calculate the LED current in each case.
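
    Those "few sums" can be run in a few lines (a sketch of the three-LED, 150 ohm arrangement, using the voltage ranges suggested above):

```python
# LED current for three LEDs plus a 150 ohm resistor as the battery
# and LED voltages drift (values from duke37's example).
R = 150.0
for v_batt in (11.8, 12.0, 12.2):
    for v_leds in (8.9, 9.0, 9.1):
        i_ma = (v_batt - v_leds) / R * 1000.0
        print(f"Vbatt={v_batt} V  Vleds={v_leds} V  ->  {i_ma:.1f} mA")
```

    The current stays roughly between 18 mA and 22 mA over the whole range, which is the point of the resistor: with all 12 V across the LEDs and no resistance, the same small voltage shifts would swing the current wildly.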
     