LED voltage rating

Discussion in 'Datasheets, Manuals and Component Identification' started by David Parker, Jul 31, 2010.

  1. David Parker

    David Parker

    13
    0
    Jul 31, 2010
    Hi. This is my first post here and I am fairly new to electronics. I am building a small project in which a microcontroller turns LEDs on and off. Each LED has a 470 ohm current-limiting resistor. The supply to the microcontroller is 5 volts, so the output from the pins is also 5 volts. I want to buy some LEDs, but there is a bewildering range of voltages specified (input, output, operating, etc.). When I search for 5 volt LEDs, nothing comes up. Please explain what these voltage ratings mean so I can choose the right LEDs for my project.
     
    Last edited: Aug 1, 2010
  2. Mitchekj

    Mitchekj

    288
    0
    Jan 24, 2010
    The voltage spec you're talking about is called the LED's forward voltage. This is the voltage required across the LED (anode to cathode) to get a certain current flowing through it. Is your microcontroller (henceforth: uC) a constant-voltage output of 5Vdc? I don't know much about micros. :( But it would make sense for it to be a voltage source, not a current source.
    You're probably looking for an LED rated for ~20mA. Different colors will have different forward voltages (Vf). Different semiconductor chemistries will have slightly different Vf specs, too. All pertinent specs should be spelled out when you buy them.

    In any case, that's what the resistor is for... to drop the 5V down to something the LED can use, like 2-3Vdc. You'd need to know the supply voltage, the desired LED current, and the forward voltage of the LEDs you pick, to calculate the correct resistance.

    Any LED should work fine. There aren't any (single die) LEDs which would require 5V or more. For instance: a 3.3Vdc forward voltage, 20mA forward current...

    Vin - Vf(LED) = Voltage you need to drop over the resistor
    5V - 3.3V = 1.7V

    V / I = R
    1.7 / 0.020 = 85 Ohms

    Now you're left w/ 3.3 Volts at 20mA through the LED.

    The LED specs will give you a range of voltages, or at least they should... something like 3.2 - 3.6Vdc forward voltage, with a 3.4Vdc typical. I'd use the lowest number when calculating for the resistor.
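
    A quick way to sanity-check this resistor math is a couple of lines of Python (a sketch using the example values above; swap in your own LED's datasheet numbers):

```python
# Series-resistor calculation for an LED: drop (Vin - Vf) across
# the resistor so the chosen forward current flows through both.
# Example values from this post: 5 V supply, 3.3 V Vf, 20 mA.

def led_resistor(v_supply, v_forward, i_forward):
    """Return the series resistance in ohms that sets i_forward (in amps)."""
    return (v_supply - v_forward) / i_forward

r = led_resistor(5.0, 3.3, 0.020)
print(round(r, 1))  # 85.0 ohms, matching the worked example above
```

    Plugging in the lowest Vf from the datasheet range gives the worst-case (highest) current, which is the conservative choice.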

    For use in the future of your electronics path: An LED should be driven with a constant current and not a constant voltage, for a number of reasons, but mainly since the forward voltage will actually drop when the die starts to heat up... this makes the current go up, it heats up even more, and the cycle repeats itself. This isn't too much of an issue with small low-power LEDs, as they reach stasis fairly quickly and at a low temperature. But for high power LEDs this starts becoming an issue, thermal runaway.
     
    Last edited: Aug 1, 2010
  3. (*steve*)

    (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

    25,412
    2,780
    Jan 21, 2010
    Furthermore, you shouldn't think of LEDs as devices which require a voltage, but ones which require a current.

    The specified forward voltage drop is a parameter in the calculation you use to determine the value of the series resistor that will set that current at a given supply voltage.

    Mitchekj gives those equations.
     
  4. David Parker

    David Parker

    13
    0
    Jul 31, 2010
    The microcontroller is a PIC18F452. The data sheet says:
    Peripheral Features:
    • High current sink/source 25 mA/25 mA
    So I presume that this means that each pin can supply up to 25mA. I don't know if it has a constant voltage output.
    From your explanation it sounds like I should choose LEDs with a forward voltage rating of less than 5 V and then calculate the size of the series resistor in the way you suggest to get the optimal current for the LED. Thanks, that makes some sense to me. I now see "forward voltage" listed on the website for the LEDs I was looking at.

    Here is an example to see if I got it right. I found an LED with an "operating voltage" of 1.6VDC. So the required voltage drop over the resistor is 3.4V. The LED specifies an "operating current" of 1mA. To achieve this current through the resistor I use V = IR: 3.4 = 0.001 × R, so R = 3400 ohms = 3.4kOhm. Is this right?

    Thanks, David.
     
    Last edited: Aug 1, 2010
  5. Mitchekj

    Mitchekj

    288
    0
    Jan 24, 2010
    That sounds right. As long as the resistor is in series w/ the LED, you'll have 1mA through both. The datasheet should list a min/max Vf range (called a forward voltage bin) which all depends on what particular batch of production you purchased. 1.6Vdc is most likely their 'typical' Vf. Some measurements would be in order if you wanted to precisely set the current, but I don't think it will hurt anything in your particular case here to use the 'typical' values for resistor selection.

    Some resistor specs to keep in mind:

    3.4kOhms isn't exactly a common value... and 1mA doesn't leave a lot of wiggle room in the selection, either. Is there a "max forward current" rating? 3.3k and 3.6k are 'common values,' and easier to find. You can find a 3.4kOhm 1% resistor at some suppliers, though.

    Power dissipated over the resistor will be 3.4 * 0.001 = 0.0034W
    So, any size will work; 1/10W, 1/8W, 5W, no matter. :) Size and cost would be the limitations there.
    1% tolerance would be ideal, but 5% should work nicely otherwise.
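
    Running David's numbers (1.6Vdc typical Vf at 1mA) through the same arithmetic, together with the power figure above, in Python:

```python
# Check of the example in this thread: 5 V supply, LED with a
# 1.6 V typical forward voltage, run at 1 mA.
v_supply, v_f, i_f = 5.0, 1.6, 0.001

r = (v_supply - v_f) / i_f   # series resistance, ohms
p = (v_supply - v_f) * i_f   # power dissipated in the resistor, watts

print(round(r))     # 3400 ohms, i.e. 3.4 kOhm
print(round(p, 4))  # 0.0034 W, far below even a 1/10 W rating
```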
     