3W IR LED isn't very bright?

Discussion in 'LEDs and Optoelectronics' started by moeburn, Aug 14, 2013.

  1. moeburn

    moeburn

    41
    0
    Jun 25, 2012
    Hello! I bought one of these 3W IR LEDs, thinking I could add night vision to my home-made security cameras. I have lots of USB webcams, as well as a couple of old Android phones with built-in webcams, set up as security cameras around my house, and they all seem to pick up infrared light just fine (tested by pointing a TV remote at the camera).

    So the LED arrived, and I wired it up, but it's not very bright. Nowhere near as bright as the people on the comments page claim; they say it lit up their whole room! Mine doesn't appear much brighter than a TV remote LED, although I can still see the red glow coming from the LED with the naked eye (it's near-IR, not far-IR, so it has a dim red glow).

    I'm supplying the LED from a 1.5V alkaline battery through a 3 ohm, 3 watt resistor in series, and my DMM says it is drawing about 40mA. The web page doesn't give the LED's current, just the voltage and wattage, but shouldn't it be more like 2 amps, since it's a 1.5V 3W LED? I even briefly tried it without a resistor, and it only drew 60mA.

    Am I doing something wrong? Or is this how much current it's supposed to draw, and I just can't see its full brightness because all my digital cameras have IR filters on them?
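
For what it's worth, the 2A expectation follows directly from P = V × I. A quick sketch of that arithmetic, using only the 3W and 1.5-1.7V figures from the product page quoted in this thread:

```python
# Expected LED current from rated power and forward voltage: I = P / V.
# The 3 W and 1.5-1.7 V figures are from the product listing quoted above.
def expected_current(power_w, vf_v):
    """Current (in amps) the LED should draw at rated power and forward voltage."""
    return power_w / vf_v

i_min = expected_current(3.0, 1.7)  # high end of Vf -> lower current
i_max = expected_current(3.0, 1.5)  # low end of Vf -> higher current
print(f"Expected current: {i_min:.2f} A to {i_max:.2f} A")
```

The measured 40-60mA is a factor of 30-50 below this range, which points at the supply rather than the LED.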
     
  2. (*steve*)

    (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

    25,389
    2,774
    Jan 21, 2010
    You're doing several things wrong.

    Firstly, your battery is probably not capable of supplying the current you desire (and if it is, it will only be for a very short while).

    Secondly, the Vf of the LED at its rated current is probably higher than 1.5V. Your battery (especially under a significant load) is not going to have a terminal voltage anywhere near this.

    Thirdly, you're running a high power LED from a (poor) voltage source and a resistor. It would be far better to use a constant current source.

    Look in the tutorials section for the LED tutorial. It should give you more information.
     
  3. moeburn

    Thanks for your help!

    Hmm, okay, but the people in the comments section all got it working using batteries, because of how hard it is to find a high-current power supply at 1.5V. I don't know if it's the Vf, but the product page says the LED is "1.5v-1.7v".

    But am I correct in thinking that the LED should be drawing up to 2 amps, since it says it is 3W? I'm using a NiMH battery, and I know it won't have very good life at a high current, but I would have thought it should work.

    So let's say that, in theory, my battery was able to supply 2 amps at 1.5V; why isn't the LED drawing it? Is it because the current the LED draws at 1.7V is hugely different from what it draws at 1.5V? Or is it, like you said, that even though the battery reads 1.5V on my multimeter when unloaded, it can't supply 1.5V under such a high current load?

    I have briefly read a few LED tutorials, but none specific to high-power IR LEDs. Thanks for bearing with my noobishness :D
     
  4. (*steve*)

    It doesn't matter whether it's an IR LED, or red, or blue.

    The issue is that you need a supply capable of delivering the correct current.

    In your case Vf is probably around 1.6V, so your source needs to be capable of at least 1.6V. Then you need to regulate the current. Using a resistor is a VERY POOR way of doing this, because as the voltage falls, the current will fall faster (and because the current can increase as the LED gets hot).
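
The "current falls faster than voltage" point comes from the diode's exponential V-I curve. A hedged illustration using the Shockley diode equation; the saturation current and ideality factor below are made-up values chosen so the curve passes near 2A at 1.6V, not this LED's real datasheet parameters:

```python
import math

def diode_current(v, i_s=7e-14, n=2.0, v_t=0.02585):
    """Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)."""
    return i_s * (math.exp(v / (n * v_t)) - 1)

i_160 = diode_current(1.60)  # near the rated operating point
i_150 = diode_current(1.50)  # just 0.1 V lower
print(f"I at 1.60 V: {i_160:.2f} A")
print(f"I at 1.50 V: {i_150:.2f} A")
print(f"A 0.1 V drop cuts the current by a factor of {i_160 / i_150:.1f}")
```

This is why a battery that reads close to 1.5V open-circuit but sags under load can leave the LED drawing only tens of milliamps.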
     
  5. moeburn

    Okay, so it sounds like I'm going to have to go buy a nice high-power 1.6V wall wart, but I'd like to see if this LED actually works first. Would it be safe to use a voltage-divider resistor circuit to lower the voltage of one of my existing DC supplies, to test it out? I understand what you mean, that it's not a 'regulated' voltage, because it would change depending on the load, how hot the LED is, etc.
     
  6. (*steve*)

    No, you're going to have to find an LED driver.

    If you had read the article on LEDs you would have noted that it tells you NEVER to power LEDs from voltage sources.
     
  7. moeburn

    ...right, okay, so I need a wall wart with a current-sensing DC-DC converter that can handle 2 amps of current, yes? That is, a voltage regulator that will send the same voltage no matter what the current draw of the LED is?

    How strange... I've been building high-power LED flashlights for so long using just batteries and potentiometers, and I've never had a single one burn out. Sure, they got dimmer and brighter at different temperatures, but they never went into thermal runaway.
     
    Last edited: Aug 15, 2013
  8. Harald Kapp

    Harald Kapp Moderator

    10,025
    2,138
    Nov 17, 2011
    No, it is the other way round: you need a current regulator that will drive the same current regardless of the voltage drop across the LED. You are very unlikely to find such a driver in the form of a wall wart.
    If you don't buy a boxed LED driver, then the way to go is:
    - find a voltage source capable of delivering at least 2V at 2A (you'll probably need more than 2V, see below).
    - build a constant-current source for 2A. Input to the current source is the power from the wall wart; output is the current to the LED. Here is an example of such a current source. Depending on the details of the circuit you may need more than 2V input voltage.
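
A rough input-voltage budget for the two linear approaches discussed in this thread. The overhead figures are ballpark assumptions, not datasheet values; a discrete transistor limiter typically needs around a volt of headroom, while an LM317-style source needs its 1.25V reference plus its dropout voltage:

```python
# Minimum supply voltage for a linear constant-current LED driver:
# it must cover the LED's forward voltage plus the regulator's own overhead.
def min_supply(vf, overhead):
    return vf + overhead

vf = 1.7  # worst-case LED forward voltage from the product page
print(f"Two-transistor limiter (~1 V overhead): {min_supply(vf, 1.0):.1f} V minimum")
print(f"LM317-style source (~3 V overhead):     {min_supply(vf, 3.0):.1f} V minimum")
```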
     
  9. duke37

    duke37

    5,362
    768
    Jan 9, 2011
    If you wish to try the LED at higher output, then why not use two cells in series giving 3V and use a series resistance of about 1 ohm to limit the current?

    If it works, then you will need to obtain a suitable current source.
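
Running the numbers on duke37's suggestion (taking Vf ≈ 1.6V as assumed earlier in the thread; the cells will sag under load, so the real current will come out a bit lower):

```python
# Two 1.5 V cells in series, a 1 ohm resistor, and the LED:
v_supply = 3.0   # two alkaline cells in series
vf = 1.6         # assumed LED forward voltage
r = 1.0          # series resistor, ohms

i = (v_supply - vf) / r      # Ohm's law: current through resistor and LED
p_r = i ** 2 * r             # power dissipated in the resistor

print(f"LED current:    {i:.2f} A")
print(f"Resistor power: {p_r:.2f} W")
```

About 1.4A through the LED and roughly 2W in the resistor, so a 5W part has a comfortable margin but will still run hot.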
     
  10. BobK

    BobK

    7,682
    1,686
    Jan 5, 2010
    An NiMH cell is only rated at 1.2V. It is no wonder you are having trouble running a 1.5-1.7V LED off a single NiMH cell.

    I agree both with Steve for the ultimate solution, and with Duke for a way to try it out. However, be aware that the resistor must be a 5W one, and it is going to get hot.

    Bob
     
  11. moeburn

    Actually, I was using a fully charged NiMH, which measured 1.43V, but yeah, still not in the 1.5-1.7 range, and probably not 1.43V under load, which I didn't test.

    I didn't mean I'd be looking for a wall wart with a current limiter built in; I meant I'd have to make one. I see the page you linked to says I can use an LM317?? Great, I have some of those lying around! EDIT - Your page talks about an LM317L, but I have an LM317T. I assume mine is better because it is rated 1A vs 100mA, but this LED is 2 amps; would either of them even work? Or do I just need a heatsink?

    But my other option was to build a simple NFET/NPN current-limiting circuit, like this one:

    [image: NFET/NPN current-limiter schematic]

    I really don't want to have to get a proper LED driver: they are more expensive, this cheap LED isn't worth the cost of a good driver, and I'd have to order one online and wait for it to be delivered, whereas I can walk down to my local electronics parts store and build my own in a day, for much cheaper.

    Which one do you think would be "better" for me, the LM317 one or the NFET one?

    And thanks again for your help, folks! I'm one of those weird people: I've been building electronics projects for over 10 years, and I understand some things very well and other things not at all, but once something works, I (foolishly) stop learning why it works :p
     
    Last edited: Aug 15, 2013
  12. (*steve*)

    The only thing wrong with the driver you have shown above is that it is a linear design and will dissipate some energy as heat.

    Assuming that the input voltage does not exceed the LED voltage by a large margin (1 or 2V is probably sufficient), it may work out better than an LM317-style current source, which requires 3V or more above the LED voltage to operate.

    The more expensive switchmode current sources are more efficient, but as long as the voltage overhead is low, this should not be a problem.
     
  13. moeburn

    Okay, so I'd rather use the LM317T I have lying around, and since my LED is 1.6V, that means I need a 4.6V or higher power source, so I'll probably use one of the many 5V 2A wall warts I have lying around.

    But I have a question: the LM317T is listed as a "1.5A regulator", but my LED can draw up to 2 amps, if it really is a 3W 1.6V LED like the product page says. Won't this break the LM317T?
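
One more thing worth checking with a 5V input: how much heat the LM317 itself has to get rid of. A rough sketch; the Vf and 1.25V reference figures are the ones used in this thread, and real numbers will vary:

```python
# Linear regulator dissipation: whatever isn't dropped across the LED or the
# current-set resistor is burned off as heat in the LM317 itself.
v_in = 5.0    # wall wart voltage
vf = 1.6      # assumed LED forward voltage
v_ref = 1.25  # LM317 reference, dropped across the current-set resistor

def lm317_heat(i_led):
    return (v_in - vf - v_ref) * i_led

for i_led in (0.4, 1.0, 1.5):
    print(f"At {i_led:.1f} A the LM317 dissipates {lm317_heat(i_led):.1f} W")
```

Roughly 2W at 1A, which is more than a bare TO-220 package can comfortably shed, so plan on a heatsink for the regulator as well as the LED.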
     
  14. moeburn

    Well, I fired up the regulator, using a 3 ohm, 5 watt resistor (it's what I had) between Vout and Adj, and a 5V 2A wall wart as Vin, and I got 1.2V across the resistor. I'm a little confused: the website I read said that adjusting this resistor value only adjusts the current limit. How do I adjust the Vout from 1.2V to 1.6V?
     
  15. (*steve*)

    The LM317 will probably not work, as you need 2A and they are rated for less than that.

    The resistor for an LM317 must be calculated to drop 1.25V at the required current. In your case that is 1.25/2 = 0.625 ohms.

    You don't need to concern yourself with the output voltage. You are now regulating current.

    A 5V 2A power supply has the correct voltage, but the current rating is right on the edge. Perhaps you can choose a resistor with a resistance a little higher than 0.625 ohms to reduce the current slightly. If you use a 0.75 ohm resistor (two 1.5 ohm resistors in parallel) then the current will be about 1.7A, and your power supply (and your LED) will be happier.

    The resistors need to be rated at about 5W.

    The 3 ohm resistor would have given you a current of just over 400mA, which might be good for testing.

    Oh, do you have the LED on a heatsink?

    If you were using the other (2-transistor) circuit, the resistor value required would be about half what you need here: 0.33 ohms would give you about 2A.
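
All of these resistor values come from the same relation, R = V_sense / I: the LM317 regulates 1.25V across its set resistor, while the two-transistor circuit regulates roughly one Vbe (~0.65V, an approximation) across its sense resistor. Checking the figures quoted:

```python
def current_for(r_ohms, v_sense):
    """Current set by a sense resistor: I = V_sense / R."""
    return v_sense / r_ohms

V_LM317 = 1.25  # LM317 reference voltage
V_VBE = 0.65    # approximate transistor base-emitter voltage

print(f"LM317 with 0.625 ohm:   {current_for(0.625, V_LM317):.2f} A")
print(f"LM317 with 0.75 ohm:    {current_for(0.75, V_LM317):.2f} A")
print(f"LM317 with 3 ohm:       {current_for(3.0, V_LM317) * 1000:.0f} mA")
print(f"2-transistor, 0.33 ohm: {current_for(0.33, V_VBE):.2f} A")
```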
     
  16. moeburn

    What? But I thought you just said the LM317 would work? And why don't I need to worry about output voltage, if the LED specifically states it works with 1.5-1.7V?

    I don't mind if I can only get 1.5A instead of 2A; heck, I don't mind if I only get 0.5A, it's a lot better than the 50mA I was getting before. Maybe I could use a high-wattage pot as the current-regulating resistor, to adjust the brightness of the LED to just under where it starts to get hot?
     
  17. (*steve*)

    I said the 2-transistor circuit would work BETTER than the LM317-style circuit.

    I assumed that you would be using the option that you suggested and not the one I said was not as good...

    As for the voltage vs current thing, you know the answer, since you've already told me you have read the LED tutorial. Read and understand section 0.

    FFS, don't use a high wattage pot. It's almost like you're competing to find the worst possible solution.

     
  18. (*steve*)

    Forgive me, I thought that when I suggested you read the LED tutorial you told me that you had, but on re-reading your reply I realise that you decided (sight unseen) it wasn't going to help you.

    Had you followed my advice you would have noted a section specifically about high power LEDs.

    https://www.electronicspoint.com/got-question-driving-leds-t256849.html
     
  19. moeburn

    Hey man, if all you know how to do is be a jerk and say "go read this, someone else already typed it", then don't answer at all.
     
  20. BobK

    And if this is how you respond to Steve's excellent advice, you should not bother to ask.

    You do not have a basic understanding of current and voltage in LEDs. Steve pointed you to an article that covers this, taking many pages of text. Do you expect Steve to type in all of that text again just for you?

    Bob
     
    Last edited: Aug 16, 2013