
How to make LED plug into wall AC socket?

Discussion in 'Electronic Basics' started by Me, Jul 12, 2004.

  1. Me

    Me Guest

    Hello,
    I have a number of blue LEDs, that work when connected to a resistor
    and a 9V battery (don't know much else about them!)

    How would I go about making something I could plug into a wall socket to
    power a group of LEDs?

    If I got an old 6V or 9V AC/DC adapter from, say, an old Walkman or
    something, could I safely connect that somehow to the LEDs?

    Should I wire the LEDs in parallel?

    Where can I find info to do this?
    THANKS
     
  2. http://www.misty.com/~don/ledd.html

    among other places.

    I recommend against paralleling LEDs directly (without individual
    resistors).

    With a 6-volt DC "wall wart", I would assume 7 volts with a light load.

    Blue LEDs usually have a voltage drop near 3.5 volts and usually want 20
    mA typical, 30 mA max. I would recommend somewhat less (15 mA) if you
    want to get the usually-advertised "100,000 hours" and you do not know
    that your thermal situation is better than the one in the LED
    manufacturer's test lab. But you can probably get away with a little over
    30 mA.

    Back to the 7-volt "6-volt DC wall wart": Put each LED in series
    with a dropping resistor, and then put all LED-resistor "strings"
    in parallel with each other. Subtract the 3.5V LED voltage from the 7V
    supply voltage, and that leaves 3.5 volts across the dropping resistor.
    3.5 volts (resistor voltage, that is) divided by .02 amp (20 milliamps)
    yields 175 ohms, and the nearest common value is 180 ohms. I recommend
    220 ohms to play safe if you want a good expectation of really long life.
    You will probably get away with 150 ohms and maybe with 100 ohms.
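
    As a quick sanity check, here is that arithmetic as a short Python
    sketch (the 7 V supply, 3.5 V LED drop, and 20 mA figures are the
    assumptions above):

        # Dropping resistor for one blue LED on a lightly loaded "6 V" wall wart.
        supply_v = 7.0    # assume a "6 V DC" wall wart gives about 7 V lightly loaded
        led_v = 3.5       # typical blue LED forward drop
        led_i = 0.020     # 20 mA target current
        r = (supply_v - led_v) / led_i
        print(r)          # 175.0 ohms; nearest common value is 180 ohms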

    Do not try matching the supply voltage to the LED voltage; the current
    through the LED will be of unreliable magnitude. The current through an
    LED varies greatly with small changes in the voltage across it, and the
    voltage required to push a given amount of current through an LED varies
    enough with temperature and manufacturing tolerances to make current as a
    function of voltage unreliable.

    If you have a 12-volt DC "wall wart", assume 13.5 volts for a light
    load. You can put two LEDs and a resistor in series, and a few of these
    "strings" in parallel.

    With two 3.5 volt LEDs subtracted from 13.5 volts, you get 6.5 volts
    across the dropping resistor. Divide by .02 amp (20 milliamps), and this
    gives 325 ohms for a dropping resistor. The nearest common value is 330
    ohms. I would use 470 ohms if I needed to count on a few tens of
    thousands of hours of LED life expectancy, although I foresee probably no
    quick failure
    with as low as 220 ohms.

    As for resistor wattage: Multiply the 6.5 volts of resistor voltage by
    .02 amp, and that means .13 watt. That sounds like a 1/4 watt resistor is
    enough, but I recommend a half watt one if you want good reliability.
    Resistor reliability decreases enough above half rated power that
    military-grade 1/4 watt resistors come to resemble commercial-grade 1/2
    watt ones, and that 12 volt DC wall wart might just provide 16 volts to a
    really light load.
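
    The same arithmetic for the two-LED string, including the power check,
    as a small Python sketch (13.5 V, 3.5 V per LED, and 20 mA are the
    assumed figures from above):

        # Two blue LEDs in series on a lightly loaded "12 V" wall wart.
        supply_v = 13.5               # assume 13.5 V under a light load
        led_v = 3.5                   # per-LED forward drop
        led_i = 0.020                 # 20 mA target current
        r_v = supply_v - 2 * led_v    # 6.5 V across the dropping resistor
        r = r_v / led_i               # 325 ohms; nearest common value is 330
        p = r_v * led_i               # .13 watt; use a 1/2 watt resistor
        print(r, p)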

    Although you can probably put 3 blue LEDs and a resistor in series with
    each other and run this from a 12 volt DC "wall wart", the LED current can
    easily be greatly different from what you expect and may vary excessively
    with temperature and presence of other loads on the "wall wart".

    - Don Klipstein ()
     
  3. Tom Biasi

    Tom Biasi Guest

    Hi,
    Put the LEDs in series; add up the voltage drop of each LED. Subtract the
    total from your DC supply.
    Use the difference as the V in R = V/I. Put the LEDs' rated current in
    for I.
    This is the series resistor value that you need.
    Take that same difference voltage, multiply it by I, and double it for
    safety; that's the wattage rating you need.
    Use the closest standard value that doesn't go under.
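
    That recipe boils down to a couple of lines; here it is as a minimal
    Python sketch (the 9 V supply and two 3.5 V blue LEDs in the example
    call are assumed figures taken from earlier in the thread):

        # Series resistor value plus a doubled-for-safety wattage rating.
        def series_resistor(supply_v, led_drops, led_i):
            diff_v = supply_v - sum(led_drops)          # voltage left for the resistor
            return diff_v / led_i, 2 * diff_v * led_i   # (ohms, watts)

        # E.g. two 3.5 V blue LEDs at 20 mA on 9 V: 100 ohms, .08 watt rating.
        print(series_resistor(9.0, [3.5, 3.5], 0.020))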
    Good Luck,
    Tom
     
  4. andy

    andy Guest

    The way I would do it is to set up a test rig for one of the LEDs to work
    out what voltage/current gives a good brightness. Most normal LEDs expect
    about 20 mA; more for high-brightness ones. This could just be like this:

    9V DC ----+
              |
             LED
              |
      1 kohm variable resistor
              |
      50-100 ohm fixed resistor (so you don't short it)
              |
    0V -------+

    Set the resistor to 1kohm, then slowly reduce it until the LED is at a
    good brightness. When it's right, measure the voltage across the LED, and
    the current through it, with a multimeter. You'll have to connect the
    meter in series to measure current.

    Once you've found a good working voltage, see how many of them fit into
    9V with a bit left over (probably 2 for blue LEDs).

    Then subtract the working voltages for however many LEDs from 9V, and
    divide this by the working current to get the value for a resistor to put
    in each series string.

    Then wire them in series/parallel with a resistor in each series bit.

    e.g. if the working voltage/current was 4V/20mA, then you would say 4+4=8,
    leaving 1 volt over. Divide this by 20mA to get 50 ohms.
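
    In Python, that worked example (the 4 V / 20 mA working point is the
    hypothetical test-rig measurement above):

        # Two 4 V LEDs per string from a 9 V supply at 20 mA.
        supply_v = 9.0
        led_v = 4.0                                    # measured working voltage (assumed)
        leds_per_string = 2                            # 4 + 4 = 8 V fits into 9 V
        spare_v = supply_v - leds_per_string * led_v   # 1 volt left over
        r = spare_v / 0.020                            # 50 ohms per series string
        print(r)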

    Then you would wire them like this:

    9V DC --+-------+-------+-------+-------+
            |       |       |       |       |
           LED     LED     LED     LED     LED
            |       |       |       |       |
           LED     LED     LED     LED     LED
            |       |       |       |       |
          50ohm   50ohm   50ohm   50ohm   50ohm
            |       |       |       |       |
    0V -----+-------+-------+-------+-------+

    Make sure you don't go over the current rating of the power supply.

    You might be able to get away with wiring them up without any resistors,
    but this would make the current more sensitive to changes in the supply
    voltage, and you could blow the LEDs.

    You could also work out the combined parallel resistance of the resistors
    (R divided by the number of strings) and use a single high-power resistor
    instead. This might not compensate so well for variations in each LED
    though.
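
    For that single-resistor variant, a quick Python check (five 50 ohm
    strings at 20 mA each is just an assumed example):

        # One shared resistor in place of five 50 ohm per-string resistors.
        r_each = 50.0
        n_strings = 5
        r_combined = r_each / n_strings   # 10 ohms
        i_total = n_strings * 0.020       # 100 mA through the single resistor
        p = i_total ** 2 * r_combined     # .1 watt dissipated; rate it well above that
        print(r_combined, i_total, p)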
     
  5. Me

    Me Guest

    Hello,

    I finally got around to experimenting...

    For a Blue LED, with a 1000 ohm resistor,
    Off a 9V battery,
    the voltage was 9.57 without anything connected,
    9.48 with the LED/resistor on the battery.

    How many LEDs can I hook up to the following adapters I dug up:

    12VAC, 1A
    3V DC 500mA
    9VAC 650mA

    Does it matter if I use AC or DC (since it's an LED)?

    Thanks
     
  6. LEDs do not last long if more than a little reverse voltage is
    applied, especially blue LEDs. So DC is simpler to use than AC. But
    you need about 4 volts DC minimum to light blue LEDs and have enough
    extra voltage to waste a little to control the current. So that 3
    volt DC supply is probably not going to work very well (though it may
    put out 4 volts or so under light load). I would add a bridge
    rectifier to the 12 volt unit to convert it to DC and then connect in
    parallel several strings of 3 LEDs (anode of one to cathode of the
    next, etc.), each string having one 1000 ohm resistor to limit the
    current. If you have a multimeter, you can measure the current
    passing through one string (by putting the multimeter in series with
    the string) and adjust the value of the resistor till the current
    rises to around 10 milliamps. Then you can connect lots of similar
    strings across the supply (up to 100) before the total current exceeds
    the 1 amp rating of the supply.
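
    The headroom figure at the end, as a one-line Python check (10 mA per
    string and the 1 A supply rating are the numbers above):

        # How many 10 mA strings fit under the wall wart's 1 A rating?
        i_string = 0.010                  # about 10 mA per 3-LED string
        i_supply = 1.0                    # 1 A rated supply
        print(int(i_supply / i_string))   # up to 100 strings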
     