# Two voltage regulators in a circuit?

Discussion in 'Electronic Basics' started by chris, May 22, 2004.

1. ### chrisGuest

I have a 12v regulated power supply (wall adapter) that feeds some
devices. Can I then put a voltage regulator feeding off of the
regulated wall adapter to step down the voltage to 2.4v and expect it
to work properly?

Background: I have a 12v camera and 12v microphone (baby cam) plus a
few IR LEDs, and I power them all off of the wall adapter. I was
considering regulating the power down to 2.4v for the LEDs using an
LM317T, but perhaps I'm making too much of this. Should I just use a
resistor to drop the voltage to the LEDs, since the wall adapter is
already regulated?

2. ### Byron A JeffGuest

-I have a 12v regulated power supply (wall adapter) that feeds some
-devices. Can I then put a voltage regulator feeding off of the
-regulated wall adapter to step down the voltage to 2.4v and expect it
-to work properly?

Probably. The bigger issue is the amount of heat generated by dropping
nearly 10V across the regulator. How much current are we talking about?
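
To see why the current matters, here is a quick sketch of the dissipation math for a linear regulator like the LM317T: the whole input-output difference is dropped across the pass element, so power lost is (Vin - Vout) x I. The load currents below are made-up examples, not figures from the thread.

```python
# Rough linear-regulator dissipation check (a sketch; the example load
# currents are assumptions, not measured values from the baby cam setup).

def regulator_dissipation(v_in, v_out, current_a):
    """Power (watts) dissipated in a linear regulator: the full
    input-output voltage difference times the load current."""
    return (v_in - v_out) * current_a

# Dropping a regulated 12 V down to 2.4 V:
for i in [0.02, 0.1, 0.5]:  # 20 mA, 100 mA, 500 mA loads
    p = regulator_dissipation(12.0, 2.4, i)
    print(f"{i * 1000:.0f} mA load -> {p:.2f} W in the regulator")
```

At a few hundred milliamps the regulator is already into heatsink territory, which is why the amount of current is the first question to ask.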

-
-Background: I have a 12v camera and 12v microphone (baby cam) plus a
-few IR LEDs, and I power them all off of the wall adapter.

Oh. LEDs. That's a different story.

- I was considering
-regulating the power down to 2.4v for the LEDs using a LM317T but
-perhaps I'm making too much of this.

Yes.

- Should I just use a resistor to
-drop the voltage to the LEDs since the wall adapter is already
-regulated?

Definitely. Since it's going to be a few LEDs, I would suggest stringing them
in series. IR LEDs seem to have voltage drops between 1.5V and 1.9V, so I'd
probably string about six in a row and use a resistor to drop the remaining
voltage and limit the current. I'd suggest being conservative with both the
Vf and current ratings of the LEDs, so use a resistor that's the next step
up from your computed value. Let's say you have 6 LEDs with a 1.6V Vf and a
continuous current rating of 20mA. The 6 LEDs will drop 9.6V, with 2.4V
remaining. So R=V/I -> 2.4/0.020 -> 120 ohms. I'd be conservative and use a
150, giving a current of 2.4V/150 ohms -> 16mA.
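The worked example above can be sketched as a small calculation, including the conservative "next step up" choice of resistor. Assumptions: the E12 preferred-value series for standard resistors, and the same 1.6V Vf / 20mA figures used above.

```python
import math

# Series-LED resistor sizing, as in the worked example above (a sketch;
# the E12 preferred-value list and the 1.6 V / 20 mA LED figures are
# assumptions, not datasheet values).

E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]

def next_e12_up(r):
    """Smallest E12 standard value strictly above r ("next step up").
    A small tolerance keeps float rounding from changing the answer."""
    decade = 10 ** math.floor(math.log10(r))
    for mult in E12 + [100]:  # 100 rolls over into the next decade
        candidate = mult / 10 * decade
        if candidate > r + 1e-6:
            return candidate

def series_led_resistor(v_supply, vf, n_leds, i_rated):
    """Exact resistor, conservative standard choice, and actual current."""
    v_remaining = v_supply - n_leds * vf
    r_exact = v_remaining / i_rated
    r_chosen = next_e12_up(r_exact)
    i_actual = v_remaining / r_chosen
    return r_exact, r_chosen, i_actual

r_exact, r_chosen, i_actual = series_led_resistor(12.0, 1.6, 6, 0.020)
print(f"exact {r_exact:.0f} ohms -> use {r_chosen:.0f} ohms "
      f"-> {i_actual * 1000:.0f} mA")
```

This reproduces the numbers above: 120 ohms computed, 150 ohms chosen, about 16 mA through the string.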

Hope this helps,

BAJ

3. ### Rheilly PhoullGuest

If there are to be several LEDs, maybe you should wire them in series with
an appropriate resistor?

4. ### John PopelishGuest

There is certainly that possibility, assuming you do it correctly.
(hint: read the regulator data sheet for all the details).
LEDs are current operated devices. That is, they are specified to
produce a certain amount of light when a specified current is passed
through them. This process requires some voltage to be dropped across
them, but that voltage is not precisely specified. If you apply a stiff
(well regulated) voltage to an LED, and it is just a little too low, the
current will be way lower than you expect and very little light will
be produced. If the voltage is just a bit too high, the current will
be way higher than you expect and the LED will probably go into
thermal runaway (as the temperature rises, the current will keep
going up till the device is destroyed). This difficulty with stiff
voltage sources is the reason most LEDs are driven with a series
resistor. It helps to control variations in current.

I would probably wire as many LEDs in series as it took to use up
about 60 to 80% of the available voltage and calculate a resistor to
use up the rest. For instance, if the LEDs have a typical voltage
drop of 1.8 volts at a rated current of 20 milliamps, then 4 of them in
series would drop about 7.2 volts
(60% of 12 volts). A 240 ohm resistor would use up the rest of 12
volts (4.8 volts) with 20 milliamps passing through it. This way you
light 4 LEDs with 20 ma from the 12 volt source. If you wired 5 LEDs
in series, they would need about 9 volts and a 150 ohm resistor in
series to waste the remaining 3 volts while passing 20 milliamps. The
current from the supply would still be 20 milliamps but would vary a
bit more if the LEDs did not drop exactly 1.8 volts as expected.
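
The trade-off in that last sentence can be checked numerically: the more voltage the resistor drops, the less the current shifts when the LEDs don't have exactly their typical Vf. This is a sketch; the plus-or-minus 0.1 volt Vf spread is an assumed illustration, not a datasheet figure.

```python
# Current sensitivity of a series LED string to Vf variation.
# Compares the 4-LED/240-ohm and 5-LED/150-ohm options from the post
# above, with an assumed +/-0.1 V spread on each LED's forward drop.

def led_current(v_supply, vf, n_leds, r_ohms):
    """Current through a string of n_leds in series with r_ohms."""
    return (v_supply - n_leds * vf) / r_ohms

for n, r in [(4, 240), (5, 150)]:
    nominal = led_current(12.0, 1.8, n, r)
    low = led_current(12.0, 1.9, n, r)   # every LED drops 0.1 V more
    high = led_current(12.0, 1.7, n, r)  # every LED drops 0.1 V less
    print(f"{n} LEDs, {r} ohm: {low * 1000:.1f} to {high * 1000:.1f} mA "
          f"(nominal {nominal * 1000:.1f} mA)")
```

Both strings sit at 20 mA nominally, but the 5-LED string swings farther from that value for the same Vf spread, because its resistor has less voltage headroom to absorb the variation.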

5. ### chrisGuest

Thanks guys, that helps a lot.