# Linear voltage regulator problem (LM338)

Discussion in 'General Electronics Discussion' started by BlinkingLeds, Mar 28, 2013.

1. ### BlinkingLeds
Hi, I made a constant current driver with the LM338 for a 10 W LED. When I first turn it on and set it to 800 mA, it slowly climbs to 1 A and beyond without stopping. At first I thought my multimeter was playing games, so I connected a needle panel ammeter and it showed the same. This also happens with the LM317.
Another problem with the LM338 is that it does not keep the current constant for some reason: with a small (3 V, 20 mA) LED it shows 18 mA, and when I connect a 1 W LED (3 V, 300 mA nominal) it shows about 14 mA.
The LM317 stays at 18 mA whatever load I connect to it.

Last edited by a moderator: Mar 29, 2013
2. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd (Moderator)

Show us the circuit.

3. ### Harald Kapp (Moderator)

See figure 14 in the datasheet. If your circuit is different, show us, as Steve already requested.

Apart from that: The LM338 is a 5A regulator. Fairly oversized for use at 800mA.

4. ### BlinkingLeds
I tried it again and it seems that the LED itself is drawing more and more current as it heats up; I tried it without the voltage regulator, straight off the power supply.
Shouldn't the voltage regulator prevent the LED from drawing more current than I set it to?

Attached: 123.jpg (10.7 KB)

5. ### BlinkingLeds
Oh, and I don't have the capacitor connected. Could that be the problem?

6. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd (Moderator)

What value resistor are you using for R1?

Are you sure you have the leads to the LM338 connected around the right way?

Are you sure it's an LM338?

If the current is increasing (at all) especially as the LED heats up, then you have made a mistake rather than a constant current source.

7. ### BobK

Yes, a mistake is one of the easiest electronics projects to make!

Bob

8. ### BlinkingLeds
I have a potentiometer for R1. For the 800 mA setting, R1 is 0.1 Ω. If I set it to 400 mA, it is 3 Ω, and it slowly increases: in about 30 seconds it goes to 500 mA. If I allow the LED to cool and then try again, it starts from 400 mA again.

The LM338 is connected as in the picture. I also tried a 100 nF capacitor from pin 1 to the negative terminal of the 12 V power supply; same result.

9. ### Harald Kapp (Moderator)

The equation says I = Vref/R1, so R1 = Vref/I.
With Vref = 1.25 V: R1 = 1.25 V / 0.8 A ≈ 1.56 Ω.
How do you arrive at 0.1 Ω? Using 0.1 Ω would lead to a current of 12.5 A, way outside the limitations of the LM. In that case the internal current limiting of the LM will jump into action. Depending on the input voltage and the cooling provided, this can be anywhere ≤5 A.

A potentiometer is not suitable for R1. R1 will dissipate P=Vref*I=1.25V*.8A=1W. A potentiometer is typically not rated for such high power levels. It will become warm and the temperature coefficient of the potentiometer will change the resistance, thus changing the current. Use a fixed resistor with sufficient power rating.

And yes, add the capacitor, it is required by the LM for stable operation.

Btw: the current is given in mA, not mAh.
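The sizing math above can be sketched in a few lines of Python. This is only an illustration of the two formulas in this post (I = Vref/R1 and P = Vref·I for the standard two-terminal current-source configuration); the function names are my own.

```python
# Sketch of the LM338/LM317 constant-current-source sizing math.
# Assumes the two-terminal configuration (datasheet figure 14), where
# the regulator holds Vref = 1.25 V across the programming resistor R1.

VREF = 1.25  # V, nominal reference voltage across R1

def r1_for_current(i_out):
    """Resistor value (ohms) that programs an output current i_out (amps)."""
    return VREF / i_out

def r1_power(i_out):
    """Power (watts) dissipated in R1 at that current."""
    return VREF * i_out

for i in (0.8, 0.4, 0.02):
    print(f"{i * 1000:.0f} mA -> R1 = {r1_for_current(i):.2f} ohm, "
          f"P = {r1_power(i):.2f} W")
```

For the 800 mA setting this gives R1 ≈ 1.56 Ω dissipating 1 W, which is why a 1/4 W potentiometer is unsuitable here.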


10. ### BlinkingLeds
The LED itself (straight off the power supply) draws about 2.5 A.

OOPS, sorry, mAh is mA·hours; I'm going to change everything.
Thanks for pointing out that silly mistake.

For some reason the potentiometer doesn't get hot at all; it's ice cold. It's 1/4 W.


11. ### BlinkingLeds
OK, I used a 1.8 Ω 5 W resistor and it starts at 450 mA and AGAIN increases.

12. ### Harald Kapp (Moderator)

Operating an LED off a power supply without current limiting is not a good idea. Unless the LED has its own current limiter, that is.

What is your input voltage, what are the ratings of the LED?

The input voltage of the LM needs to be at least 2 V higher than the LED voltage plus Vref (across R1), i.e. 3.25 V above the LED voltage.
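That headroom rule can be written out as a quick check. A minimal sketch, assuming the ~2 V dropout figure and the 1.25 V reference quoted in this post (the function name is my own):

```python
# Minimum supply voltage check for the LM338 constant-current LED driver.
# Assumes ~2 V regulator dropout plus the 1.25 V reference across R1.

VREF = 1.25      # V across the programming resistor R1
DROPOUT = 2.0    # V, approximate dropout needed by the regulator

def min_input_voltage(v_led):
    """Minimum supply voltage (V) to regulate an LED with forward voltage v_led."""
    return v_led + VREF + DROPOUT

# The 10 W LED at up to 12 V forward would need roughly 15.25 V,
# so a 12 V supply cannot regulate it; the 2.5 V red LED needs ~5.75 V.
print(min_input_voltage(12.0))
print(min_input_voltage(2.5))
```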

13. ### BlinkingLeds
OK, when I say straight off the power supply I mean with a series resistor. The odd thing is that it has the same behavior, so the voltage regulator is not what's causing the LED to draw more current.

I have a 12 V computer power supply and/or a 9 V linear adapter; same results with both.

LEDs:
10 W white chip, 900 mA, 9–12 V
1 W 10 mm red, 2.5 V
Same problem with both LEDs, with or without the LM338 (the 1 W LED is in series with a 100 Ω resistor when connected without the LM338).

15. ### BobK

7,682
1,688
Jan 5, 2010
Well, there is your problem. If the LED is going to run at 12 V, you need 14 V minimum input to the regulator. What is happening is that the regulator does not have the headroom it needs to regulate.

Bob

16. ### BlinkingLeds
OK, and how about the 1 W LED? It only needs 2.5 V, so why does it behave the same way?

17. ### BobK

Good point. I think there still must be something wrong with your circuit. The fact that you had a 0.1 Ω resistance and it acted no different than with a 1.8 Ω resistor says that something is not right.

Bob

18. ### Harald Kapp (Moderator)

It would help if you showed us your complete circuit, not only that excerpt from the datasheet.


19. ### BlinkingLeds
But it is the complete circuit, except that I have the LED (1 W or 10 W) where the load is shown in the diagram.

20. ### BobK

Is this on a breadboard? How about a photo so we can verify the circuit?

Bob