Discussion in 'General Electronics Discussion' started by h2k, Nov 2, 2013.

1. ### h2k

With respect to a voltage regulator, such as the one shown here, I am confused about a few things:

http://www.amazon.com/gp/product/B00BYTEHQO/ref=oh_details_o01_s00_i00?ie=UTF8&psc=1

1. If the power source is a battery, will the output voltage from the regulator be constant, even as the battery is discharged (and the input voltage drops)?

2. They do "waste" power, correct? In my case, I want a 6V output, but would like to use 8 (1.2V) batteries in series to make it last longer. Could I just use the regulator to get down to 6V and still utilize the full capacity of the batteries (with little or no wasted power)?

3. Conversely, could I also use a step-up regulator with a single 1.2V cell to get to 6V? That would be handy if weight/size is important.

2. ### Harald Kapp (Moderator)

1) Yes, that's the purpose of a regulator, as long as you observe the limits of the regulator. In particular, the data on that page states "continuously adjustable, the input voltage must be 1V higher than the output voltage".

2) Yes, every regulator "wastes" power. The loss is quantified by efficiency: the ratio of output power to input power. Obviously this number can range from 0 to 1. A good switch-mode regulator will be on the order of 90% efficient, meaning that 10% of the input power is lost within the regulator. A switch-mode regulator like this one is typically an efficient way to regulate an output voltage. A linear regulator would be much less efficient. The details depend on the design of the regulator, the difference between input and output voltage, and the current.
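To see why the linear option is so much worse here, you can compare the best-case efficiencies with a quick calculation (a sketch, using the 8-cell pack from the question; `linear_efficiency` is just an illustrative helper, not part of any library):

```python
# Best-case efficiency of a linear regulator: the pass element drops
# (V_in - V_out) at the full load current, so the fraction of input
# power that reaches the load can never exceed V_out / V_in.

def linear_efficiency(v_in, v_out):
    """Ideal-case linear regulator efficiency (ignores quiescent current)."""
    return v_out / v_in

# Example from the thread: 8 x 1.2 V cells (9.6 V) regulated down to 6 V.
print(linear_efficiency(9.6, 6.0))   # 0.625 -> at best 62.5% efficient
# A decent switch-mode regulator stays around 90% regardless of the step ratio.
```

In other words, a linear regulator would throw away more than a third of the battery's energy as heat in this case, which is why the switch-mode module is the better fit.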
The regulator you linked to has a built-in voltmeter with an LED display. This will use some power, so the efficiency of the module will be reduced. If you don't need the voltmeter, find another regulator without one. This will increase battery life.

3) Yes, you can. Note that the input current scales with the ratio of output voltage to input voltage, since the power drawn at the output cannot be higher than the power supplied at the input. Example: if the output is 6V*1A, the input is at least 1.2V*5A. Taking efficiency into account, that rises to 1.2V*(5A/efficiency), which is a bigger number since the efficiency is <1.
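That scaling is easy to check numerically (a sketch; `boost_input_current` is an illustrative helper, and the 90% figure is just the typical efficiency mentioned above):

```python
# Input current drawn by a step-up (boost) regulator.
# Conservation of power: V_in * I_in * efficiency = V_out * I_out,
# so I_in = (V_out * I_out) / (V_in * efficiency).

def boost_input_current(v_in, v_out, i_out, efficiency):
    """Return the input current a boost converter draws from the battery."""
    p_out = v_out * i_out
    return p_out / (v_in * efficiency)

# Example from the post: 6 V at 1 A output from a single 1.2 V cell.
print(boost_input_current(1.2, 6.0, 1.0, 1.0))   # lossless case: 5.0 A
print(boost_input_current(1.2, 6.0, 1.0, 0.9))   # 90% efficient: ~5.56 A
```

Note how large that input current gets: a single 1.2V cell would have to supply well over 5A continuously, which many small cells can't do for long, so the weight saving comes with a real trade-off.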

3. ### h2k

So what happens when the voltage from the battery drops to the point where it can no longer meet the specified input voltage? Does the regulator just shut off? That would be the best thing for me. Or does it continue to use the battery, but with a reduced output voltage?

Also, I noticed when using this that if I set the output to 12V and then powered on the unit, the supplied voltage dropped to about 11.5V. Powering the unit off brought the output back up to 12V. So the output changes under load... what is the best practice for this?

4. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd (Moderator)

No, typically it just loses regulation and the output voltage starts to fall.

The problem with the load being shut off is that as this happens, the load on the battery is reduced, so the battery voltage rises and the load is powered again. Then the battery voltage falls, the supply switches off, the voltage rises again, ... and this cycle goes on for some time.

There are ways around that, but they need to be tuned somewhat for your application.

I think you're seeing the practical effect of the load current influencing the minimum voltage required across the regulator.

If increasing the input voltage by 0.5V fixes this, then that's the minimum you'll need to do.