Maker Pro

Which method is easier and preferable on electronics?

Mustwin351

Joined: Apr 10, 2013
Messages: 49
I want to run a small 12 V load off a 12 V DC 5 A adapter with a timer.

Is it preferable to leave the power supply plugged in constantly and have a 12 V timer switch the load, or to use a 120 V timer to switch the power supply on and off, which in turn would power the load?

The load is a few LEDs switched on and off twice daily.

I was considering just switching the load because it would be easier on the power supply.

My goal is to increase the longevity of my project.
 

shumifan50

Joined: Jan 16, 2014
Messages: 579
Considering the cost of power supplies versus the cost of energy, it would seem a waste of energy to switch only the load. My instinct would be to switch the input to the power supply (if, as I understand from your post, you are using a separate timer external to your LED circuit).
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator

Joined: Jan 21, 2010
Messages: 25,510
Agreed.

It seems there is nothing in either the power supply or the load that would warrant switching the load rather than the supply (as a counter-example, I would recommend switching the load if you were turning the LEDs on and off once per second).

Mains timers can be had very cheaply. However, they do consume some power themselves, so it is reasonable to ask whether you would actually see a net power saving.
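
To put rough numbers on that question, here is a minimal sketch in Python. The idle and standby figures in it are assumptions for illustration, not measurements of your adapter or timer, and the conclusion flips depending on which of them is larger:

```python
# Rough annual-energy comparison for the two arrangements.
# All idle/standby figures below are guesses, not measured values.

HOURS_PER_YEAR = 24 * 365

adapter_idle_w = 1.0        # assumed no-load draw of the 12 V adapter
mains_timer_w = 1.0         # assumed draw of a plug-in mains timer
led_on_hours = 8 * 365      # assumed total on-time of the LEDs per year

# Option A: adapter always powered, 12 V timer switches the load.
# Wasted energy is the adapter idling whenever the LEDs are off.
waste_a_kwh = adapter_idle_w * (HOURS_PER_YEAR - led_on_hours) / 1000

# Option B: mains timer switches the adapter on and off.
# Wasted energy is the timer's own draw, all year round.
waste_b_kwh = mains_timer_w * HOURS_PER_YEAR / 1000

print(f"Switch the load:   ~{waste_a_kwh:.1f} kWh/year wasted")
print(f"Switch the supply: ~{waste_b_kwh:.1f} kWh/year wasted")
```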

The power supply is subject to transients while it is powered on, and these cause both cumulative stress and, if a transient is large enough, the risk of sudden failure. Keeping the power supply connected to the mains for only half of each day essentially doubles its life -- more accurately, it statistically doubles the time between these sorts of failures.
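
To make the "statistically doubles" point concrete, here is a tiny sketch; the failure rate used is invented purely for illustration:

```python
# Expected time between transient-related failures scales inversely with
# the fraction of time the supply is connected to the mains.
# The rate below is an invented figure for illustration only.

failures_per_powered_year = 0.05   # assumed average rate while powered

for connected_fraction in (1.0, 0.5):   # always connected vs. half of each day
    mtbf_years = 1 / (failures_per_powered_year * connected_fraction)
    print(f"Connected {connected_fraction:.0%} of the time: "
          f"~{mtbf_years:.0f} years between transient-related failures")
```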

You could add your own power-line filtering, and that may largely eliminate this. However, if the power supply gets warm whilst connected to the mains, you also have to consider the life of the capacitors, which will be reduced simply by sitting at a higher temperature.

There are even more considerations, and it's very hard to say which way the decision would fall.

Your goal may be achieved using various methods, but it's hard to say whether you'll get the maximum possible life or just a good life.

More considerations that will contribute to an extended life (a rough check of points 1 and 5 follows the list):

1) Is your power supply well overrated for the application (rated, say, for 200% of the load)?
2) Does the power supply have inrush current limiting?
3) Are you employing a constant-current driver for the LEDs?
4) Are the LEDs well heatsinked?
5) Are the LEDs operating well within their specified current limits (say, no more than 50%)?
6) Does the power supply have good input filtering?
7) Does the power supply have overrated (or generously specced) components?
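
Here is a rough check of points 1 and 5. Only the 12 V 5 A rating comes from the original post; the load and LED currents are example figures, not your actual parts:

```python
# Rough derating check for points 1 and 5 above.
# Only the 5 A supply rating comes from the thread; the rest are examples.

supply_rated_a = 5.0      # 12 V 5 A adapter from the original post
load_current_a = 1.2      # assumed total LED load current
led_rated_ma = 350        # assumed LED maximum rated current
led_drive_ma = 150        # assumed actual drive current per LED

supply_margin = supply_rated_a / load_current_a   # want >= 2 (point 1)
led_fraction = led_drive_ma / led_rated_ma        # want <= 0.5 (point 5)

print(f"Supply rated at {supply_margin:.1f}x the load "
      f"({'OK' if supply_margin >= 2 else 'marginal'})")
print(f"LEDs driven at {led_fraction:.0%} of rated current "
      f"({'OK' if led_fraction <= 0.5 else 'driven too hard'})")
```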

I recall that someone posted an image of a power supply here a day or so ago. It had places on the board for input protection -- the silk screen showed that inductors were to be fitted in one position. However, during manufacture, these had been replaced with simple wire links. A quality power supply will not be a cheap one. In the end, the longevity is significantly determined by the quality of the design, engineering, and manufacture of parts that you're probably going to buy off the shelf, and over which you're unlikely to be able to exert any control.
 