# Power Supplies - Common Terms and Limitations

What is a power supply (source) and how can I control the voltage and current?

I've seen these questions come up a large number of times:
"I have a 12V 2000mA power supply, but the device I have only needs 12V 350mA."
-or-
"How do I control the current and voltage going to my device?"

In this resource, I will attempt to answer the majority of these questions before they need to be asked. I'll also cover a couple of terms, such as Constant Voltage and Constant Current, and break down a battery to help you choose an ideal power source for your project.

Constant Voltage
This is how the vast majority of power sources operate. The plug in the wall supplies a known voltage. The adaptor for your cell phone? Same thing. The goal here is to always keep the voltage the same. The amount of current being drawn will vary based on the device, and the vast majority of devices will only take as much current as they need and do not need to be limited or regulated in this way. Just make sure that the voltage you give a device is the same as (or close enough to) the voltage it is designed for, and it will operate happily.

Constant Current
This is a little different, and it will seem to operate opposite to the Constant Voltage type of source. The goal of these devices is to always put out a known current. The most common examples of these are LED drivers. A constant current source will vary its output voltage in order to try to keep the current the same.

Power Supply Ratings
Now... here is where things may get a little more complicated. If you have a 12V 500mA power source, is it a Constant Current (CC) or Constant Voltage (CV) type of supply? If it does not say on the case, chances are it is a constant voltage adaptor that will always try to put out 12V. The 500mA rating is merely its limit. This power source can put out anywhere from 0mA up to 500mA, and if you attempt to draw any more than 500mA, the power source's voltage may droop, or the power source may shut off or fault!
Likewise, a 32V 350mA Constant Current (CC) power source will always put out 350mA but can only put out a maximum of 32V to maintain this. If the load requires more than 32V, the power source's current may droop, or it may shut down or fault just like the other supply.
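Using the 32V 350mA example above, we can work out the largest load the supply can drive while still regulating its current. (The 150 ohm load below is a made-up value chosen to show what happens past that limit.)

```python
# Constant-current supply from the text: 350 mA regulated, 32 V maximum
# ("compliance") output voltage.
I_SET = 0.350   # amps, the regulated current
V_MAX = 32.0    # volts, the most this supply can put out

# Ohm's law: the largest load resistance that still allows 350 mA.
r_max = V_MAX / I_SET
print(f"Max load resistance: {r_max:.1f} ohms")  # ~91.4 ohms

# A bigger load would need more than 32 V to pass 350 mA, so the
# current droops instead of holding at the set point.
r_load = 150.0  # ohms, hypothetical load that is too large for this supply
i_actual = min(I_SET, V_MAX / r_load)
print(f"Current into a {r_load:.0f} ohm load: {i_actual * 1000:.0f} mA")  # ~213 mA
```

The same arithmetic works in reverse for the CV supply: a 12V 500mA adaptor can only hold 12V down to a load of 12V / 0.5A = 24 ohms.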

What about adjustable supplies?
Well, there are power supplies that allow you to adjust Voltage AND Current... but these do not function as you might assume. These power sources operate in only one of the two modes mentioned above: CC or CV. A power supply cannot operate in both modes at the same time. These power supplies allow you to 'set' one and 'limit' the other, but you need to be sure the power supply you have can be configured to operate one way or the other. Many simply operate as a Constant Voltage source and only offer a limit adjustment for current. Like the other power sources, if you reach or exceed your limit, the power supply may cause the voltage or current to droop, either gradually or suddenly.

Why can't I control both?
Starting with some theory: Ohm's law, 'Voltage = Current * Resistance'.
This equation has 3 values; if you know two, you can find the third.
This also means that if you have already set two, you cannot control the third! You may think this is easy: just control the Voltage and Current, and the Resistance will be whatever it needs to be. Unfortunately, resistance is a property of the device or circuit you are using and cannot be freely controlled. The most you can do here is add or remove components to adjust the overall resistance of the device, if you are dead-set on making this alteration.
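A quick sketch of the point above, with made-up numbers (the 24 ohm load is illustrative, not from the text):

```python
# Ohm's law: V = I * R. Fixing any two determines the third.
V = 12.0   # volts, set by the supply
R = 24.0   # ohms, fixed by the device - you don't get to choose this
I = V / R  # amps - already determined, so it cannot be set independently
print(f"Current: {I * 1000:.0f} mA")  # 500 mA

# Trying to *also* set the current to, say, 350 mA is impossible without
# changing R: 12 V at 350 mA implies a different resistance entirely.
R_needed = V / 0.350
print(f"Resistance implied by 12 V at 350 mA: {R_needed:.1f} ohms")  # ~34.3 ohms
```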

Can I adjust the resistance of my device?
While this can be done with 'simple' devices by adding a resistor, many devices operate dynamically. Motors, for example, or active circuits that actually do something, will change how much current they draw. Their 'resistance' will vary...
If you put a resistor in line with a device and expect a certain voltage to arrive at the device, you will be disappointed to learn that if the device pulls more current, the expected voltage will droop, and this drop in voltage can be enough to cause the device to misbehave. Additionally, if the device pulls less current, the voltage will go up! This can actually damage the device!

I can settle for varying Voltage. How do I use a resistor?
You don't. Advanced users may. But if you have to ask how, you are not ready.
The power that a resistor will need to handle can be quite high. Consider dropping 12V to 5V at a 1000mA draw (common for cellular phone charging).
The resistor will get HOT! It will also need to be capable of handling more than 7 watts! Some of the most common resistors are rated at about 1/4W; these are the kind you are familiar with. Use one of these and expect smoke and/or fire. To make matters worse, the voltage would be horribly unstable... not just a little bit. If the device stopped drawing 1000mA and only took 500mA, you would shove about 8.5V into the poor thing.
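The numbers above can be checked directly from Ohm's law and the power formula:

```python
# Dropping 12 V to 5 V at 1 A with a series resistor, per the example above.
V_IN, V_DEV, I_FULL = 12.0, 5.0, 1.0  # volts, volts, amps

R = (V_IN - V_DEV) / I_FULL   # 7 ohms needed to drop 7 V at 1 A
P = (V_IN - V_DEV) * I_FULL   # power burned in the resistor
print(f"R = {R:.0f} ohms, dissipating {P:.0f} W")  # 7 ohms, 7 W

# If the device's draw falls to 500 mA, the drop across the resistor
# shrinks and the device sees more voltage than intended.
I_LOW = 0.5
v_dev_now = V_IN - R * I_LOW
print(f"Device voltage at 500 mA: {v_dev_now:.1f} V")  # 8.5 V
```

That 7 W is 28 times the rating of a common 1/4W resistor, which is why it ends in smoke.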

What about a potentiometer?
I've seen this brought up many times. Although in theory this works quite well, since you can make adjustments on the fly, there are two very unfortunate things standing in your way:
1 - You are not fast enough. Turning a potentiometer by hand, you cannot compensate quickly enough to keep the power supplied to the device from varying widely enough to cause damage.
2 - Energy is flowing through the potentiometer... it will actually need to handle some power, and most potentiometers are not built for handling any kind of power. They are meant to handle 'signals'. I've actually witnessed a classmate light one on fire by wiring it incorrectly on a breadboard during a class practice...

How do I change it then?
You can rely on 'voltage' regulators for this. If your supply is too low or too high, a regulator can make the required adjustments. The two most common regulators you will find are 'linear' and 'switch-mode' regulators. While it's possible to build your own, it's often much easier and more cost effective to simply buy one.
Linear regulators are often 3-pin devices that are incredibly easy to use, and cheap, but they can be very inefficient and cannot 'boost' your values higher than what is provided.
Switch-mode regulators are a little more complex, but are still very easy to use and relatively cheap. They are also much more efficient, and have the ability to operate as a 'boost' regulator allowing you to use a higher voltage than what was actually supplied. These are almost always used in USB power banks to bump the 3.7V battery up to 5V for use with USB.
A warning, though: these regulators are not magic! Linear regulators will simply waste the additional energy as heat. Switch-mode regulators will increase one quantity (voltage *or* current) and decrease the other roughly proportionally.
Regulators are incredibly important... Now, for most of your projects, there is a built-in regulator in the power supply... but sometimes you need to add your own. If you don't know, or can't count on, the power supply to output a steady voltage, then you should slightly oversize the supply and add your own regulator.
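To make the "not magic" warning concrete, here is a rough comparison for powering a 5V 1A load from a 12V source. The converter is treated as ideal (lossless), which real parts only approximate:

```python
# Linear regulator vs. an idealized switch-mode (buck) converter,
# stepping 12 V down to 5 V at 1 A.
V_IN, V_OUT, I_OUT = 12.0, 5.0, 1.0

# Linear regulator: input current equals output current; the extra
# voltage is simply burned off as heat.
p_out = V_OUT * I_OUT                    # 5 W delivered to the load
p_heat_linear = (V_IN - V_OUT) * I_OUT   # 7 W wasted as heat
eff_linear = p_out / (p_out + p_heat_linear)
print(f"Linear efficiency: {eff_linear:.0%}")  # ~42%

# Ideal buck converter: power in ~= power out, so as the voltage steps
# down, the current drawn from the source steps down with it.
i_in_buck = p_out / V_IN
print(f"Buck input current: {i_in_buck * 1000:.0f} mA")  # ~417 mA
```

Note that the buck converter draws less current from the source than it delivers to the load, which is exactly the "increase one quantity, decrease the other" trade mentioned above.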

Ok, so how do I pick a good supply?
There are a couple things to look out for when picking a supply. And I'll try my best to point them out here.
Batteries
First, we can look at batteries. The best way to understand a battery is to break it down into two parts: a constant voltage source with a resistor in series with it. The resistor represents the battery's internal resistance. This means that the more current a battery puts out, the more its output voltage drops because of this internal resistance. Some batteries are designed with a very low internal resistance and can pack a wallop! Car batteries, for example... Of course, there are also batteries with higher internal resistances, like button cell batteries. When choosing a battery, you should keep in mind the amount of current you want to draw and size the batteries accordingly... Of course, you can always add more batteries in parallel to compensate.
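The two-part battery model above can be sketched in a few lines. The voltage and internal resistance here are illustrative values for a hypothetical 9V battery, not from any datasheet:

```python
# Battery model from the text: an ideal voltage source in series with
# an internal resistance.
V_EMF = 9.0   # volts, open-circuit voltage of a hypothetical 9 V battery
R_INT = 2.0   # ohms, assumed internal resistance (button cells run far higher)

def terminal_voltage(i_load):
    """Voltage seen at the battery terminals while sourcing i_load amps."""
    return V_EMF - R_INT * i_load

print(f"At 10 mA:  {terminal_voltage(0.010):.2f} V")  # 8.98 V
print(f"At 500 mA: {terminal_voltage(0.500):.2f} V")  # 8.00 V
```

Putting two such batteries in parallel roughly halves the internal resistance, which is why paralleling cells helps with heavy loads.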
AC-DC Adaptors (Commonly Wall-warts)
Second, let's look at those 'wall-wart' AC-DC adaptors. There are two kinds, and choosing the wrong one can cause some unexpected results. The heavy ones usually contain a transformer and some very simple filter components without regulation. They typically operate at the advertised voltage only when they are being run close to their current limit. When you draw less current from these types of adaptors, the voltage will be higher! When choosing an adaptor, it's always a good idea to slightly over-size the current limit based on what your device draws. Some of these units do contain voltage regulation, so it's always in your best interest to measure the output with a multimeter before using it in a project. (Measure the voltage with no device connected to the power supply. If the voltage is as expected, it has voltage regulation; if the voltage is higher than expected, it does not.)
The other type is smaller and lighter and is a 'switch-mode' supply. Most cell phone chargers are this type; they contain voltage regulators, operate as a constant voltage source, and their output will hardly vary at all while they operate.*
* Some switch-mode supplies try to be smart by shutting off if they think they aren't being used... so if you don't pull enough current from them, they either won't turn fully on, or will only run for a short while after being powered on.

With that, I hope some common questions have been covered. I'll be making slight adjustments as I come across more questions and concerns about power supplies.
Additional Details, terms and ideas can be found in the resource discussion.