# What is the real voltage from an AC/DC adaptor?

Discussion in 'Electronic Basics' started by trudeu, Dec 20, 2004.

1. ### trudeuGuest

I often find that the voltage listed on the adaptor is not the voltage that
it gives out, e.g. it might say 4.5 V, but when I put the voltmeter on it, it
shows 5.8 V or something when not under load.

It is my understanding that the real voltage it gives out can only be
measured under load.
My question is: is there a way to test adaptors to see the real voltage they
give? While I'm sure they are relatively resilient to breaking, I have
several that are old and I want to test them to see what the real voltage is
that they produce. Some even have their markings missing.

Thanks

2. ### Guest

At the risk of sounding simplistic, why not just measure the voltage
while the adaptor is under a load? You can use either an actual
load that you intend to use the adaptor for, or a resistor to simulate
the load (provided its power rating is high enough).

-- Mark

3. ### Dominic-Luc WebbGuest

Perhaps there is an "intended" load that will pull it down to the
indicated voltage? They are usually a little higher than marked. However,
I have encountered some rated 9 VDC that do not pull
down even to 12 VDC at the maximum power consumption implied by the
watts labelled on the transformer. In one such case, the unit gives 14 volts
without a load. I have long wondered if there exists a conspiracy
in which electronics manufacturers mislabel adapters and warn you that other
adapters will either not give sufficient voltage or will damage the device with
overvoltage. They even tell you in the instruction manual that the device only
works with their particular adapter. Current examples are most digital cameras.

One observation I have made is that different meters show different values.
The DC output of these adapters is presumably achieved by a rectifier, and
I have seen considerable variability between different rectifiers attached
to the same transformer. In this case, I would expect the output ripple
frequency to reflect the 50-60 Hz AC input, although I am suspicious that
some units half-rectify while others give full rectification. The meters
I have used to measure voltage from these differ markedly. One of my older
analog meters does not accurately read rectified DC. In some cases, it seems
to respond to a sharp spike by indicating a higher voltage (some kind of
overshoot?), and in other (most) cases it reads lower. They clearly do not
show the same reading as they would for a steady source like a battery. Now
that I finally have an oscilloscope, I get to see these aberrations. I have
seen considerably different on-off times (duty cycles) for the pulses in
some cases, for instance.
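One reason meters can disagree on rectified-but-unsmoothed DC: an average-responding meter and a true-RMS meter report different numbers for the same waveform, and half-wave vs. full-wave rectification changes both. A rough numerical sketch (the 10 V amplitude is just an illustrative value, not from any adapter mentioned here):

```python
import math

def rectified_stats(amplitude, full_wave=True, n=100_000):
    """Average and RMS of one cycle of a rectified sine (no smoothing cap)."""
    total = sq_total = 0.0
    for i in range(n):
        v = amplitude * math.sin(2 * math.pi * i / n)
        if full_wave:
            v = abs(v)        # full-wave: negative half-cycle flipped positive
        else:
            v = max(v, 0.0)   # half-wave: negative half-cycle simply blocked
        total += v
        sq_total += v * v
    return total / n, math.sqrt(sq_total / n)   # (average, RMS)

# For a 10 V peak: full-wave averages ~6.37 V (RMS ~7.07 V), while
# half-wave averages only ~3.18 V (RMS 5.0 V) -- four quite different
# numbers a meter might report depending on how it responds.
print(rectified_stats(10.0, full_wave=True))
print(rectified_stats(10.0, full_wave=False))
```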

I'll let the talented engineers in here explain all this.

Dominic

4. ### trudeuGuest

Will that work? Being that I'm asking this question, I'm a simplistic guy.

So what you're saying is, as long as I put a load on it with the right power
rating, I will get the consistent "real" voltage coming from the adaptor.

Just so I understand you,
e.g. if I have two devices that draw slightly different voltages, say 5.5 V at
250 mA and 6 V at 300 mA (assuming that the 5.5 V device is rugged enough to
take a slightly larger voltage/current, as many are, I think), and say you use
an AC/DC adaptor rated 6 V DC at 300 mA: would the voltage, if I tested it
with a voltmeter where it goes into the device, be the same for both devices?
(Hopefully what the AC/DC adaptor is rated at?)

Thanks

5. ### Robert MonsenGuest

In addition to the voltage rating, there is usually a current rating
(Something like 300mA, for example). The voltage on the output is
generally guaranteed to be above the rated voltage if you are drawing
less than that amount of current out of the thing.

So, you can test them by doing the following:

1) Assume you have one with a voltage rating V and a current rating I.
2) Compute the resistance required to draw that current at that
voltage, as R = V/I.
3) Get a resistor of that value with a power rating at least twice V*I.
4) Connect it across the + and - outputs.
5) If the voltage is less than the rated voltage, your adapter is hosed.

If you can't find resistors that are big enough, you can parallel
resistors. If you parallel N resistors, the power rating will go up by a
factor of N, and the resistance of the entire set will go down by a
factor of N.

Thus, say you have a 5V 300mA adapter you want to test. You need a
resistor of 16.666 ohms that has a power rating of at least 3W. You can
make this out of 1/4W resistors by using 12 of them in parallel. Each
resistor's value needs to be 16.666 * 12 = 200 ohms.

Parallel means connecting the resistors like a ladder, with the rungs being
the resistors and the wires being the sides of the ladder. The sides are
where you connect the + and - of the adapter, and where you measure.
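The steps above can be sketched as a small calculation; the 2x power margin and the 1/4 W unit resistors follow the post, and `math.ceil` just rounds the parallel count up:

```python
import math

def dummy_load(v_rated, i_rated, unit_power=0.25):
    """Size a test resistor for an adapter rated v_rated volts at i_rated
    amps, plus an equivalent bank of small parallel units (e.g. 1/4 W)."""
    r = v_rated / i_rated                # step 2: R = V/I draws rated current
    p_min = 2 * v_rated * i_rated        # step 3: at least twice V*I
    n = math.ceil(p_min / unit_power)    # parallel count to reach that rating
    return r, p_min, n, r * n            # r*n = value of each parallel unit

# Robert's 5 V / 300 mA example: ~16.67 ohms rated for at least 3 W,
# or twelve 200-ohm 1/4 W resistors in parallel.
print(dummy_load(5.0, 0.300))
```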

--
Regards,
Robert Monsen

"Your Highness, I have no need of this hypothesis."
- Pierre Laplace (1749-1827), to Napoleon,
on why his works on celestial mechanics make no mention of God.

6. ### trudeuGuest

This is exactly why I asked the question. I sort of understand what you are
saying. It seems to me, as a layperson, that AC/DC adaptors often give off
different values than they are supposed to, which is why I'm trying to see
how I can test them. The only reason I think they are somewhat sloppy is
that most devices can take the hit of a volt or two without any ill effect.
Short term, anyway.

7. ### tempus fugitGuest

I have noticed this phenomenon myself. Part of the problem is that most (i.e.,
almost all) wall warts are unregulated, so the output varies with the load.
If the adaptor were regulated, the voltage would be consistent regardless of
the load. That being said, to find the actual output (which will vary based
on load), put a resistor across the + and - points of the plug and see what
the output is, as per the other poster's advice.
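A common way to picture an unregulated wall wart is as an ideal source behind an internal resistance, so the terminal voltage sags as current is drawn. A minimal sketch of that model (the 12.5 V open-circuit figure and the 9 V / 300 mA rating are hypothetical, not from any datasheet):

```python
def terminal_voltage(v_open_circuit, r_internal, i_load):
    """Unregulated supply modeled as a Thevenin source: V = Voc - I * Rint."""
    return v_open_circuit - i_load * r_internal

# Hypothetical "9 V @ 300 mA" wart that reads 12.5 V unloaded.
# Pick Rint so it lands at 9 V exactly at rated current:
v_oc = 12.5
r_int = (v_oc - 9.0) / 0.300   # about 11.7 ohms

print(terminal_voltage(v_oc, r_int, 0.300))  # ~9 V at full rated load
print(terminal_voltage(v_oc, r_int, 0.050))  # well above 9 V at a light load
```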

8. ### Guest

Okay, let me back up a little. This is somewhat lengthy,
so bear with me.

For the AC/DC adapter, (or any DC power supply), one
could make a graph of the output voltage vs. the current
while it is operating, for different load conditions.
The load could be a simple resistor, which is changed from
zero ohms, to several non-zero values, to infinite ohms (open
circuit). The graph will trace out current and voltage
outputs from that supply, and might look something like you'll
see at the following URL:

See Figure 8 on p. 6 of:
http://64.224.241.100/products/data/pdf/cmpwr330.pdf

If a power supply is rated for 6 V and 300 mA, that refers to
just one possible operating point on the voltage-current curve.
As Robert said, it usually means that you'll get at least 6 V
whenever the current is 300 mA or less.

Similarly, for a device that is to be powered by the DC supply,
one could make another graph of voltage vs. current. If the
device is a simple resistor, this graph will be a straight line,
given by the well-known equation V = I R. This will intersect
with the current-voltage curve of the power supply **somewhere**.
That intersection point gives the voltage and current when this
particular device is connected to this particular power supply.

In other words:
The device **must** operate somewhere on its current-voltage
curve, and the power supply must operate somewhere on **its**
current-voltage curve. The intersection point of the two
curves represents the only current and voltage that makes both
of those statements true.
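That intersection can be computed directly when both curves are simple lines: model the supply as an open-circuit voltage behind an internal resistance (a downward-sloping line) and the device as a plain resistor (a line through the origin). All the numbers here are illustrative:

```python
def operating_point(v_oc, r_int, r_load):
    """Intersection of the supply line V = v_oc - I*r_int
    with the resistive-load line V = I*r_load."""
    i = v_oc / (r_int + r_load)   # equate the two expressions, solve for I
    return i * r_load, i          # (voltage, current) at the intersection

# Illustrative: a supply that is 10 V open-circuit with 5 ohms inside,
# feeding a 5-ohm load, operates at 5 V and 1 A.
print(operating_point(10.0, 5.0, 5.0))
```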

Regarding your example of two devices that draw slightly
different currents at different voltages:

First, it is possible that the same device could both draw
250 mA when powered at 5.5 V, **and** draw 300 mA when it is
powered by 6 V. Or, a device that draws 250 mA at 5.5 V might
draw something other than 300 mA at 6 V. So, it is impossible
to say, from the information you give, whether the two devices
would give the same voltage reading when driven by that power
supply.
Yes. However, there is not a single unique "real voltage" that
it gives out no matter what the load resistance is. For different
loads, it would (in general) be different voltages.
Hope this helps.

-- Mark

9. ### Dominic-Luc WebbGuest

I checked a couple of my adapters last night. Amazing how much variance
there is. I should first say that most actually gave a very continuous
voltage (as seen on the oscilloscope), as opposed to square or other
waveforms. This is not always the case, though. One gave two different high
states and was roughly square. Another alternated between zero and a high
voltage, again roughly square.

One continuous-DC-output adapter showed 16 V without load. It was
rated 9 VDC @ 200 mA. Applying a circuit known to draw at least
double this load failed to lower the voltage below 12 volts. Needless
to say, I never connect this to 9 VDC circuits that would be damaged by
overvoltage. It also illustrates the need to check adapters before
connecting them.

Dominic

10. ### trudeuGuest

So, if I may summarize what I think you are saying:

If I used a meter to test the voltage of the DC adaptor under load, it would
vary slightly based on the device I have plugged into it. The voltage drawn
is based on the mA and voltage drawn by the device. Further, it's really hard
to find the exact voltage the adaptor produces because of this. To some
degree, the adaptor produces what the device needs, as long as it stays within
certain limits.

However, you can test an adaptor roughly, as to whether it is producing what
it says it should, by using Robert's formula.

I have to say I don't understand the chart. Hopefully it's not too important
or beyond me.

11. ### ClarenceGuest

Perhaps it should be noted that "trudeu" seems to have failed the course on
basic Principals of power supply design in school. He states, or miss states,
very basic observations which are clearly described in any good text book on
the subject.

Yes I know this is ALT.BASIC ELECTRONICS, however it is painful to read some of
the drivel.

So if you are using a simple power supply (like a wall wart): 1. Design your
circuit to survive the highest voltage the wall wart can supply (plus a safety
margin), and 2. Be sure it can supply the current required.

Else: provide protection for the maximum voltage expected, and a local
regulator.

12. ### John FieldsGuest

That would be "principles", and "mis-states".

13. ### Guest


You pretty much have it. I reread Robert's post; the power
rating of the resistor is important, so do not ignore that part of what he
said.

I had tried to find a simpler voltage-current curve somewhere on the
internet, but could only find the one I mentioned in my earlier post.
If the concept is still unclear, a book would be a better means of
communicating the idea of current-voltage curves; it is difficult in this
group, since we are restricted to text only, and it takes a lot of time to
come up with a clear presentation of the subject.

Good luck,

Mark

14. ### John FieldsGuest

---
Here's a link to a bunch of current-vs-voltage curves for a bunch of
different wall-warts.

Click on an entry in the Part# column to get the data sheet for that
supply, complete with curves _and_ ripple specs.

15. ### John FieldsGuest

---
US mains are usually specified as 120 VRMS nominal, +/- 10%, which
means that the input voltage to the wall-wart can vary anywhere
between 108 V and 132 V, with a corresponding change in the output
voltage. The three curves close to each other on the data sheet show
the output voltage at different output currents, plotted for
three different inputs: 132 VRMS, 120 VRMS, and 108 VRMS. To read the
chart, find the curve which corresponds to the mains voltage you're
interested in, then find the current you need on the X axis and draw
a line from there straight up to the curve you've chosen. Next, draw
a horizontal line from the point where that vertical line meets the
curve across to the Output Voltage scale on the Y axis. Where it crosses
the scale is what the output voltage will be for that current and that
mains voltage. Doing the same thing for different mains voltages and
output currents, you can see that the output voltage rises in direct
proportion to the mains voltage and falls as the current drawn from
the wall-wart increases.
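The same curve reading can be done numerically if you tabulate a few points from the data sheet. The sample points below are made up, standing in for one such curve (say, the 120 VRMS one):

```python
# Hypothetical (current, voltage) points lifted from one wall-wart curve.
currents = [0.0, 0.1, 0.2, 0.3]      # amps along the X axis
voltages = [13.5, 11.8, 10.4, 9.2]   # volts, sagging as current rises

def read_curve(i_load):
    """Linear interpolation between tabulated curve points."""
    for k in range(len(currents) - 1):
        if currents[k] <= i_load <= currents[k + 1]:
            frac = (i_load - currents[k]) / (currents[k + 1] - currents[k])
            return voltages[k] + frac * (voltages[k + 1] - voltages[k])
    raise ValueError("current is off the end of the curve")

# About 11.1 V, halfway between the 0.1 A and 0.2 A points:
print(read_curve(0.15))
```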
---

16. ### trudeuGuest

Thanks. I looked at the Tamuracorp sheets, and it is a little clearer. My
reading of those sheets makes me think of yet another variable: what is the
house voltage? I think it can go all the way from 115 V to 125 V. That throws
even more variables in, so admittedly any device needs to have some tolerance.

I practiced the math of Robert's procedure to make sure I understood it. It's
good that he gave an example to help. Yes, I understand the importance of
wattage when testing.

I have read a few beginners' books on electricity and electronics, and I will
continue to read. I'm not trying to build anything; I'll leave that up to
others. I'm just trying to understand the basics, to help grasp simple
problems around the house.

Regards
Thanks.

17. ### D AkersGuest

"I have read a few beginners' books on electricity and electronics, and I
will continue to read. I'm not trying to build anything; I'll leave that up
to others. I'm just trying to understand the basics, to help grasp simple
problems around the house."
____________________________________
Re:
The key word with wall-warts is regulation, and the typical wall wart
doesn't have any. Thus, most devices designed to accept them have an
internal voltage regulator with plenty of designed-in headroom for the
over/under voltages presented by the wall-wart market. My experience is
that the labelled voltage output on the typical wall-wart is
meaningless, second only to the lack of a polarity convention for power
supply jacks and plugs (now there's a conspiracy!). However, there are
wall-warts with internal voltage regulation. I know that Radio Shack
sells them, as does Digi-Key, but, as would be expected, they are more
expensive.

-Dan Akers