Maker Pro

Why is Motorola USB not Blackberry USB (chargers = 5.0v, 5.9v, 350ma, 500ma)

Emily

Is it true that one MUST use ONLY the EXACT USB charger that comes with
their cell phone or digital camera?

Are USB chargers really not interchangeable?

The T-Mobile store told me I could only use their T-Mobile charger for my
new USB based cellphone. After showing me the charger for the Motorola V195
which is 5.9 volts 375ma, they then opened a desk drawer and handed me
three melted USB chargers from blackberrys & digital cameras, one at 5.0
volts, 750ma; another at 5.0 volts, 550 ma, and yet another at 5.2 volts
450 ma.

Can we swap these supposedly USB chargers or not?
- Blackberry TCPRIM2ULSSN 5.0vdc 750mA
- Motorola PSM5037B 5.9vdc 375mA
- Motorola DCH3-05US-0300 5.0vdc 550mA
- Motorola FMP5185B 5.2vdc 450mA

Why is Blackberry USB different than Motorola USB, which is different in
and of itself? Can we swap these USB chargers or must we stick to the
charger that came with the device?

Emily
 
Cgiorgio

The wall charger provides DC power to the charging circuit inside the
device, which limits the current going into the Li-ion battery as well as
the end of charge voltage (4.19 to 4.20 Volts in most cases). A charger
with too low a current rating and no current limiting circuit can indeed
burn out if its current rating is exceeded for an extended time. USB power
in the PC world is 5.0 Volts, and most Li-ion charger ICs will work
perfectly with 5.0 Volts. The store clerks probably do not know what they
are talking about.
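
To make the division of labor concrete, here is a minimal Python sketch of
the constant-current / constant-voltage (CC/CV) behavior such an internal
charging circuit implements. All the specific numbers (the 500mA charge
current, the 50mA termination point, the crude taper) are illustrative
assumptions, not taken from any real charger IC:

    # Illustrative CC/CV Li-ion charge logic; all limits are assumed values.
    CHARGE_CURRENT_LIMIT_A = 0.500   # cap during the constant-current phase
    END_OF_CHARGE_V = 4.20           # precise limit held in the CV phase
    TERMINATION_CURRENT_A = 0.050    # stop once the current tapers below this

    def charge_current(cell_voltage_v, available_current_a):
        """Current the internal circuit lets into the cell, whatever the adaptor offers."""
        if cell_voltage_v < END_OF_CHARGE_V:
            # CC phase: never exceed the cell's limit, no matter how much
            # current the 5.0 Volt supply could deliver.
            return min(available_current_a, CHARGE_CURRENT_LIMIT_A)
        # CV phase (crudely modeled): hold 4.20 V while the current tapers off.
        tapered = min(available_current_a, 0.5 * CHARGE_CURRENT_LIMIT_A)
        return 0.0 if tapered < TERMINATION_CURRENT_A else tapered

    print(charge_current(3.70, 0.750))  # mid-charge: capped at 0.5 A
    print(charge_current(4.20, 0.750))  # near full: tapering current

The point being: the precision lives inside the device; the wall adaptor
only has to supply roughly the right voltage and enough current.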
 
Doug Hoffman

Emily said:
Is it true that one MUST use ONLY the EXACT USB charger that comes with
their cell phone or digital camera?

You are well advised to do so. That does not mean you can never
deviate. But if you do you had better know what you are doing. You
might be safe using the same voltage at a slightly different amperage
rate, but again you are on your own if trying this.

Doug Hoffman
 
Emily

Cgiorgio said:
USB power in the PC world is 5.0 volts

I'm very confused.

If USB power is 5.0 volts in the PC world, why is USB power 5.9 volts in
the Motorola V195 world?

Isn't USB a standard in both worlds?

Emily
 
Emily

Doug Hoffman said:
You might be safe using the same voltage at a slightly different
amperage rate, but again you are on your own if trying this.

But what I don't understand is why the amperage and voltage are DIFFERENT
for the USB chargers.

Isn't USB a standard?

Why would some chargers be almost 6 volts and others be 5 volts?

Emily
 
Jon Slaughter

Emily said:
Can we swap these USB chargers or must we stick to the charger that came
with the device?

The current rating on a voltage source is the maximum amount of current the
power source can deliver without exceeding its safety rating.

What this means is that if you are using some device that came with a power
supply rated at 500mA, it is best not to use a different power supply (at
the same voltage rating) with a lower max current rating, i.e., anything
< 500mA. Of course you might be able to get away with it, but if it burns
down your house then it's your fault.

A device will only pull the amount of current that it uses (assuming it is
a voltage controlled device), and this is true regardless of the current
rating (hence the safety issue discussed above). If a device says
(sometimes they don't) it uses 500mA, then it uses 500mA. Maybe it doesn't
use 500mA all the time, but the engineers have put that rating there for a
reason. Using any power supply with the right voltage and a current rating
of anything more than what the device uses is OK, because the device will
only pull the current it needs.

Now, about the voltage rating: The voltage rating does not have to be exact
and different devices can tolerate different voltage ratings. The problem
usually is one of current. By increasing the voltage, say, you increase the
current the device uses and then you have changed the parameters that the
device was created with.

I.e., suppose the device was created to work with 5VDC @ 1A. Now suppose
you use a power supply that is 6VDC. The device will not draw 1A but more
than that, because of Ohm's law. It might draw so much more current that
the components overheat, or other problems could occur. Not all constant
voltage power sources are equal... the voltages are not really constant,
but the devices are usually designed to tolerate a small deviation.

So it may or may not be ok to use a different power source rated at a higher
voltage. Usually you can get away with using a lower voltage rating but when
you do this you decrease the current drawn by the device. It might not be
enough current to actually run the device. Having more than 1V over the
voltage rating is asking for trouble for most devices and even 1/2 a volt
might not be wise if you care about the device.
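
To put rough numbers on Jon's point, here is a small Python sketch under
the simplest possible assumption: that the device looks like the fixed
resistance implied by its 5VDC @ 1A rating. A real phone's input stage is
more complex, but the direction of the effect is the same:

    # Model the device as the fixed resistance implied by its rating
    # (an assumption; real loads are not simple resistors).
    rated_v, rated_i = 5.0, 1.0
    r = rated_v / rated_i                # Ohm's law: R = V / I = 5 ohms

    for supply_v in (4.5, 5.0, 5.9, 7.0):
        i = supply_v / r                 # current such a load would draw
        p = supply_v * i                 # power it would have to dissipate
        print(f"{supply_v:.1f} V -> {i:.2f} A, {p:.2f} W")

    # 5.9 V -> 1.18 A, 6.96 W: nearly 40% more heat than the 5 W design
    # point, which is exactly the kind of overload described above.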


So, if you understand all that, then the answer to your question is pretty
simple: "It depends".
Can we swap these supposedly USB chargers or not?
- Blackberry TCPRIM2ULSSN 5.0vdc 750mA
- Motorola PSM5037B 5.9vdc 375mA
- Motorola DCH3-05US-0300 5.0vdc 550mA
- Motorola FMP5185B 5.2vdc 450mA

Each of these power supplies is designed for different device requirements.
Manufacturers do this mainly because it's cheaper.

You could get away with just one voltage source for three of the above:
- Blackberry TCPRIM2ULSSN 5.0vdc 750mA
- Motorola DCH3-05US-0300 5.0vdc 550mA
- Motorola FMP5185B 5.2vdc 450mA

All could use a power supply of 5VDC @ 1A. This 1A rating gives enough
current to any of the devices here, and the 5V should work with the
FMP5185B, though the 0.2V difference might be too much and cause some
issues.
- Motorola PSM5037B 5.9vdc 375mA

This is basically a 6V device. I doubt the 0.1V matters too much, but it
could. Usually devices are not designed to be that tolerant, because most
components are not very precise. You might even be able to get this thing
to work off a 5VDC source, but I'm not sure. It usually won't hurt to try,
as long as the supply can deliver the maximum rated current.

I.e., for the PSM5037B you could try a 5VDC @ 375mA supply and see if the
device works. Of course you could try anything above 375mA, and it could
even work slightly lower since we lowered the voltage. At worst you will
probably fry your power supply (though this could fry the device in some
cases).
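
That "it depends" can be written down as a rule of thumb: the voltages
must match within some tolerance, and the supply's max current rating must
cover the device's draw. A sketch in Python; the 5% tolerance is an
assumed figure, and treating each charger's label as its own device's
requirement is also an assumption:

    # Rule-of-thumb swap check (the tolerance and the use of label ratings
    # as device requirements are both assumptions).
    SUPPLIES = {
        "TCPRIM2ULSSN":   (5.0, 0.750),
        "PSM5037B":       (5.9, 0.375),
        "DCH3-05US-0300": (5.0, 0.550),
        "FMP5185B":       (5.2, 0.450),
    }

    def can_substitute(supply, original, v_tol=0.05):
        """Can `supply` stand in for a device's `original` charger?"""
        sv, si = SUPPLIES[supply]
        dv, di = SUPPLIES[original]   # take the original label as the need
        return abs(sv - dv) <= v_tol * dv and si >= di

    for sup in SUPPLIES:
        for orig in SUPPLIES:
            if sup != orig and can_substitute(sup, orig):
                print(f"{sup} could stand in for {orig}")

Run it and it reproduces the groupings above: the 750mA Blackberry supply
covers both 5V-class Motorola chargers, and nothing can stand in for the
5.9V PSM5037B.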


OK, now for the practical side. Not all power sources are created equal.
Even two power sources with the same voltage and max current ratings can be
completely different, because there are many methods of supplying power.
Your typical "wall wart" is usually just a piece of junk with nothing
special in it. If, say, you pull too much current from it, then it can
melt, and in the process the voltage can rise, which causes even more
current to be pulled, which could fry your device. (Remember, increasing
the voltage from the power supply increases the actual current used by the
device, which could go over the maximum current that the power supply can
safely supply.)

If you have a need for many different power supplies then you might want to
get one with selectable voltages. Here though you will need to make sure you
don't hook any devices up that would draw more current than the power supply
can give and that you always select the right voltage and polarity that the
device needs.
Can we swap these supposedly USB chargers or not?
- Blackberry TCPRIM2ULSSN 5.0vdc 750mA
- Motorola PSM5037B 5.9vdc 375mA
- Motorola DCH3-05US-0300 5.0vdc 550mA
- Motorola FMP5185B 5.2vdc 450mA

Why is Blackberry USB different than Motorola USB, which is different in
and of itself? Can we swap these USB chargers or must we stick to the
charger that came with the device?


Now here's the exact answer why these are different:

The Blackberry draws a max of 750mA from its supply. If you use any of the
other supplies, they can burn out because they cannot supply 750mA (they
will actually try, but get really hot and burn out). Using the PSM5037 on
the Blackberry would be even worse than using the DCH3, because the
increase in voltage would increase the current used by the Blackberry and
surely draw too much current.

The same idea applies to the others. The TCPRIM could be used for the DCH3
and (probably) the FMP5185, because it has the right voltage and supplies
much more current than they need.


What would be really bad is to use, say, a 10VDC @ 10A power supply on all
the devices. Here the devices would surely be destroyed, and not the power
supply.

The reason is that the extra voltage would cause the device to consume more
current, which would cause it to get hotter, and most likely hotter than it
was designed for. Since it uses more current, and since the power supply
can most likely supply that current (it can do 10A), the supply will not be
destroyed first; the device will eventually overheat and fail. Especially
since most cheap power supplies do not have very good safety precautions,
and devices often lack fuses and such (I suspect this is so the device will
be ruined if used wrongly and need to be replaced).

So while it's OK to use a larger maximum current rating, you have to make
sure the voltage rating is correct too, so you can't make the device pull
more current than it was designed for. E.g., a device rated 5VDC and 500mA
will "always" use a *maximum* of 500mA at 5VDC and never any more (under
normal conditions). If you force it to 7VDC, then you also force it to use
more current... maybe 750mA. The device was not designed for this, and
something will fail (either the power supply or the device).



Hope this helps,
Jon
 
Emily

Jon Slaughter said:
[snipped: Jon's full explanation, quoted above]

Hi Jon,
Thank you very much for the detailed explanation on USB power supplies.
I now see that I should have paired the DEVICE requirements with the USB
POWER SUPPLY capabilities:
DEVICE = Blackberry 8700   SUPPLY = TCPRIM2ULSSN 5.0vdc 750mA
DEVICE = Motorola V195     SUPPLY = PSM5037B 5.9vdc 375mA
DEVICE = Motorola RAZR     SUPPLY = DCH3-05US-0300 5.0vdc 550mA
DEVICE = Motorola Earbud   SUPPLY = FMP5185B 5.2vdc 450mA

From the discussion, can I "assume" if I hook the Motorola V195's USB power
supply (5.9vdc 375mA) to the Blackberry 8700 device, that the Blackberry
will be getting more voltage than it 'expected' and that the current
delivered will be much less than expected (even more so due to the higher
voltage than expected)?

This implies that USB chargers are NOT interchangeable!
(The T-Mobile store clerks just might have been right.)

But, what irks me is they all have the SAME CONNECTIONS!
They all "LOOK" the same to me!

Does EVERYONE label all their USB chargers so they don't mix them up?
Or am I missing something fundamental here?
If it says it's a USB charger, but we can't use them interchangeably,
then are they REALLY USB power supplies?

I'm still confused on the fact that the charger advertises it is USB but
it's not USB if it doesn't fit all USB devices.

Can someone clear up the USB part of the confusion here?
Emily
 
Cgiorgio

USB or Universal Serial Bus is a standard for high speed serial data
transfer from a computer to a peripheral and vice versa. All your devices
can be connected to a computer using a USB data cable. If hooked up to a
computer, the USB cable carries 5.0 Volts DC on its power supply pins; no
USB device can suffer any damage from this voltage. To save the space of a
dedicated power supply jack, many portable devices now use their mini-USB
jack for charging internal batteries. In the case of Lithium based
chemistry the battery needs a precisely limited end of charge voltage
(nominal 3.6 Volt Li-Ion cells usually need 4.20 Volts). The charging
circuit is inside the portable device, but a DC power source is needed to
feed it. What is called a charger is in these cases really a DC power
supply or AC adaptor.

You can find three different types of AC adaptors:
- The most primitive type is relatively heavy and built for a specific
input voltage. It contains an isolating transformer, a rectifier and most
of the time a capacitor to reduce AC ripple.
- A bit more complex are AC adaptors with a stabilized DC output voltage.
They contain an electronic circuit that keeps the output voltage constant
as long as the load is within limits; excess voltage is converted into
heat by a linear regulator.
- The third type is the universal input voltage / worldwide or switching
power supply. These are usually lighter: they convert the 50 or 60 Hz AC
voltage to DC and then convert that electronically to high frequency AC,
which requires a much smaller and lighter transformer. They vary the AC
input to the little transformer to compensate for deviations in the output
voltage.

All AC adaptors that carry the UL sign have to include some means of
overload protection. Some adaptors will fail permanently when that means
has tripped, others will reset once input power or overload is removed.

Some AC adaptors may deliver a slightly higher DC voltage than the nominal
5.0 Volts to achieve a higher initial charge rate for the battery, but 5.0
Volts is safe with any USB interface. As mentioned before in this thread, it
is generally safe to use a power supply with a higher current rating, but
not with a much higher output voltage.
 
Jon Slaughter

Emily said:
From the discussion, can I "assume" if I hook the Motorola V195's USB power
supply (5.9vdc 375mA) to the Blackberry 8700 device, that the Blackberry
will be getting more voltage than it 'expected' and that the current
delivered will be much less than expected (even more so due to the higher
voltage than expected)?

Yes. Using the V195's supply is not a good idea: it has the lowest max
current rating and too high a voltage.

Note though that these are MAX current safety ratings. They are the maximum
possible current that the device will draw, with some headroom for safety,
at the specified voltage. So in reality your Blackberry might only use
100mA on average. For example, I have some security cameras that use 80mA
at 8V in the daytime but 160mA at night, because they switch to IR. Here I
would need a power supply that delivers at least 200mA (at ~8V). Well,
that is unless I knew the camera would never be run at night and there
would be no way the "night vision" would be used (e.g., if I disabled the
photo sensor to make sure).
This implies that USB chargers are NOT interchangeable!
(The T-Mobile store clerks just might have been right.)

Well, not all power supplies are interchangeable. This is why you see so
many with different sizes. Some are very small because they do not have to
supply a large current to the device they were intended for. If you hook
them up to another device that uses the same voltage but draws more
current, then it will burn up the power supply (and possibly ruin the
device).
But, what irks me is they all have the SAME CONNECTIONS!
They all "LOOK" the same to me!

lol. Yes. It can be annoying. My father has about 50 of these wall warts
for all his junk, and there are about 4 different connectors. Some are very
odd, while all the others usually use that "standard" jack that you always
see (not sure what the name of it is).

Does EVERYONE label all their USB chargers so they don't mix them up?
Or am I missing something fundamental here?
If it says it's a USB charger, but we can't use them interchangeably,
then are they REALLY USB power supplies?

Well, all power adaptors should be labeled with the power (voltage and
current) that they can supply. The device should also say which one it
uses. I'm sure that in some cases they do not say this on the device
itself, probably because the manufacturer wants to increase the likelihood
of the user ruining the device, so they will spend more money to get it
repaired or buy a new one.

Ultimately, though, these are just power supplies/adaptors and do not
really have anything to do with USB. USB is a standard that defines how
"USB devices" will behave and communicate with other USB compatible
devices.

The odd thing is that USB devices are defined to use 5.0V. This is strange
because the V195's supply:
DEVICE = Motorola V195 SUPPLY = PSM5037B 5.9vdc 375mA

is putting out 5.9V, which is almost a full volt over. There could be many
reasons for this. Maybe the power supply is wrong. Maybe the device does
not conform to the USB spec. Or maybe the device can tolerate a large
voltage swing (maybe it runs on 5.0V just fine). It could also be that the
device uses 5.0V on a proper USB connection but not on the power supply
side, i.e., there could be a difference between the USB port and the power
port. Those extra 0.9V could be from some extra safety feature the device
uses on the power port.

I'm not quite sure of the reasons, though.
I'm still confused on the fact that the charger advertises it is USB but
it's not USB if it doesn't fit all USB devices.

OK, suppose you go into a store because you lost your "USB" charger and
need a replacement. You find many different adaptors claiming to be USB
chargers. In fact there is no such thing; they are all just power
supplies/adaptors. It's possible some manufacturers built a batch of power
supplies for some device, then decided to sell them on the side and call
them USB chargers because they were ~5.0VDC.

What I'm trying to get at is that USB does not define power supplies or
chargers; it only states that USB devices should use 5.0VDC. Just like all
computers use a power supply that supplies 12VDC and 5VDC. If you have a
USB device then it *should* use 5.0VDC (though I suppose there are times
when it can't or won't), but there are many non-USB devices that also use
5.0VDC.

You also have the issue of current rating. Even at the same voltage you
have to make sure the supply can give enough current. All constant voltage
power supplies have this issue, because it is one of resources: the larger
the max current rating, the more expensive the supply (in general). This is
why you'll sometimes find these little crap power adaptors rated at 100mA.
They are practically useless for most devices, because most devices use
more than 100mA.

Can someone clear up the USB part of the confusion here?

I don't think I can, except to say that sometimes a manufacturer has its
reasons. Sometimes they might change the power connector to be unique so
only the power supply they make will work with it. I assume they have their
reasons. Ultimately it would be easier for them to have one type of
connector for each voltage rating and have all power supplies carry a max
current rating of 1A. That way you would probably need at most 10 power
supplies: one for 1V, one for 2V, etc. All 1V power supplies would work
with all devices that ran at 1V (except those that used over 1A). Of course
if you had more than one device at the same voltage rating to use at the
same time, you'd need another power supply.

Essentially, power supplies with the same connectors and same voltage are
interchangeable if they can supply the maximum current the device uses.

Another thing to point out is that "chargers" are usually not the same as
adaptors/supplies. Chargers are usually constant current and somewhat
different beasts. Constant voltage sources keep the voltage constant and
let the current change, while constant current sources keep the current
constant and let the voltage change. Neither can vary the changing quantity
without limit, and different adaptors will have different ranges (usually 0
to max current for a voltage source).

Anyway, if you're trying to find one adaptor for all the devices, then it
seems you're going to have a tough problem. The Motorola V195 is just too
far out. You can see if it will work with a 5.0V source, though. If it
does, then it might be OK (although you might have weird issues). For all
the others you can get a 5.0VDC @ 1A supply and it will work for all of
them (well, the 5.2V one is a little iffy, but it should be OK). The issue
then becomes one of connector type. If they do not all use the same
connector type, then you'll probably want to buy a multi-connector adaptor.

Now, if you know about electronics, you could build a variable voltage
source where you could select the voltage you wanted almost exactly. I have
no idea if they sell these things to consumers, but they are nice to have.
I assume they do not because using one requires some knowledge about how
power works (basically the stuff I said above), and one has to make sure
not to use the wrong settings, which could ruin the device.

Ultimately it would be nice if manufacturers would get their stuff together
and all use compatible power supplies for the same voltage. Here the issue
is one of cost, though, and a manufacturer is not going to sink extra money
into a power supply that is only "half" used.


Jon
 
Allodoxaphobia

But, what irks me is they all have the SAME CONNECTIONS!
They all "LOOK" the same to me!

Standards are GREAT!! And, there are _so many_ to choose from.

Jonesy
 
Bill Funk

I'm still confused on the fact that the charger advertises it is USB but
it's not USB if it doesn't fit all USB devices.

Can someone clear up the USB part of the confusion here?
Emily

Nominally, they are all the same.
This means that as long as the connectors are standard, they will all
charge OK with the supplied voltage.
The problem that can occur is that the voltage and current are limited
at the *original* port; this means that if you are using a non-powered
hub, voltage and current may well not be up to the requirements of the
device.
 
John Tserkezis

Jon said:
The V195's supply is putting out 5.9V, which is almost a full volt over.
There could be many reasons for this. Maybe the power supply is wrong.
Maybe the device does not conform to the USB spec. Or maybe the device can
tolerate a large voltage swing (maybe it runs on 5.0V just fine). It could
also be that the device uses 5.0V on a proper USB connection but not on
the power supply side, i.e., there could be a difference between the USB
port and the power port. Those extra 0.9V could be from some extra safety
feature the device uses on the power port.

I'm not quite sure of the reasons, though.

The real reasons are quite sad actually.

During the design phase, they say "we need some wall warts- which ones are
we going to get?".

So they go out and shop around, and finally find some at the right price
(the right price always being the CHEAPEST price), and see they are of a
sufficient power rating, but the voltage is 5.9v. An odd number, yes, but as
it turns out, during the design phase it's largely irrelevant anyway.
And they say, "thank you, we'll take ten thousand of those please".

Then they design the equipment AROUND that.

The end-user doesn't care, because they have a wart assigned to a black
box, and all is well.
Until you get a user who loses their XYZ model wall wart and tries to use
another.

Then the question is raised: "will my 5.9v equipment work with a 5v wart?"
The answer would be "it depends on who designed it". And you're not likely
to get a more definitive answer from the manufacturer no matter how loudly
you ask, because their answer will always be "with our XYZ model wall
wart".
 
tim

Emily said:
Can we swap these USB chargers or must we stick to the charger that came
with the device?

If a device is going to use the USB connector to supply input
voltage for charging, it MUST be able to charge when connected to a
USB port on a computer, which is a nominal 5.0 volts. (I am sure
there is a minimum current that must be available as well...)

There are several different ways in common use to limit input
voltage to a given level (in this case 5.0v). Most can dump up to
about a volt over without overheating or otherwise letting the
magic smoke out. In that case all four of the above power supplies
should be adequate for your device.
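
For the common case where that "dumping" is done by a linear regulator,
the excess is simply burned off as heat: P = (Vin - Vout) x I. A quick
back-of-envelope in Python (the 500mA charge current is an illustrative
figure, not from any datasheet):

    # Heat a linear input regulator sheds when fed more than its 5.0 V target.
    # The 0.5 A charge current is an assumed example figure.
    charge_current_a = 0.5
    for v_in in (5.0, 5.2, 5.9, 7.0):
        heat_w = (v_in - 5.0) * charge_current_a   # P = (Vin - Vout) * I
        print(f"{v_in:.1f} V in -> {heat_w:.2f} W dissipated as heat")

    # 5.9 V in -> 0.45 W: manageable for a small regulator, which is why
    # "up to about a volt over" is survivable; several volts over is not.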

On the other hand, if, as one poster suggested, they are merely
using USB connectors because of their ubiquitesness (read
availability [I always wanted to use that word someplace!]) then
all bets are off. If that is the case, then connecting to a real
live USB port will probably NOT charge the device properly if at
all. I could see in certain circuit configurations an increase in
current demand to attempt to make up the deficiency in voltage,
with the suggested results.

First off, read the documentation. If it explicitly tells you that
a computer USB port will not work, or that the device should never
be plugged into such a port, then case #2 applies and all bets are
off. If the documentation states that a real USB port can be used
to supply charging voltage, then pretty much anything that comes close
should work.

I personally lean towards case #1, and believe that the other
supplies were fried by some other factor. One that comes to mind
would be trying to charge an almost totally dead device while
trying to use it at the same time. That will definitely up the
power requirements to the point that it would be no surprise to
have a unit melt down.
 
Rich Grise

On the other hand, if, as one poster suggested, they are merely using
USB connectors because of their ubiquitesness (read availability [I
always wanted to use that word someplace!])

Use "ubiquity". You'll sound much more sophisticated. ;-)

Cheers!
Rich
 
Richard Thomas

Jon Slaughter said:
Note though that these are MAX current safety ratings. They are the
maximum possible current that the device will draw, with some headroom
for safety, at the specified voltage.


It's also worth pointing out that in many cases, the charger may not
have been designed specifically for the device in question but may be
an "off the shelf" power supply "guts" with a USB (or other) adaptor
thrown on the end and a case with a Motorola/Nokia/Rim logo added for
looks. The charger may actually be able to supply much more than what
is actually required simply because that was the cheapest choice.
Jon said:
The V195's supply is putting out 5.9V, which is almost a full volt over.
There could be many reasons for this. Maybe the power supply is wrong.
Maybe the device does not conform to the USB spec.

Definitely way outside the spec (5%). If I had one of these, I might
be tempted to junk it just to be on the safe side. However, a quick
search shows that this is the old V400 style charger and /not/ USB.
Jon said:
Essentially, power supplies with the same connectors and same voltage are
interchangeable if they can supply the maximum current the device uses.

Unfortunately, that is not the end of the story. The USB spec also states
that a device should negotiate the current that is supplied to it. Some
devices don't worry about this and will happily charge as long as there is
5V on the pins, but, for example, my Motorola SLVR requires more than that
to charge. If I plug it into a USB cable that is plugged into a 12V
cigarette lighter adapter, it will not charge and will act as if it is not
plugged in at all. However, if that same cable is plugged into a computer,
it will fire up fine. I also have a car charger that works no problem. My
wife's RAZR (may it rest in peace) wouldn't even charge on a computer
unless you installed the Motorola drivers first.

My Palm TX will charge properly with my sync/charge cable (which puts
power in the little square plug), but with just the USB sync cable that
came with it, it will charge normally when plugged into a computer, while
on the 12V adapter the charge indicator will not come on (it will,
however, charge extremely slowly). This is about to become an issue for
me, as I'm planning on installing a mini-USB connector in the Palm in the
next couple of days. I never seem to be able to put my hands on a Palm
cable when I need it, but mini-USBs are ubiquitous now (Dollar Tree was
selling retractable ones for $1 until recently).

So in short, no, not all USB chargers are equal. They /should/ give
you the 5V and they /should/ be able to supply the current you need
(if they conform to spec which is by no means guaranteed) but even at
that point, you need to know if your device can be charged with a dumb
charger and whether the charger is dumb or not.

Rich

Oh, just remembered, my GPS, which is happy with dumb chargers, will
not charge with my Motorola USB non-dumb charger. So presumably the
charger is not willing to supply the current until negotiated with?
It's all fun and games I guess :D
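
Richard's examples amount to a two-sided handshake: whether anything
charges depends on what the device insists on and what the charger offers,
not just on 5V being present. A toy Python model of that decision (the
categories are illustrative, not the actual USB enumeration protocol; the
GPS anecdote is the mirror case, a charger that itself waits to be asked):

    # Toy model: charging depends on both sides, not just on 5 V being there.
    def will_charge(device_needs_enumeration, charger_enumerates, has_5v=True):
        if not has_5v:
            return False
        if device_needs_enumeration and not charger_enumerates:
            return False   # e.g. the SLVR on a dumb 12 V-socket USB adapter
        return True

    print(will_charge(device_needs_enumeration=True,  charger_enumerates=False))  # False
    print(will_charge(device_needs_enumeration=True,  charger_enumerates=True))   # True
    print(will_charge(device_needs_enumeration=False, charger_enumerates=False))  # True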
 
J. Clarke

Emily said:
But what I don't understand is why the amperage and voltage are
DIFFERENT for the USB chargers.

Isn't USB a standard?

Why would some chargers be almost 6 volts and others be 5 volts?

The USB standard says power must be between 4.75 and 5.25 volts, with
maximum draw from a dedicated charger 1.8A. That's the highest
allowable--it doesn't mean that any given device is required to draw
that much. The manufacturer of a device will generally make the
charger the cheapest they can, which means that it will provide just
enough power to charge their device and no more.

That does not mean that particular vendors adhere to the USB standard
in the design of their charging systems--seems a stupid way to
alienate customers to me but I don't run things.

From a theoretical viewpoint if the charger delivers the rated voltage
and the same or more current then it should work. From a practical
viewpoint this assumes that the device doesn't change its current draw
depending on what it's plugged into, and since USB is a communications
protocol as well as power delivery it's reasonable to expect devices
to have the capability to make that determination.
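
Taking the limits quoted above at face value (4.75 to 5.25 volts, at most
1.8A drawn from a dedicated charger; stated in the post, not independently
verified here), checking the four chargers from this thread against them
takes only a few lines of Python:

    # Check the thread's chargers against the limits quoted above.
    V_MIN, V_MAX, I_MAX = 4.75, 5.25, 1.8

    chargers = {
        "TCPRIM2ULSSN":   (5.0, 0.750),
        "PSM5037B":       (5.9, 0.375),
        "DCH3-05US-0300": (5.0, 0.550),
        "FMP5185B":       (5.2, 0.450),
    }

    for name, (volts, amps) in chargers.items():
        ok = V_MIN <= volts <= V_MAX and amps <= I_MAX
        print(f"{name}: {'within' if ok else 'OUTSIDE'} the quoted limits")

    # Only the 5.9 V PSM5037B falls outside the voltage window, matching the
    # earlier observation that it is not really a USB charger at all.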
 
Michael Black

Pay attention bozo. The message you replied to is over 2 years old,
which you'd have seen if you'd checked the date. The discussion has
long moved on, and chances are good that many who were reading the
newsgroups at the time are no longer around.

And if you'd been around back then, you'd know that this "Emily" was
regularly posting about this "I thought USB was USB", and lots
of people would have given a decent answer. Nothing you could
say 2 years later adds to the conversation, especially when hardly
anyone from back then is still around to care.

Michael
 