hartly said:
Why does my 9 V DC transformer only give a maximum of 0.22 A?
I've looked inside it and it doesn't seem to have any resistor in
series to restrict the current.
A 6 V battery charger can give out 4 amps, which proves it's nothing
to do with the input or output voltage.
Thanks.
The output current has to pass through the resistance of the secondary
winding. This uses up voltage and produces heat. The output current
(transformed by the turns ratio), plus the magnetizing current, also
has to pass through the primary winding. This likewise drops
voltage and produces heat. A transformer is rated for output current
based on acceptable total voltage drop (called regulation, expressed
as %) and temperature rise. If they build the transformer with a
larger core that has a larger window opening, they can use larger wire
that drops less voltage and produces less heat at a given current,
raising the current rating. It turns out that the rated power (output
voltage times rated current) is roughly proportional to the weight of
the transformer.
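That voltage drop can be estimated with a short sketch. All the component values below are illustrative assumptions, not measurements of any particular transformer: the primary resistance is reflected to the secondary side through the square of the turns ratio, added to the secondary resistance, and the load current times that total is the voltage lost under load.

```python
def loaded_output(v_open, i_load, r_secondary, r_primary, turns_ratio):
    """Estimate the loaded secondary voltage.

    turns_ratio = N_primary / N_secondary; the primary winding
    resistance appears at the secondary side divided by
    turns_ratio**2, in series with the secondary resistance.
    """
    r_total = r_secondary + r_primary / turns_ratio**2
    return v_open - i_load * r_total

# Hypothetical small 9 V transformer: 10 V open-circuit, 3 ohm
# secondary, 300 ohm primary, wound roughly 230:9 (all assumed).
v_open, i_rated = 10.0, 0.22
v_loaded = loaded_output(v_open, i_rated, r_secondary=3.0,
                         r_primary=300.0, turns_ratio=230 / 9)
regulation = (v_open - v_loaded) / v_loaded * 100
print(f"loaded output ~ {v_loaded:.2f} V, regulation ~ {regulation:.1f}%")
```

With these assumed numbers the output sags well under a volt at rated current; drawing much more than 0.22 A makes the drop, and the heat, grow quickly.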
You can get more current out of any transformer if you accept the
lower voltage, provided you either do this briefly enough that the
transformer doesn't overheat and destroy the insulation, or are
willing to put the fire out.
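The overheating point follows from copper loss scaling with the square of the current. A toy calculation (winding resistance assumed, not measured) shows how fast the dissipation climbs when a 0.22 A transformer is pushed harder:

```python
r_winding = 3.0  # assumed secondary winding resistance, ohms
for amps in (0.22, 0.44, 1.0):
    watts = amps**2 * r_winding  # I^2 * R copper loss in the winding
    print(f"{amps:.2f} A -> {watts:.2f} W heating the secondary winding")
```

Doubling the current quadruples the heat in the winding, which is why a brief overload may be survivable while a sustained one cooks the insulation.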