I think, as I explained to Fritz Schlunder, that for small transformers like this, the
copper loss *is* what, to a great extent, determines the (power handling) "capability" of
the transformer.
---
---
NO!!! If you do you will exceed the current rating of the secondary
and, possibly, damage the transformer. The transformer is rated for
1kVA out of the _entire_ secondary which, at 12VRMS out comes to the
83.3 amps noted on the transformer's faceplate. That is, the
transformer secondary is wound with wire which is designed to carry
83.3 amps no matter which voltage tap is used.
I think you are quite right. I didn't read your first response to Rob where you
interpreted the label, which I also didn't bother to look at. :-( I was thinking that
the secondary had multiple windings which could be paralleled, which, after looking at the
label, I would agree that it doesn't. The 83 amps times 12 volts gives 996 VA, the rating
of the transformer. Mea Culpa. He should limit the primary current to 1.875 amps as I
mention later.
---
---
That's not really a valid criterion, since the core losses are only
going to be a fraction of what they would normally be with 240VAC on
the primary. Moreover, if you're going to do it properly you need to
monitor the temperature rise of the transformer over ambient and make
sure it doesn't exceed the spec.
I'm not sure what instrumentation Rob has access to; that's why I asked about meters.
If you're really going to do it right, monitoring the temperature rise of the
"transformer" over ambient isn't enough. You need to know the temperature of the
windings, and the rating of the insulation. Since the transformer doesn't appear to have
thermocouples buried in the windings, which is what I usually do with a new design, one
would have to use the method of measuring the resistance of the windings at room temp, and
then applying current for several hours, followed by a measurement of the winding
resistances when hot. From this the temperature rise of the windings can be inferred.
One starts out with less than rated current and makes a measurement; then, if the
temperatures are under the allowable values, increases the current and takes some more measurements.
Given a starting temp (room temp), designated Tcold, and measured resistance of a winding
at that temp, designated Rcold, and a measured resistance Rhot after allowing the
transformer to reach equilibrium with heating from the current in the windings; the temp,
Thot, is given by:
Rhot
Thot = ------ (234.5 + Tcold) - 234.5
Rcold
in degrees centigrade.
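To put numbers to that, here is a minimal sketch of the calculation (the 234.5 constant
applies to copper windings; the example resistances are made up):

```python
def winding_temp(r_cold, t_cold, r_hot):
    """Infer the hot winding temperature (degrees C) from the change in
    resistance of a copper winding. 234.5 is the extrapolated
    inferred-zero-resistance temperature constant for copper."""
    return (r_hot / r_cold) * (234.5 + t_cold) - 234.5

# Hypothetical example: a winding measuring 1.00 ohm at 20 C that reads
# 1.30 ohms after hours at load has reached roughly 96 C, a 76 C rise.
print(round(winding_temp(1.00, 20.0, 1.30), 1))  # 96.4
```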
Since we don't know whether his transformer has Class C, Class H, or whatever insulation,
we can't really tell what temperature rise is acceptable. One way to find out would be to
assume (there I go again) that the transformer was designed properly, and will exhibit a
temp rise at full load of just about the rating of the insulation system. Rob could short
the 12 volt winding, and apply rated current to the primary, wait a few hours and measure
the temperature rises of the windings. This would presumably be what the transformer was
designed for. He could then measure the temp rises with his rectifier load and see if
they are <= to what he got with the full load test I just described.
I'm sure you know all this, John, but I'm going into detail for Rob's benefit.
This is too much, as John points out. Don't do this.
Or limit the current to less than 83 amps in the shunt; this would be acceptable for the
6 volt tap. The shunt could also be used to short the rectifier output so that the actual
current there could be measured. To assess the heating in the 6 volt winding, a true RMS
meter would need to be used. The electrochemical effect of the current is proportional to
coulombs/sec, so an average responding meter would be appropriate to assess this.
Rob's original question was whether the transformer could supply 75 amps at 6 volts. It
would be reasonable to assume that the transformer can meet its nameplate rating. So if
he has access to a high current shunt, he can just put it in series with the 6 volt output
tap and, with a true RMS meter, verify that the current is less than 83 amps, and feel
safe without all this fooling around we've all been telling him to do!
I somehow doubt that Rob has any high current shunts around. Maybe he has access to some,
but I got the impression that he might be short on instrumentation.
So, what we all need to help him with is the measurement of the secondary current without
having to buy a high current shunt. One possibility is to make a current transformer
somehow. Or, how about this? Ten gauge copper wire (here in the states, anyway) has very
close to 0.001 ohms/ft. If he were to cut a 1.5-foot length of 10 gauge wire, and solder on
a couple of smaller wires exactly 1 foot apart, he would have a shunt of 1 milliohm, and
with a current of 75 amps, he would need to measure 75 millivolts AC, and even the
cheapest Radio Shack DVM can do that. The problem is that the piece of wire would be
dissipating 5.6 watts and would get hot. The resistance would change enough to inspire a
vote of no confidence. He could parallel two such wires and have only 2.8 watts
dissipation, and finally, he could put the homemade shunt in a plastic tray of
water (at 20 degrees C), which would probably limit the temperature rise to an acceptable
value. (He can find out the resistance/foot of available wire of about this size and
figure out exactly how far apart to put the meter taps.) (And, he will need a true RMS
meter)
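The shunt arithmetic can be sketched the same way (0.001 ohm/ft is the 10 gauge figure
from above; the function and its numbers are just an illustration):

```python
def shunt_readings(current_a, ohms_per_ft, tap_spacing_ft, n_parallel=1):
    """Meter voltage (mV) across the taps and total power (W) dissipated
    in a homemade shunt made of n_parallel identical wires in parallel."""
    r = ohms_per_ft * tap_spacing_ft / n_parallel
    return current_a * r * 1000.0, current_a ** 2 * r

print(shunt_readings(75, 0.001, 1.0))     # one wire: ~75 mV, ~5.6 W
print(shunt_readings(75, 0.001, 1.0, 2))  # two in parallel: ~37.5 mV, ~2.8 W
```

Note that paralleling two wires also halves the reading, to about 37.5 mV at 75 amps,
which has to be accounted for when interpreting the meter.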
I'm aware of this, but as I explained in another reply to Fritz Schlunder, typically
small transformers like this aren't optimized for continuous operation at full load. It
could be, but I think it unlikely. Anyway, to do it right (measure the core loss) would
require a wattmeter designed for measurements under low power factor conditions.
But, he could get an idea of the relative magnitude of the core loss by measuring the room
temp resistances of the windings as mentioned earlier and then applying 240 volts to the
primary with the secondaries unloaded, waiting several hours, measuring the warm winding
resistances and using the formula given earlier.
And the distortion of the current waveform will make the copper losses even more
significant compared to the core loss than with a pure resistive load.
As I explained to Fritz, this is only true for a transformer used nearly all the time at
full rated load.
In some special applications the disparity between core and copper loss is even greater
than the example I cited to Fritz. Inverters for off grid use are designed for very low
core loss because the customer wants to be able to leave the inverter running for extended
periods without a large drain on the battery. A transformer I designed for a 2400 watt
inverter had about 12 watts of core loss and 347 watts of copper loss (full load) on EI200
laminations, 2.75" stack height. I think inverter transformers represent the extreme of
this disparity.
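The reasoning behind that design choice can be illustrated with a rough duty-cycle
calculation (the function and the 10% duty figure are mine, not from any particular
inverter):

```python
def average_loss(p_core, p_cu_full, load_fraction, duty):
    """Time-averaged loss for a transformer that is energized all the
    time but delivers load_fraction of full load only duty of the time.
    Core loss is constant while energized; copper loss scales as the
    square of the load current."""
    return p_core + duty * p_cu_full * load_fraction ** 2

# The 2400 W inverter transformer cited: 12 W core, 347 W copper at full load.
print(average_loss(12, 347, 0.0, 0.0))  # idle: only ~12 W drains the battery
print(average_loss(12, 347, 1.0, 0.1))  # full load 10% of the time: ~46.7 W average
```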
A good example of a transformer where the core loss is prominent is the transformer in a
"wall wart". Those transformers are usually blazing hot even without a load. They use
cheap iron and run 'em hot. These transformers are very small, and the resistance of the
primary is so high that the volt-seconds seen by the core are greatly reduced by the IR
loss in the primary when the transformer is loaded. This reduction in core loss
compensates for the increase in copper loss due to load so that the transformer is within
ratings (barely, I think).
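The IR-drop mechanism in that last paragraph can be sketched numerically (the wall-wart
numbers here are hypothetical, and the simple subtraction ignores phasor relationships
and magnetizing current):

```python
def core_volts(v_applied, i_load_primary, r_primary):
    """Rough estimate of the voltage (hence volt-seconds) actually
    magnetizing the core: applied voltage minus the resistive drop in
    the primary winding."""
    return v_applied - i_load_primary * r_primary

# 120 V applied, 300-ohm primary, 60 mA primary current under load:
# the core sees only about 102 V, so flux and core loss fall with load.
print(core_volts(120, 0.060, 300))  # ~102 V
```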