Phil said:
** Completely WRONG.
Iron core transformers in external PSUs are about to
* disappear * - you ASS.
The REAL cost increase to the public may well be horrendous.
http://sound.westhost.com/articles/external-psu.htm
Rod's statements about linear transformers have been
shown to be wrong, and I and others have theory and
measurements to prove it. Why do you keep quoting
his web page as if it represents some kind of proof?
(Note, I take no issue with Rod's rants about SMPS
supplies, only with his hastily-drawn conclusion about
linear-transformer supplies. I have gone further and
read the 109-page RIS document, and see that it simply
states the Energy-Star specifications that were adopted
as a voluntary measure in January 2005. The mandatory
nature of the RIS proposal raises my eyebrows; I'd
want exceptions for low-sales-volume specialty items,
etc., especially if no replacements are available.)
Yes, there will be an impact on what manufacturers make,
but in fact they will simply be bringing the smallest
transformers in their lines back up to an efficiency level
similar to that of their large ones in some cases, and just
a little better in others.
The famous "transformer formula" Bmax = 10^8 Vp / (w Ae N)
(B in gauss, Ae in cm^2, w = 2 pi f) tells us the peak flux
density in a core can be reduced by increasing core area Ae
or by adding turns N. My measurements showed how doubling
N decreased the peak primary magnetizing current over 10x,
and reduced the transformer's standby power to 0.40 watts,
meeting the new regulation. That was the equivalent of
doubling the product Ae * N. Note, along with the lower
core losses come lower magnetizing-current copper losses.
The transformer size will certainly have to increase to
accommodate the increased copper and core. Transformer
manufacturers can make their best choice of the numbers,
and they can also increase their iron-lamination quality,
as others have pointed out.
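For anyone who wants to check the formula, here's a quick Python sketch. The mains voltage, frequency, core area, and turns count below are made-up illustration numbers, not my measured values; the one thing the sketch demonstrates is exactly what the formula says, that doubling N halves Bmax.

```python
import math

# The "transformer formula" from the post, in CGS units:
#   Bmax = 1e8 * Vp / (w * Ae * N)
# B in gauss, Ae in cm^2, w = 2*pi*f, Vp = peak primary volts.
def bmax_gauss(v_peak, freq_hz, core_area_cm2, turns):
    """Peak core flux density (gauss) for a sine-driven primary."""
    w = 2 * math.pi * freq_hz          # angular frequency, rad/s
    return 1e8 * v_peak / (w * core_area_cm2 * turns)

# Illustrative numbers only (assumptions, not measurements):
v_peak = 170.0     # ~120 V RMS mains, peak volts
freq = 60.0        # Hz
ae = 5.0           # core cross-section, cm^2
n = 2000           # primary turns

b1 = bmax_gauss(v_peak, freq, ae, n)
b2 = bmax_gauss(v_peak, freq, ae, 2 * n)   # doubling N halves Bmax
print(f"Bmax with N turns:  {b1:.0f} G")
print(f"Bmax with 2N turns: {b2:.0f} G")
```

Because the core's B-H curve is strongly nonlinear near saturation, halving Bmax cuts the peak magnetizing current far more than 2x, which is why my measured drop was over 10x.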
It's crazy that small transformers now run at efficiency
levels of 6 W out / 7.3 W in = 82%, whereas big ones run at
97 to 98%.
That's 18% loss vs 2 to 3%, or at least 6x worse. How
can you claim that eliminating that 6x penalty, by doing
exactly what they already know how to do, is not worth doing
or is an insurmountable problem?
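The loss comparison is worth spelling out, since people keep misreading the efficiency numbers. A few lines of Python, using only the figures already quoted above:

```python
# Small transformer: 6 W delivered for 7.3 W drawn (figures from the post).
p_out, p_in = 6.0, 7.3
eff_small = p_out / p_in          # fraction of input power delivered
loss_small = 1.0 - eff_small      # fraction of input power wasted

# Big transformers run 97-98% efficient, i.e. 2-3% loss.
loss_large_lo, loss_large_hi = 0.02, 0.03

print(f"small transformer: {eff_small:.0%} efficient, {loss_small:.0%} loss")
print(f"loss ratio: {loss_small / loss_large_hi:.1f}x "
      f"to {loss_small / loss_large_lo:.1f}x worse than a big one")
```

So the small unit wastes roughly 6x to 9x the fraction of power that a big one does, which is where the "at least 6x" above comes from.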
I'm sure they thought we wouldn't care if the transformer
design wasted 1.3 watts, vs 0.4 watts (my measurements),
and knowing they'd save money and their competition would
do the same, they went ahead with the poor design. But
that 0.9 watts difference is costing me $14.20 / decade
(at today's rates), so I'll tell you, whatever dollars they
saved don't look so sweet from here.
So damn yes, if it takes a government to fix that, great!