If you see high failure rates at 80% rated voltage, something's wrong.
Perhaps the circuits are such that the voltage fluctuates, or maybe they're
low-grade caps.
It's something that I've seen for 35 years with all grades of caps ...
The load is a bridge rectifier + reservoir, so it only charges the reservoir
caps at the peaks. Most of the time i = 0, and at the peak i = several times
the average. Copper losses have a bigger effect with a peaky waveform on a
low-power, and thus poorly regulated, transformer.
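The copper-loss point can be put in numbers. A minimal sketch, with assumed figures (200 mA average load as discussed later in the thread, and idealized rectangular charging pulses; a real conduction waveform is rounder, so this overstates the penalty slightly): for the same average current, I_rms rises as 1/sqrt(duty), so the I²R winding loss scales as 1/duty.

```python
import math

def copper_loss_factor(duty: float) -> float:
    """I^2*R winding-loss ratio of a pulsed charging current versus a
    steady DC current with the same average, assuming rectangular pulses:
    I_rms = I_avg / sqrt(duty), so the loss scales as 1/duty."""
    return 1.0 / duty

i_avg = 0.2  # assumed 200 mA average load
for d in (1.0, 0.5, 0.2, 0.1):
    i_peak = i_avg / d
    i_rms = i_avg / math.sqrt(d)
    print(f"duty {d:4.1f}: peak {i_peak:.2f} A, rms {i_rms:.3f} A, "
          f"copper loss x{copper_loss_factor(d):.0f}")
```

At a 10% conduction duty the same 200 mA average costs ten times the winding loss of a steady draw, which is why peaky loading hurts a small, high-resistance transformer disproportionately.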
I think that this is a highly debatable way of looking at it. If the cap is
of a sufficiently large value, the charging 'peaks' on each cycle should be
small once the cap has gone through the initial charging phase over the
first few cycles after power-up; otherwise you have significant ripple,
which, I'm sure you would agree, is not the case with most properly designed
power supplies. The cap does the averaging, so the current demand on the
transformer is pretty much constant rather than 'peaky'.
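For what it's worth, the ripple side of this is easy to check with the usual approximation ΔV ≈ I / (2·f·C) for a bridge (the cap supplies the load current for roughly half a mains cycle between charging peaks). A quick sketch with assumed figures (200 mA load, 50 Hz mains, a 2200 µF reservoir):

```python
def ripple_pp(i_load: float, c_farads: float, f_mains: float = 50.0) -> float:
    """Approximate peak-to-peak ripple on a full-wave (bridge) reservoir:
    dV = I * dt / C, with dt = 1 / (2 * f_mains) between charging peaks."""
    return i_load / (2.0 * f_mains * c_farads)

print(round(ripple_pp(0.2, 2200e-6), 2))  # ~0.91 V at 200 mA on 2200 uF
```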
Averaged over each cycle, yes, but instantaneously it's the other way
round.
See above
No ... that's a 9w transformer. Why would one fit a 9w transformer to a 3w
application?
OK, maybe overkill, but we already agreed that the current demand of this
item is likely below a couple of hundred mA, so maybe a 5 or 6 VA tranny,
which is a typical size that would likely have been fitted originally. Even
at this level, I still contend that on a reasonable quality tranny, copper
losses won't be significant.
An old-fashioned, inefficient way to do things. Cheap voltage regs make
such practices unnecessary today.
I wasn't suggesting that this was a good thing. What I was trying to say is
that if a designer decided that, say, 18v AC was required to arrive at the DC
level he needed on the back side of his bridge or whatever, then he would
have to take account of the fact that a cheapo small tranny with poor
regulation would be likely to produce a significantly higher level than
calculated and, because of the very light loading, it would be unlikely that
this value would drop to what was actually required as a result of the
copper losses that you are fond of ...

Cheap voltage regs by no means mitigate the potential problems here. First
off, we come back to the level of voltage that you are throwing across the
reservoir cap before we get anywhere near the regulators. Secondly, these
monolithic voltage regs are quite inefficient, being linear series pass
types, so they dissipate quite considerable amounts of power, which is why
it is important to keep the input voltage as low as is practical above the
required overhead for correct regulation. If a circuit is designed for a
particular input overhead based on what the calculated DC *should* be, and
that DC then turns out to be 15 or 20% higher due to poor transformer
regulation, this is going to significantly increase the dissipation in the
regulator. That might mean that the calculated heatsinking is no longer
sufficient, which could lead to the regulator starting to go into thermal
foldback, which completely wrecks any stabilization that it was bringing to
a rail. This is another reason why it is important that the OP gets it
right, and why I doubt that the raw DC was originally anywhere near 40v.
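The dissipation arithmetic here is simple enough to sketch. With illustrative figures drawn from the discussion (a 15v regulated rail, 200 mA load, and a raw DC designed at 18v, all assumed rather than known), a linear series pass regulator burns P = (Vin − Vout) × I, so a 20% over-voltage on the raw DC more than doubles its dissipation:

```python
def reg_dissipation(v_in: float, v_out: float, i_load: float) -> float:
    """Power burned in a linear series pass regulator (e.g. a 78xx):
    P = (Vin - Vout) * Iload."""
    return (v_in - v_out) * i_load

print(round(reg_dissipation(18.0, 15.0, 0.2), 2))        # ~0.6 W as designed
print(round(reg_dissipation(18.0 * 1.2, 15.0, 0.2), 2))  # ~1.32 W at +20% raw DC
```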
12% regulation is a rephrasing of what you stated there. Unless you
mean 28v due to some other cause.
No, and again, you seem to misunderstand. I am perfectly capable of
calculating that the 3v I'm suggesting represents about 12%, and that is
exactly what I was intending. I just felt that in this particular context,
an actual voltage was more 'meaningful' than a percentage. I was in fact
referring to your "or maybe higher. We really don't know ...". What I'm
saying is that 28v on a 25v nominal tranny output is bad; higher, if you
think that it might be, is even worse.
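The percentage being traded here can be checked directly; a trivial sketch, taking the nominal loaded output as the reference:

```python
def regulation_pct(v_nominal: float, v_actual: float) -> float:
    """Rise above nominal, as a percentage of the nominal loaded output."""
    return 100.0 * (v_actual - v_nominal) / v_nominal

print(regulation_pct(25.0, 28.0))  # 12.0
```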
Exactly. If you work through the theory and the numbers you'll see that
what you're doing creates results that work fine until the mains sags, then
they go wrong. A designer has to make circuits that tolerate the usual
overvoltage and undervoltage limits, whereas when repairing, this is
optional in practice.
I agree, but there are limits, and sags of the sort of level that you are
suggesting are pretty significant, and much worse than I would have expected
over most of the civilised world. I see many, many group amps and hi-fis for
repair, all of which employ some kind of regulators, and most of which use
78 and 79 series ones, which, as you rightly say, are cheap. Most group amps
have semiconductor front ends these days, employing opamps run very
typically from +/- 15v rails derived from 78 / 79 regulators. It is
*exceedingly* rare for the input to these to be in excess of +/- 25v. In
practice, even if the regulators did drop out of tight control for brief
periods of excessive power line sag, it is unlikely to have a significantly
noticeable effect on the performance of the opamps, and I think that most
designers would be prepared to accept occasional poor regulation on these
occasions as a trade against excessive regulator dissipation in the vast
majority of circumstances.
Surely it should be as expected, else you've miscalculated.
No, because the real-world calculations will not match the theoretical
ones, because I still maintain that in cases of very light transformer
loading, the copper losses will *not* be significant. This means that you
need to calculate for nominal output volts plus the overvolts from the
transformer regulation factor. Perhaps I should have made it clearer and
said "how much higher the output voltage will be than the nominal
transformer output voltage would lead any calculations based on that to
suggest".
Here (EU) all new goods can be expected to function correctly with
real world mains over- and under- voltage.
NT
Being in the EU myself, I am aware of this, but I don't think that there are
too many places in the world where 15% sags are the norm. On a nominal 230v
supply, that represents around 35v. I would be pretty pissed off with my
power supply company if my house input was dropping below 200v on a regular
basis. In the U.S., such brown-out events do occur, but I'm willing to bet
that not too many items of equipment with linear, rather than switch mode,
supplies are able to cope with a drop of 15% in their incoming line voltage
without showing some operational signs of it. Maybe I'm wrong on that.
Perhaps someone in the U.S. would care to comment?
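The figures quoted are easy to verify; a trivial check of a 15% sag on a nominal 230v supply:

```python
nominal = 230.0
sag = nominal * 15.0 / 100.0  # 34.5 v -- the "around 35v" quoted above
sagged = nominal - sag        # 195.5 v -- indeed below 200v
print(sag, sagged)
```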
Arfa