Major Myth Regarding Capacitance
I only heard about this myth recently, and while I can imagine how it came about, it's completely bogus. Some people claim that as the capacitance is increased for a transformer of a given size, the peak current is also increased. There are conflicting additional claims that the RMS input current to the transformer either A) does, or B) does not increase as well. Added to this is a further claim that the transformer will overheat because the current is higher.
In essence, this is all complete rubbish. Incorrect measurement techniques or bad simulation practices may lead one to believe that this is the case, but it is not. The important thing is that we can only examine the steady-state current - inrush current will quite obviously be greater with larger capacitance, but this is a transient event. Because transient events are just that - transient - there is little point analysing them and making absolute claims; every transient will be different. Transformers can survive massive short-term overloads without any harm, and a soft start circuit will tame the transient currents to something less scary.
The steady-state conditions are applicable to most power supplies within about 100ms after power is applied. If one were to use a 2 Farad capacitor on a 15VA transformer, this time will be extended considerably, but this would be silly, and we are not interested in the effects of silly combinations.
If we use the transformer/rectifier circuit described above as an example, we can either measure or simulate the effects of using a much larger than normal capacitor. As shown in Figure 2, the selected capacitor is 4,700µF and the load current is 1.44A - all fairly normal. The transformer secondary current is 2.7A RMS, so a 120VA transformer is well within its ratings. Even overloads are not a problem - if they are infrequent, the transformer will be perfectly happy as long as it has a chance to cool down so its maximum temperature is never exceeded. A fan can be used to increase the effective VA rating of most transformers, although the improvement varies with the transformer's construction and mounting.
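This is easy to verify numerically. The sketch below is a minimal time-stepping model of such a circuit, not a definitive simulation of the one above: the 50Hz mains frequency, the 0.5 ohm lumped source resistance and the 1.4V combined diode drop of the bridge are plausible assumptions, not measured values. Even so, with 4,700µF and a 20 ohm load it should produce numbers in the same region as the figures quoted.

```python
# Minimal sketch of a transformer / bridge rectifier / reservoir cap supply.
# ASSUMPTIONS (not from the article): 50Hz mains, 0.5 ohm lumped source
# resistance, 1.4V total diode drop, forward-Euler integration at 10us steps.
import math

F_MAINS   = 50.0      # mains frequency, Hz (assumed)
V_SEC_RMS = 25.0      # secondary voltage, V RMS (from the text)
R_SOURCE  = 0.5       # lumped winding/source resistance, ohms (assumed)
V_DIODE   = 1.4       # two conducting bridge diodes in series (assumed)
R_LOAD    = 20.0      # load resistance, ohms (from the text)
DT        = 10e-6     # integration time step, s

def simulate(cap_farads, t_total=1.0, t_settle=0.8):
    """Return (peak A, RMS A, mean DC V, p-p ripple V), measured only after
    t_settle so the inrush transient is excluded and steady state is reported."""
    v_cap = 0.0
    peak_i = sum_i2 = sum_v = 0.0
    n = 0
    v_min, v_max = float("inf"), 0.0
    t = 0.0
    while t < t_total:
        v_rect = abs(V_SEC_RMS * math.sqrt(2) * math.sin(2 * math.pi * F_MAINS * t))
        drive = v_rect - V_DIODE - v_cap
        i_sec = max(drive, 0.0) / R_SOURCE   # bridge conducts only when forward biased
        v_cap += (i_sec - v_cap / R_LOAD) * DT / cap_farads
        if t >= t_settle:
            peak_i = max(peak_i, i_sec)
            sum_i2 += i_sec * i_sec
            sum_v += v_cap
            n += 1
            v_min = min(v_min, v_cap)
            v_max = max(v_max, v_cap)
        t += DT
    return peak_i, math.sqrt(sum_i2 / n), sum_v / n, v_max - v_min

if __name__ == "__main__":
    pk, rms, vdc, rip = simulate(4_700e-6)
    print(f"4,700uF: peak {pk:.1f}A, RMS {rms:.2f}A, DC {vdc:.1f}V, ripple {rip:.2f}V p-p")
```

The source resistance is what matters here - it is the transformer's winding resistance (plus the reflected primary) that limits the charging pulses, which is why the results below are far less dramatic than the myth suggests.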
No problems so far. However, audiophile expectations will often demand that the capacitance be at least 10,000µF, with around 50,000µF for passable performance, but (of course) 100,000µF would be much better. This is (IMO) rather pointless. I won't argue with 10,000µF, but anything more is wasted and unnecessary.
Now, according to the myth (sorry - 'theory'), this extra capacitance will cause the transformer's RMS current to increase, accompanied by a dramatic increase (or not) of the peak current - all under steady-state conditions. It simply doesn't happen that way.
Adding more capacitance will ... (each point is borne out by the capacitance sweep after this list)
Decrease the ripple voltage
Increase the average DC voltage very slightly
Increase the inrush current (dramatically for larger capacitance values)
Barely affect the steady-state RMS current
Have almost zero effect on the steady-state peak current
Not cause the transformer to overheat, provided sensible limits are placed on the cap value
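Appended to the sketch above (it reuses the simulate() helper, under the same assumed source resistance and diode drop), a quick sweep puts numbers on the list. What to expect from this simplified model is a sharp drop in ripple, a slightly higher DC voltage, and only small movements in the RMS and peak currents.

```python
# Sweep the reservoir capacitance and compare steady-state behaviour.
# Uses simulate() from the sketch above; the same assumptions apply.
for uf in (4_700, 10_000, 50_000, 100_000):
    pk, rms, vdc, rip = simulate(uf * 1e-6)
    print(f"{uf:>7,}uF: peak {pk:5.1f}A  RMS {rms:4.2f}A  "
          f"DC {vdc:5.1f}V  ripple {rip:5.2f}V p-p")
```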
What is sensible? As with all things, it depends on the context. For a 25V transformer providing a worst-case rectified and smoothed current of 1.44A into a 20 ohm load (as described above), a sensible upper limit would be perhaps 50,000µF, although even 100,000µF will cause no harm. Sensible values are those that respect the law of diminishing returns, where, after a certain point, further increases yield little additional benefit.
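The diminishing returns are easy to put numbers on with the standard full-wave ripple approximation, Vripple ≈ I / (2 × f × C). Assuming 50Hz mains and the 1.44A load above (the frequency is an assumption, not from the circuit description):

```python
# Back-of-envelope peak-to-peak ripple: Vripple ~= I / (2 * f * C)
# for a full-wave rectifier. Assumes 50Hz mains and a constant 1.44A load.
I_LOAD, F_MAINS = 1.44, 50.0
for uf in (4_700, 10_000, 50_000, 100_000):
    ripple = I_LOAD / (2 * F_MAINS * uf * 1e-6)
    print(f"{uf:>7,}uF -> ~{ripple:.2f}V p-p")
```

Going from 4,700µF to 10,000µF takes the ripple from about 3V down to about 1.4V; the next tenfold outlay, to 100,000µF, only buys another 1.3V or so. Each halving of ripple costs a doubling of capacitance - the law of diminishing returns in action.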