I like Thor's solution of a 'master' supply running into a bunch of other converters as necessary BUT, the big BUT, they have to be effectively filtered and screened to a good standard afterwards.
Not so. They need to work at a sufficiently high frequency AND be synchronised.
These days 1MHz+ is pretty much standard for most up-to-date parts. Beat-notes result from non-synchronised switchers.
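To put a number on it (the frequencies here are purely illustrative, not from any datasheet): two nominally identical switchers running a fraction of a percent apart produce a difference tone squarely in the audio band.

```
# Two "1 MHz" switchers with ordinary part-to-part tolerance.
# Frequencies are made-up examples, not datasheet values.
f1 = 1_000_000   # Hz, converter A
f2 = 1_007_000   # Hz, converter B, running 0.7% fast
beat = abs(f1 - f2)
print(f"Beat note: {beat} Hz")   # 7000 Hz, right in the audio band
# Synchronise both converters to one clock and f1 == f2,
# so the difference tone disappears.
```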
Another point is to filter the noise and place the "noisy" supply onto its own copper island. None of the switching frequency should be allowed to escape.
This needs to be balanced somewhat with the need to drain EMI into earth/mains. Ideally we have a mains earth on the chassis, in which case we can use "soft earthing" to drain EMI from both the audio circuit and the switchers into earth.
In that case having chokes in the negative input line and in the supply grounds is possible, so our "noisy island" is referenced to earth but otherwise isolated for RF.
In cases without earth, it usually needs a ferrite bead in the negative supply line (which is referenced either to mains earth or coupled to mains as common mode) and a ferrite bead in one (or all) of the ground connections to the audio ground, to allow a path for RF for EMC testing.
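As a rough sketch of what "soft earthing" does (component values are assumptions for illustration, not a recommendation): a resistor in parallel with a capacitor from circuit ground to chassis earth looks resistive at mains/audio frequencies, limiting circulating hum current, while presenting a low impedance at switching frequencies to drain RF.

```
import math

R = 10.0      # ohms, assumed soft-earth resistor
C = 100e-9    # farads, assumed RF bypass capacitor

def z_mag(f):
    """|Z| of R in parallel with C at frequency f."""
    xc = 1.0 / (2 * math.pi * f * C)
    return (R * xc) / math.sqrt(R**2 + xc**2)

print(f"|Z| at 50 Hz : {z_mag(50):5.2f} ohm")   # ~10 ohm, resistor dominates
print(f"|Z| at 1 MHz : {z_mag(1e6):5.2f} ohm")  # ~1.6 ohm, capacitor drains RF
```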
Open it up to 300kHz and actually LISTEN to the noise: it is riddled with little birdy noises and 40dB MORE noise.
This can have causes other than leakage from power supplies; one example is noise shaping in AD or DA Conversion or in Class D Amplifiers. Many Delta-Sigma (DS) modulators, when running "idle", tend to create birdies (low-level idle tones) as well; they go away with sufficient signal, like a bit of pink noise dither a few dB above the modulator noise floor.
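A toy first-order Delta-Sigma modulator shows the mechanism (real converters are higher order and more sophisticated, but the idle-tone behaviour is the same in kind; all values here are illustrative):

```
import numpy as np

def ds1(x):
    """First-order Delta-Sigma modulator with a 1-bit (+/-1) quantiser."""
    acc, fb, y = 0.0, 0.0, np.empty_like(x)
    for n, xn in enumerate(x):
        acc += xn - fb                 # integrate the quantisation error
        fb = 1.0 if acc >= 0 else -1.0
        y[n] = fb
    return y

N = 1 << 16
dc = 0.01 * np.ones(N)                          # near-idle DC input
rng = np.random.default_rng(0)
tpdf = (rng.random(N) - rng.random(N)) * 0.05   # low-level TPDF dither

for name, x in (("idle", dc), ("idle + dither", dc + tpdf)):
    spec = np.abs(np.fft.rfft(ds1(x) * np.hanning(N)))
    spec[0] = 0.0                               # ignore the DC bin
    peak_db = 20 * np.log10(spec.max() / spec.mean())
    print(f"{name:14s}: strongest tone {peak_db:5.1f} dB above the mean floor")
# The undithered run shows discrete "birdies"; dither spreads them into noise.
```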
Also, self-oscillating Class D Amplifiers (which are pretty much the majority) must not only be synchronised to avoid beat-notes; due to their relatively low switching frequency, classic second-order LC filters with a (say) 40kHz -3dB point are woefully inadequate to suppress a switching frequency at (say) 400kHz, which further drops dynamically under load, as such filters only manage 40dB/decade of suppression.
So if we have a Class D Amplifier with 50V rails, a 400kHz switching frequency and a 40kHz output filter, the carrier sits one decade above the corner, giving only 40dB of suppression, so we will see around 500mV @ 400kHz at the output.
Further, at the top of a sinewave test signal the switching frequency will drop by 25...50%, so with a 50% drop we actually have 200kHz, where only about 28dB of suppression remains. Our RF level at the output thus varies dynamically from 500mV at 400kHz to 2,000mV at 200kHz WITH THE MUSIC SIGNAL.
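The arithmetic behind those numbers, as a quick sanity check (idealised 40dB/decade roll-off, ignoring filter peaking and component parasitics):

```
import math

def lc_atten_db(f, fc=40e3, order=2):
    """Asymptotic roll-off of an ideal LC low-pass: 20*order dB/decade above fc."""
    return 20 * order * math.log10(f / fc)

def residual_mv(rail, f_sw):
    # Using the rail voltage as the carrier amplitude, as in the text;
    # the square-wave fundamental (4/pi * rail) would be slightly higher.
    return 1e3 * rail / 10 ** (lc_atten_db(f_sw) / 20)

for f_sw in (400e3, 200e3):
    print(f"{f_sw/1e3:.0f} kHz carrier: {lc_atten_db(f_sw):5.1f} dB down "
          f"-> {residual_mv(50, f_sw):4.0f} mV at the output")
# 400 kHz: 40.0 dB down ->  500 mV
# 200 kHz: 28.0 dB down -> 2000 mV
```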
It will pass EMC testing, but what will happen to the sound? Anyone know the key mechanism for electrical distortion in transducers? Eddy current distortion. And it rises with both signal level and frequency, as a cubic function.
Obviously a correct output filter for a Class D Amplifier should suppress the switching frequency (or carrier as I like to call it, betraying time spent in RF) by 100dB+ under all conditions, for a "High Quality" system.
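For scale, under the same idealised roll-off assumptions as above: getting 100dB at the 200kHz worst-case carrier from a 40kHz corner takes an impractically high filter order, which is part of the case for a higher, synchronised carrier.

```
import math

f_carrier = 200e3    # worst case: 400 kHz carrier dropped 50% under load
fc        = 40e3     # audio-band corner
target_db = 100.0

decades = math.log10(f_carrier / fc)           # ~0.7 decades of headroom
order = math.ceil(target_db / (20 * decades))
print(f"Required filter order: {order}")       # 8 poles for 100 dB at 200 kHz

# Alternative: keep a 2nd-order (40 dB/decade) filter and move the carrier up:
ratio = 10 ** (target_db / 40)
print(f"Carrier must sit {ratio:.0f}x above the corner")  # ~316x, i.e. ~12.6 MHz
```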
Thor