Samuel Groner said:
Iron/ferrite core inductors will generate distortion if significant current is flowing through them. I don't have a good estimate for the order of magnitude at hand, but as quite a bit of current flows in this position, air-core might be beneficial.
I agree, and want to reiterate the concept of "significant current". The key here seems to be using an inductor that saturates at current levels 3-5 orders of magnitude above the expected signal currents. When this is done, I have empirically found that a ferrite bead rated for 3 A has absolutely no effect on a line-level signal of around a volt, loaded into something as low as 600 Ω, down to the residual level of an APx555 with heavy FFT averaging, basically to the -150 dBc level.
Inductor distortion seems to rise rapidly with signal level, and conversely, it seems to fall just as rapidly with decreasing signal level. So, if you stay far away from saturation, it's really difficult to detect any effect whatsoever with current test gear. One would think that "iron hysteresis" would cause problems, but again, this doesn't seem to result in "bad numbers".
I think there's an aversion to using inductive components for high-resolution audio because of these suspicions, and I'm not sure that they're well founded, either by measurements or by listening tests. Sure, inductors with high-permeability cores can be misapplied, but just the same, if some simple concepts are followed, it seems to me that they can be used without causing detectable distortion.
I'm not sure exactly what the margin between signal current and inductor saturation should be, but given that "easy to apply" parts offer current overload points of 3 A and we can use them for line-level signals of around 1 mA and get great results, it might not require so much thought. I can't measure problems, and none of the simple models predict problems, so maybe it just works well?
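To put a number on that margin, here's a minimal sketch of the arithmetic. The figures (1 V RMS line level, 600 Ω load, 3 A bead rating) are the ones mentioned above; everything else is just Ohm's law and a log.

```python
import math

# Saturation-margin arithmetic for the example discussed above.
# Assumed numbers: ~1 V RMS line-level signal into a 600 ohm load,
# through a ferrite bead rated for 3 A.
v_signal = 1.0    # V RMS, line level
r_load = 600.0    # ohms, heavy line-level load
i_sat = 3.0       # A, bead current rating

i_signal = v_signal / r_load   # signal current through the bead
margin = i_sat / i_signal      # headroom to the current rating
orders = math.log10(margin)    # same margin in orders of magnitude

print(f"signal current: {i_signal * 1e3:.2f} mA")
print(f"margin: {margin:.0f}x ({orders:.1f} orders of magnitude)")
# signal current: 1.67 mA
# margin: 1800x (3.3 orders of magnitude)
```

So even the worst case quoted (1 V into 600 Ω) sits a bit over three orders of magnitude below the bead's rating, which is consistent with the 3-5 orders-of-magnitude rule of thumb.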