There are several line-driver-style op amps that can drive loads as low as 50 ohms that would work better.
"Better", how? And also, why???
I was under the impression that op amps, for the most part, have closed-loop output impedances in the milliohm range at audio frequencies, so for all intents and purposes (of line-level audio driving) they're negligible. The series 47-150 ohm build-out resistors added at the output are there to isolate the op amp from the capacitance of the unknown-length cables it may be asked to drive, which could otherwise make the output stage unstable.
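To put rough numbers on that cable-capacitance point, here's a quick sketch of the first-order pole formed by the build-out resistor and the cable capacitance. The ~100 pF/m figure and the 10 m run are assumptions for illustration, not measured values:

```python
import math

def pole_hz(r_ohms, c_farads):
    """Corner frequency of the low-pass formed by a series
    resistor driving a capacitive load: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

# Assumed: ~100 pF per metre of shielded cable, 10 m run -> 1 nF total
cable_c = 10 * 100e-12

for r in (47, 150):
    f = pole_hz(r, cable_c)
    print(f"{r} ohm build-out into 1 nF of cable: pole at ~{f/1e6:.2f} MHz")
```

Even the 150 ohm case puts the pole around 1 MHz, way above the audio band, so the resistor costs essentially nothing in bandwidth while keeping the capacitance outside the op amp's feedback loop.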
I could be wrong, but I thought line-level audio and welding were two very different applications of electricity...
But going to a higher input impedance defeats the intended balanced operation by definition.
Once again, how (does it defeat "the intended balance operation")?
The worst case was a studio monitor with a 47k input impedance, which was fixed by putting a 150 ohm resistor across pins 2 and 3.
What sort of problem was that supposed to solve? And how and why was 150 ohms the "magic bullet" value, as opposed to, I dunno, 10k, 4.7k, 1k?
I'm all for "if it's stupid and it works, it ain't stupid", but the "why" behind it all can be useful to know for later - the method may need adapting to a new problem, where the initial (empirical?) solution may no longer be suitable.