midwayfair
Well-known member
I've been wondering why gear designed to deliver a signal of 3.472V peak to peak tends to run on supply voltages many times that high, usually 24-36V. The signal is going into digital converters that often don't want anything even as high as 8V, which is a pretty hard ceiling even when transients are preserved.
Do the high supply voltages have something to do with the high output current requirements of older equipment with very low-impedance inputs? I know the clean range of a device like a transistor or op amp is smaller than the supply voltage span, but it doesn't seem like it would shrink to 1/10 of the supply range even in the worst of circumstances.
The only other thing I could think of is that maybe it's to avoid excess heat from the power transformers dropping the voltage further, but then again, high-current output stages probably create more heat than a bigger supply voltage drop would.
I can't really find any explanations of this so I'd appreciate a bit of insight. Thanks!