Preamp difference: if it's not the frequency, not the slew rate, and not the harmonics, what is it?

GroupDIY Audio Forum

Are you talking about deviation from expected response, or normal phase shift wrt frequency.
The normal phase shift wrt frequency. This is rarely uniform across the frequency response of the preamp (or most other devices).

Phase response or phase shift is predictable
Yes. It's a mathematical function, and can be modelled fairly easily assuming one has the full specs of the components used ... including ESL ;)
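To illustrate the point that phase response is just a mathematical function of the circuit, here's a minimal sketch for a first-order RC high-pass (the component values are illustrative, not from any specific preamp):

```python
import math

def highpass_phase_deg(f_hz, r_ohm, c_farad):
    """Phase lead (degrees) of a first-order RC high-pass at frequency f."""
    fc = 1.0 / (2.0 * math.pi * r_ohm * c_farad)  # -3 dB corner frequency
    return math.degrees(math.atan(fc / f_hz))

# Illustrative values: 10 uF into 160 ohm gives a corner near 100 Hz
fc_example = 1.0 / (2.0 * math.pi * 160.0 * 10e-6)
print(f"corner ~ {fc_example:.1f} Hz")
# At the corner the phase lead is exactly 45 degrees
print(f"phase at fc:    {highpass_phase_deg(fc_example, 160.0, 10e-6):.1f} deg")
# A decade above the corner it has fallen to about 5.7 degrees
print(f"phase at 10*fc: {highpass_phase_deg(10 * fc_example, 160.0, 10e-6):.1f} deg")
```

Given R, C, and the topology, the whole curve is determined; there's nothing mysterious about it.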
 
By definition, a DC-coupled amplifier passes 0 Hz.
Yes indeed, and many instrumentation and measurement applications use this approach. For audio applications, however, almost all preamps/amplifiers that are DC-coupled internally have some form of coupling capacitor on their input, to prevent small DC offsets on the input signal from being amplified into large DC offsets later in the signal path, which at best would likely cause asymmetrical clipping and at worst would quickly burn out speaker voice coils. Hence their frequency response doesn't usually extend down to 0 Hz.
Which makes them AC-coupled. As such, the LF phase varies in a predictable way that is determined by the cut-off frequency.
 
Electronics might disagree ...

Simple RC network; the left simulation is at 100 Hz, the right-hand one at 1 kHz.

[Attachments 129859 and 129860: RC network simulation screenshots]
Can you determine the phase difference between the 100Hz and the 1kHz tones after they passed through this shelving high-pass filter?
I can't.
Nobody can, because two signals of different frequencies can be neither in phase nor out of phase, nor anything in between.
Phase difference is a concept valid only for signals of identical frequency.
Correct simulation, wrong conclusion.
 
Can you determine the phase difference between the 100Hz and the 1kHz tones after they passed through this shelving high-pass filter?
It's certainly measurable and can be calculated. In the above example, the phase shift at 100 Hz is about 20 degrees, and at 1 kHz it's somewhere around 2 degrees.
two signals of different frequency cannot be neither in-phase nor out-of-phase
try switching the wires on your mids so they're out of phase with your woofers ... ;)
Phase difference is a concept valid only for signals of identical frequency
the phase shift exhibited by signals of different frequencies on the same signal path is very much a valid concept, as my two screen clips above demonstrate
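The two figures quoted above (~20° at 100 Hz, ~2° at 1 kHz) are consistent with a single first-order high-pass. A quick check, assuming a corner frequency of about 36 Hz (an assumption chosen to reproduce the quoted numbers, not taken from the posted schematic):

```python
import math

def hp_phase_deg(f_hz, fc_hz):
    # Phase lead of a first-order high-pass: arctan(fc / f)
    return math.degrees(math.atan(fc_hz / f_hz))

fc = 36.4  # assumed corner frequency (Hz) that reproduces the quoted figures
print(hp_phase_deg(100.0, fc))   # ~20 degrees at 100 Hz
print(hp_phase_deg(1000.0, fc))  # ~2 degrees at 1 kHz
```

So each frequency is shifted by a predictable, calculable amount on the same path.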
 
it's certainly measurable and can be calculated. In the above example, at 100Hz the phase shift's about 20 degrees and at 1Khz it's somewhere around 2 degrees.
You can't measure the phase difference between signals of different frequency.
try switching the wires on your mids so they're out of phase with your woofers ...
Completely unrelated. Speakers are non-minimum-phase. The effects you hear are those due to combining two different paths.
phase shift exhibited by signals of different frequencies on the same signal path is very much valid, as my two screen clips above demonstrate
Your screen demonstrates that signals of different frequencies are shifted by a different amount. I never questioned that.
You're trying to demonstrate that it's the probable cause for "bad" sound, when it's a normal consequence of the finite frequency response of any piece of equipment.
 
Check this circuit.

Phase shift at 20 Hz and 20 kHz is the same in magnitude, except one is leading and the other is lagging.
And the phase shift at both -3 dB points is predictably 45°.

Exactly my point. If "one is leading and the other is lagging", the relative phase shifts at the respective frequencies differ. The circuit you posted shows one shift of a positive amount and the other of a negative amount. While their absolute values may be the same, if one is positive and one is negative, the relative phase shift between those two frequencies is the sum of their absolute values, i.e. 90 degrees.
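The 90-degree figure is easy to verify numerically. A sketch assuming the posted circuit behaves like a first-order high-pass at 20 Hz cascaded with a first-order low-pass at 20 kHz (a plausible reading of the -3 dB points, not a confirmed schematic):

```python
import math

def bandpass_phase_deg(f, f_lo, f_hi):
    # First-order high-pass at f_lo cascaded with first-order low-pass at f_hi
    return math.degrees(math.atan(f_lo / f) - math.atan(f / f_hi))

lo, hi = 20.0, 20000.0            # the two -3 dB points
p_lo = bandpass_phase_deg(lo, lo, hi)   # ~ +44.9 deg (leading)
p_hi = bandpass_phase_deg(hi, lo, hi)   # ~ -44.9 deg (lagging)
print(p_lo, p_hi, p_lo - p_hi)          # difference ~ 90 deg
```

Whether you call that a "phase difference between two frequencies" or just two different shifts on one path is exactly the semantic dispute in this thread.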
 
The article also clearly states that the perception of phase distortion in actual music was extremely low and that only certain types of test signals gave a high perception rate, this being amplified on headphones. Once you bring acoustic environments and speaker construction and positioning into play the marginal audibility of phase distortion tends to be a non issue - add a bit of reverb into a mix and you may as well forget it.
"Extremely low" is still more than zero, so the effect is perceptible. I have to say, my ears might not be able to specifically attribute any effect to phase distortion, or even detect it in the first place, but by changing the phase relationships of different frequencies you will eventually change the tonal quality of the sound.

Consider an open A string being plucked: the waveform generated is close to a sine wave. If that A string is mounted on a sound box, the tonal quality changes because some harmonics are at different levels, although the fundamental note remains an A. Now think about a trumpet playing the same note: the sound heard and the waveform are quite different, although the fundamental frequency is the same.

I'm not suggesting phase distortion will make a harp sound like a clarinet, but slight changes in the phase-frequency relationship will inevitably alter tonal quality. In other words, phase distortion is audible to some people, and by reducing it we clean up the sound.
 
You should carefully read this article and the related literature. The effects of phase distortion, i.e. a phase response differing from minimum-phase behaviour, are audible by the way the peak factor changes, which results in triggering non-linearities in the reproduction chain, including air compression and audition.
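The peak-factor point is easy to demonstrate: two signals with identical magnitude spectra but different phase relationships can have very different crest factors. A minimal sketch with four harmonics (the specific harmonic mix is illustrative):

```python
import numpy as np

# Same four harmonics and amplitudes, two different phase relationships:
# the magnitude spectrum is identical, but the crest (peak) factor is not.
t = np.linspace(0.0, 1.0, 10000, endpoint=False)
harmonics = [1, 3, 5, 7]

sine_phase = sum(np.sin(2 * np.pi * k * t) / k for k in harmonics)
cosine_phase = sum(np.cos(2 * np.pi * k * t) / k for k in harmonics)

def crest_factor(x):
    """Peak-to-RMS ratio of a waveform."""
    return np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2))

print(crest_factor(sine_phase))    # lower peak for the same RMS
print(crest_factor(cosine_phase))  # phases align at t=0 -> much higher peak
```

The RMS (and hence the level meters) reads the same for both, but the cosine-phased version has a far higher peak, so it will hit clipping or other non-linearities much sooner.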
On the topic of “punch and smoothness differences” in preamps ...
 
I think the effects of mic placement in a studio would far outweigh any differences in a preamp. Also, try putting a linear-phase EQ on a single high-attack sound like a snare or hats and listen to the result, then compare by switching to a normal (minimum-phase) EQ: the linear phase sucks the life out of the sound in the mid to high areas, and audible ringing can occur, especially at low frequencies. A lot of software EQs have a linear-phase button these days. Linear phase can work when, say, double-miking a guitar cab or other instruments, but it's completely unnecessary for a single-miked source.
 