bjoneson
Well-known member
I think I might finally have my brain wrapped around all of this...
"Input Impedance" - the impedance as "seen" from the source feeding the input. The signal voltage must be developed across this impedance (load). It makes sense to me now, why most modern inputs are relatively high impedance. The higher the input impedance, the "easier" it is to develop the signal voltage across it (less current required). In modern circuits, the "signal" is the voltage, the driving current is much less relevant. So why not have an astronomical input impedance like 10M Ohms or something? The tradeoff is that the higher the impedance, the more noise is introduced (Johnson, etc...). So it seems a good practice is to have the minimal input impedance that the connected device can "comfortably" drive. All of this I, I now feel pretty confident with.
"Output Impedance / Source Impedance" - There's many, many articles / resources that define it. It's the impedance as "seen" from the outside looking back at the input. Where I've fallen down in understanding, is why does the source impedance matter? General rule of thumb is that it should be "low", but why? If we're just trying to develop a voltage across the "Input Impedance" of the receiving device, why does output impedance even matter? It's not been clear to me how it fits into the equation.
I finally think I have a grasp on it though. If I understand correctly, the signal voltage isn't developed only across the input impedance of the receiving device; it's developed across *both* the output impedance and the input impedance in series. This creates a voltage divider, with the "input" of the receiving device sitting at the node between the source impedance and the input impedance. The ratio between the source and input impedance determines the signal loss (the voltage dropped across the source impedance).
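If it helps, here's a minimal numeric sketch of that divider; the impedance values are just illustrative (e.g. a ~150 Ohm output driving a 10k bridging input), not taken from any specific gear:

```python
import math

def divider_gain(z_out: float, z_in: float) -> float:
    """Fraction of the source's open-circuit voltage that appears across the receiving input."""
    return z_in / (z_out + z_in)

def loss_db(z_out: float, z_in: float) -> float:
    """Level drop caused by the output/input impedance divider, in dB."""
    return 20 * math.log10(divider_gain(z_out, z_in))

# Example impedance pairs (ohms), chosen only for illustration
for z_out, z_in in [(150, 10e3), (600, 600), (10e3, 10e3)]:
    print(f"Zout={z_out:>6.0f}, Zin={z_in:>6.0f} -> "
          f"{divider_gain(z_out, z_in):.3f} of Vsource ({loss_db(z_out, z_in):+.2f} dB)")
```

A low output impedance into a much higher input impedance (150R into 10k) loses only about 0.13 dB, while matched impedances (600R into 600R, or 10k into 10k) cost a full 6 dB, which is exactly why the "low output, high-ish input" rule of thumb exists.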
Have I graduated from Impedance Apprentice, to Impedance Journeyman yet?
"Input Impedance" - the impedance as "seen" from the source feeding the input. The signal voltage must be developed across this impedance (load). It makes sense to me now, why most modern inputs are relatively high impedance. The higher the input impedance, the "easier" it is to develop the signal voltage across it (less current required). In modern circuits, the "signal" is the voltage, the driving current is much less relevant. So why not have an astronomical input impedance like 10M Ohms or something? The tradeoff is that the higher the impedance, the more noise is introduced (Johnson, etc...). So it seems a good practice is to have the minimal input impedance that the connected device can "comfortably" drive. All of this I, I now feel pretty confident with.
"Output Impedance / Source Impedance" - There's many, many articles / resources that define it. It's the impedance as "seen" from the outside looking back at the input. Where I've fallen down in understanding, is why does the source impedance matter? General rule of thumb is that it should be "low", but why? If we're just trying to develop a voltage across the "Input Impedance" of the receiving device, why does output impedance even matter? It's not been clear to me how it fits into the equation.
I finally think I got a grasp on it though. If I understand correctly, technically the output isn't just developed across the input impedance of the sending device. It's developed across *both* the output impedance and the input impedance in series. This creates a voltage divider with the "input" of the receiving device at the node between the source impedance and the input impedance. The ratio between the source and input impedance determines the signal loss (voltage drop).
Have I graduated from Impedance Apprentice, to Impedance Journeyman yet?