skidmorebay
Well-known member
- Joined: Aug 1, 2011
- Messages: 134
Funny, I tried to reply to an older post along these lines but got a response recommending I start a new topic because the old one was over 120 days old.
Can anyone help me out with some basic measurement questions? This concerns how to measure the amplitude of a signal being sent or received when calibrating an audio processor with balanced I/O.
Last night I was trying to calibrate the MixBuzz compressor I just finished building and got a little confused.
I have two ways to send out a sine wave: I can use the unbalanced, 600 Ω output of an old function generator I have, or I can send a balanced signal from my audio interface using the signal generator in Pro Tools. I like to monitor the signal (outgoing and returning) on my oscilloscope.
So, for instance, if I need to send a 1 V RMS signal to the device being calibrated, am I right in assuming that:
- If I am sending the signal from the balanced audio interface, I should measure with the scope between the hot and cold pins (pins 2 and 3) of the audio interface's output and adjust until I read the 1 V RMS I need? Or should I ground pin 3 and just take the measurement between pin 2 and ground?
- If I am sending the signal from the unbalanced function generator, I should clip the "hot" lead to the hot pin of the device's input and the ground lead to the cold pin? What I have just done is place 1 V RMS between the + and − inputs, which is what a balanced input senses, yes? Or should I clip the ground lead of the function generator to pin 1 (GND) of the input of the device I am calibrating? Wherever I connect them, I would also place the oscilloscope probe across the same two points so I can monitor the signal I am feeding the device, yes?
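For reference, here is the arithmetic I'm working from, as a quick Python sketch. I'm assuming a symmetrically driven balanced output, where each leg carries half the differential swing; an impedance-balanced output would put the whole swing on pin 2, so a pin-2-to-ground reading would differ.

```python
import math

# Target: 1 V RMS differential (measured between pins 2 and 3).
target_rms = 1.0
target_peak = target_rms * math.sqrt(2)   # peak of the sine, ~1.414 V
target_pp = 2 * target_peak               # what the scope shows peak-to-peak, ~2.828 V

# Assumption: symmetrically driven output, so each leg swings half the
# differential voltage. A single-ended reading (pin 2 to ground) then
# shows half the differential value.
single_ended_rms = target_rms / 2         # 0.5 V RMS, pin 2 to ground

print(f"Differential: {target_rms:.3f} V RMS = {target_pp:.3f} Vpp on the scope")
print(f"Pin 2 to ground (symmetric driver): {single_ended_rms:.3f} V RMS")
```

This is why the combinations give such different readings: hot-to-cold shows twice what hot-to-ground shows on a symmetric output, before even getting into RMS versus peak-to-peak on the scope graticule.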
AND, when measuring the OUTPUT of the device being calibrated, shouldn't I put the oscilloscope probe between the hot and cold pins of the output XLR and read the device's output amplitude that way?
I get worried when performing these sorts of tests because each possible combination yields a very different result! I want to make sure I am feeding the correct tone to the device during the calibration procedure. As they say, "Garbage In = Garbage Out." If I am not properly measuring the signal I send in, then I can't trust the signal I am measuring on the output.
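In case it helps anyone checking my numbers: the conversion I keep double-checking is between RMS volts and dBu, since calibration procedures quote one or the other. A small sketch (using the standard 0 dBu = 0.775 V RMS reference; the exact value is √0.6 V):

```python
import math

def dbu(v_rms):
    """Convert an RMS voltage to dBu (reference: 0 dBu = sqrt(0.6) V ~ 0.775 V RMS)."""
    return 20 * math.log10(v_rms / math.sqrt(0.6))

print(f"1.000 V RMS = {dbu(1.0):+.2f} dBu")    # ~ +2.2 dBu
print(f"1.228 V RMS = {dbu(1.228):+.2f} dBu")  # ~ +4.0 dBu (common pro line level)
```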