Properly measuring test tones

GroupDIY Audio Forum


skidmorebay

Funny, I tried to reply to an older post along these lines but got a response recommending I start a new topic because the old one was over 120 days old.

Can anyone help me out with some basic measurement questions? This concerns how to measure the amplitude of a signal being sent or received when calibrating an audio processor with balanced I/O.
Last night I was trying to calibrate the MixBuzz compressor I just finished building and got a little confused.

I have two ways to send out a sine wave: I can use the unbalanced, 600 ohm output of an old function generator I have, or I can send out a balanced signal from my audio interface, using the signal generator in Pro Tools. I like to measure the signal (outgoing and returning) on my oscilloscope.

So, for instance if I need to send a 1V RMS signal to the device being calibrated I am right in assuming that:
  • If I am sending the signal from the balanced audio interface, I should measure with the scope between the hot and cold pins (pin 2 and 3) of the audio interface and adjust until I get the 1V RMS I need? Or, should I ground pin 3 and just take my measurement between pin 2 and ground?
  • If I am sending the signal from the unbalanced function generator, I should clip the "hot" lead of the probe to the hot pin of the device's input, and clip the ground lead to the cold pin? What I have just done is place 1V RMS between the + and - inputs, which is what a balanced input senses, yes? Or, should I clip the ground lead of the function generator to Pin 1 (GND) of the input of the device I am calibrating? Wherever I put them, I would also place the oscilloscope probe between the same two places so I can monitor the signal I am giving the device, yes?

AND, when measuring the OUTPUT of the device being calibrated, should I not put the oscilloscope probe between the hot and cold pins of the output XLR and read the amplitude of the device's output that way?

I get worried when performing these sorts of tests because each possible combination yields a much different result! I want to make sure I am feeding the correct tone to the device during the calibration procedure. As they say, "Garbage In = Garbage Out." If I am not properly measuring the signal I send in, then I can't trust the signal I am measuring on the output.

 
What you have worked out is basically correct. Connect input signals across pins 2 and 3. Measure output signal between pins 2 and 3. If you have truly balanced floating inputs and outputs this will work every time no matter what you connect to the input and output.
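To put rough numbers on the scope reading itself, here is an illustrative Python sketch (my own addition, not from the posts; the per-leg figure assumes a symmetrically driven output, which not every "balanced" output actually is):

```python
import math

def sine_rms_to_vpp(v_rms):
    """Peak-to-peak amplitude of a sine wave with the given RMS value."""
    return 2 * math.sqrt(2) * v_rms

def differential_rms(v_pin2_rms, v_pin3_rms):
    """RMS level across pins 2 and 3, assuming the two legs carry the
    same signal driven symmetrically (180 degrees out of phase)."""
    return v_pin2_rms + v_pin3_rms

# A 1 V RMS sine shows about 2.83 V peak-to-peak on the scope.
print(round(sine_rms_to_vpp(1.0), 2))

# On a symmetrical balanced output, 0.5 V RMS per leg (measured to
# ground) adds up to 1 V RMS across pins 2 and 3.
print(differential_rms(0.5, 0.5))
```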

The problem is that a lot of 'electronically balanced' inputs and outputs are not truly balanced, not floating, or both. This means that when you connect an unbalanced source, like your sig gen, to the input, or an unbalanced measurement device, like your scope, to the output, you may or may not get the signal level you expect, because the unbalanced device probably connects one input or output leg to ground.

With transformer inputs and outputs this does not matter. They do not care if you ground one of the signals and they work just the same when you do. With electronically balanced ins and outs you can never be sure.

So, my advice to you is to connect your sig gen and scope via transformers so that they present truly balanced, floating outputs and inputs. You can use a 600:600 type for your sig gen and a 10K:10K type to interface to your scope. My sig gen has a balanced output, but my scope and distortion meter I feed via a 10K:10K Sowter transformer.

Cheers

Ian

 
This thread was really helpful.
I have a question regarding measuring the response of mic preamps. Given that the usual impedance of a microphone is 200 ohms, if I am using an audio soundcard to provide a sweep from my Mac (fuzzmeasure), should I do something to change the impedance of the output from the audio interface so that it matches the input of a mic preamp (with transformer input)?

regards
Taz
 
Good practice is to make a simple resistor pad that not only converts the higher level signal down to microphone level, but also presents a low nominal source impedance (150-200 ohms) to the mic preamp input.

Using a 3 resistor pad can help reduce ground potential issues, as they get divided down by the pad too. 

JR
 
Unless your sig gen has a nice output attenuator, it is best to pad down its output. You can make up an XLR lead with a 200 ohm resistor across pins 2 and 3 at the pre end and 10K resistors in series with pins 2 and 3. This will give very close to 40dB of attenuation and present a 200 ohm source to the mic pre.
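For anyone who wants to check those figures, here is a small Python sketch of that pad (my own illustration; it treats the sig gen's output impedance as negligible and ignores the preamp's input load, which will pull the numbers slightly):

```python
import math

def pad_attenuation_db(r_series_each, r_shunt):
    """Attenuation of a pad with one series resistor per leg and a
    shunt resistor across pins 2 and 3, driving an unloaded output."""
    ratio = r_shunt / (2 * r_series_each + r_shunt)
    return 20 * math.log10(ratio)

def pad_source_impedance(r_series_each, r_shunt):
    """Impedance the mic pre sees looking back into the pad."""
    r_series_total = 2 * r_series_each
    return (r_series_total * r_shunt) / (r_series_total + r_shunt)

# 10K in series with each leg, 200 ohms across the pre end:
print(round(pad_attenuation_db(10e3, 200), 1))   # about -40 dB
print(round(pad_source_impedance(10e3, 200)))    # about 198 ohms
```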

Cheers

Ian
 
Thanks for all the info here. For some reason I stopped receiving notifications. I had thought that no one replied to this until I checked it this morning!

I really like the idea of using transformers to interface with equipment I am testing. I'll give that a try.

Here's another clue for people dealing with these issues: I found this on the web in a manual for an Agilent function generator. I noticed the same issue recently, that when measuring the output of my generator with a scope I was getting twice the voltage I expected...

"As the default setting, Agilent function generators display voltage as though it were terminated into a 50-ohm load. When a high-impedance device such as an oscilloscope is used to measure the output of the function generator, the waveform appears to be twice the voltage shown on the display of the function generator. To remedy the discrepancy, you may be able to change the oscilloscope's input impedance from standard high impedance to a 50-ohm termination (not all oscilloscopes allow you to do this). Another solution is to add a 50-ohm feedthrough to the oscilloscope's input BNC."
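That quirk is just a voltage-divider effect, and it is easy to sanity-check with a couple of lines of Python (illustrative only; real generators and scopes are not perfectly ideal):

```python
def load_voltage(v_open_circuit, r_source, r_load):
    """Voltage across the load in a simple source/load voltage divider."""
    return v_open_circuit * r_load / (r_source + r_load)

# A generator "set to 1 V" with a 50-ohm display convention actually
# produces 2 V open circuit behind its 50-ohm output impedance.
v_open = 2.0

# Into the assumed 50-ohm load, the displayed 1 V appears:
print(load_voltage(v_open, 50.0, 50.0))

# Into a 1 Mohm scope input, nearly the full 2 V appears, twice
# the displayed value:
print(round(load_voltage(v_open, 50.0, 1e6), 3))
```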

thanks!
 