ron_swanson
Well-known member
Hello,
I've just recently finished a stereo compressor build that I'm now in the middle of calibrating. I did a very rough, ball-park calibration run-through as a start, following the provided calibration document, to see whether everything seems to be working as expected - switch actions, meter activity, ball-park measurements, buzzes, noise, etc. All looks / sounds good so far...
Now that I'm trying to zero in the calibration more precisely, I'm finding an issue that maybe I've always had when calibrating previous builds, but that never appeared obvious, or that I misunderstood.
My electronics bench is not close to my studio DAW, so I've been using a standalone bench signal generator that only offers a 20V max range, which I now see may be a problem in how I've been applying it. Hoping someone can clarify in case I'm completely messing up my calibration steps in general.
Usually when calibrating, I'll be asked to set an input and / or measure an output signal of 0.775V (0dBu), which has seemed to do the job. When using the bench signal generator, getting 0.775V (0dBu) seems to require a setting of around 2.1V on the generator in order to read 0.775V AC on my DMM. That has seemed adequate on previous builds.
On this build though, one of the calibration steps asks that I send a 1kHz 7.746V (+20dBu) signal from my generator. My bench generator will only give me ~7VAC measured on my DMM at its 20V max setting. Close, but not the 7.746V desired.
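To check my own understanding, here's the conversion math I've been assuming (a quick Python sketch - treating the generator's amplitude readout as peak-to-peak is my assumption, not something I've confirmed in its manual):

import math

def dbu_to_vrms(dbu):
    # 0 dBu is defined as 0.7746 V RMS (1 mW into 600 ohms)
    return 0.7746 * 10 ** (dbu / 20)

for dbu in (0, 20):
    vrms = dbu_to_vrms(dbu)
    vpp = vrms * 2 * math.sqrt(2)  # peak-to-peak for a sine wave
    print(f"{dbu:+d} dBu = {vrms:.3f} V RMS = {vpp:.2f} V peak-to-peak")

# prints:
# +0 dBu = 0.775 V RMS = 2.19 V peak-to-peak
# +20 dBu = 7.746 V RMS = 21.91 V peak-to-peak

If that assumption holds, the 2.19V figure would line up with the ~2.1V setting I've been using for 0dBu, and +20dBu would need about 21.9V peak-to-peak - past my generator's 20V max, which would explain why it tops out around 7VAC on the DMM (20V / 2.83 ≈ 7.07V RMS).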
So, I moved over to my DAW (ProTools HD + I/O16), only to find that its built-in signal generator maxes out at 0.775V (0dBu).
With this in mind, do I have my dB, dBu, dBFS, dBV, and volts mixed up in my head and / or on my equipment? If not, is my bench signal generator not up to the task, requiring something more robust, or do I just need to look for some alternate signal output setting on my generator that I've overlooked previously?
Thanks in advance for any help here!
Cheers,
Greg