I am trying to calibrate line in/out on a freshly refurbed/recapped/gone-through Sound Workshop Series 40 console. The VCAs have been bypassed in favor of regular audio faders, and cap values have been doubled. Otherwise it's stock.
I was given a process that doesn't make sense to me, but I'll summarize it here in case it at least points down a helpful path. Might be worth skipping over all this, though....
-----------------
1. Set sig gen to line level (1.23VAC RMS between ground and tip @ 1kHz).
2. Set a channel to tape in mode (which monitors the line in BEFORE the channel fader), with the tape level pot cranked and pan pot all the way left.
3. Run the line-level signal into the tape/line input.
4. Monitor the tape path on the master section control room out with the console master fader all the way up (0 marking).
5. Take note of what the left master VU meter reads (let's say it's +14)....doesn't matter if this is accurate for now. Just trying to establish a level.
6. Switch the channel to listen in line-in mode (same exact input source, but now monitoring post-fader) and hook the channel direct out up to a VU meter or computer to monitor level. (I'm actually using a true-RMS DMM and calling 1.23VAC between tip and ground line level/0dB/+4dBu.)
7. Now MOVE THE CHANNEL FADER to whatever arbitrary position makes the signal on the left master VU the same as before (so +14 in this example).
8. Now here's where I get lost. We've set the channel fader to some arbitrary position to get +14 on the left master VU, and now we adjust the line trim cal pot until we see line level on the DMM monitoring the direct out (which is post-fader). So we've just set the direct out to line level, great....but only at whatever arbitrary position the fader happened to land on in step 7. Assuming that doesn't happen to be 0, then when we move the channel fader to the 0 marking, the direct output will no longer be at line level/0dB/+4dBu.
9. But let's continue. "Now you can calibrate 0dB on the meter"....I'm guessing this means adjust the LEDs to indicate 0dB....but there is another cal pot on the meter which also affects post-fader level (TK cal, which just regulates the level going to the output card that feeds the direct out). So if I adjust this, it'll negate the line level we just set in step 8.
10. Apparently now the channel is calibrated, and you move on to calibrating the master section (which also doesn't seem to work). You then use this "properly calibrated" channel to verify the master section is correctly calibrated, and then calibrate the rest of the channels in the same fashion, except now using the master section VU meters as your line-level reference instead of an external DMM, DAW, or VU meter.
------------------------------------
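To show the step-8 problem concretely, here's a quick sanity check of the numbers. This is just standard dBu arithmetic (0 dBu = 0.7746 V RMS), nothing Series-40-specific, and the -6 dB fader position is a made-up example:

```python
import math

DBU_REF_V = 0.7746  # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)

def v_to_dbu(v_rms):
    """Convert an RMS voltage to dBu."""
    return 20 * math.log10(v_rms / DBU_REF_V)

def dbu_to_v(dbu):
    """Convert dBu back to RMS volts."""
    return DBU_REF_V * 10 ** (dbu / 20)

# The sig gen's 1.23 V RMS is indeed +4 dBu:
print(round(v_to_dbu(1.23), 2))  # 4.02

# Step 8's catch: suppose the fader happened to land at -6 dB when the
# line trim was adjusted so the direct out read +4 dBu. Returning the
# fader to its 0 marking adds that 6 dB back:
fader_pos_db = -6.0
direct_out_at_cal = 4.0  # dBu, as set by the trim pot in step 8
direct_out_at_zero = direct_out_at_cal - fader_pos_db
print(direct_out_at_zero)                       # 10.0 (dBu, not +4)
print(round(dbu_to_v(direct_out_at_zero), 2))   # 2.45 (V RMS on the DMM)
```

In other words, unless the fader happens to sit exactly at its 0 marking in step 7, the trim set in step 8 only holds at that one fader position.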
Assuming that doesn't lead anywhere and I need to find a process starting from scratch, here is a link to a PDF with relevant info: https://drive.google.com/file/d/1pHLLHGsSWBg3DNyJGFTvmsqpmszsa_jf/view?usp=sharing
Page one: Block diagram showing basic signal flow in the console.
Page two: The only tiny piece of info I could find about calibration in the precious little Series 40 documentation I have.
Page three: Input module schems (you'll see the tape/line cal pot on the OL gain of the tape/line driver IC)
Page four: Output module schems (the TK cal pot seems to have the same effect as the tape/line cal pot on the channel card).
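The step-9 conflict can be modeled the same way. This is just my reading of the block diagram (tape/line cal on the channel card, then the fader, then TK cal on the output card, all feeding the direct out); the pot names and ordering are assumptions, but since each stage is just a gain in dB, the point holds regardless of order:

```python
def direct_out_dbu(source_dbu, line_cal_db, fader_db, tk_cal_db):
    """All stages in the post-fader path are just gains in dB, so they sum."""
    return source_dbu + line_cal_db + fader_db + tk_cal_db

# Step 8: with the fader parked at an arbitrary -6 dB, the line cal
# pot is trimmed (+6 here) until the direct out reads +4 dBu:
out = direct_out_dbu(source_dbu=4.0, line_cal_db=6.0, fader_db=-6.0, tk_cal_db=0.0)
print(out)  # 4.0

# Step 9: touching TK cal afterward (say +2 dB) shifts the direct out
# too, undoing the +4 dBu just established in step 8:
out = direct_out_dbu(source_dbu=4.0, line_cal_db=6.0, fader_db=-6.0, tk_cal_db=2.0)
print(out)  # 6.0
```

Which is exactly why adjusting either pot last "wins" and the procedure as given seems circular to me.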
I know it's a PITA but any help or direction with this is GREATLY appreciated! Thanks fellas!