Hi Ethan,
Okay, to get to the root of this problem, you have to roll back a few years...
Traditionally (going back to the telephone industry) all connections were made between equipment with "600R lines". This was based on the theory of supplying maximum power from device to device: maths shows us that maximum power transfer is obtained when the source impedance equals the load impedance. Broadcasters and studios needed a meter to measure the levels present in these lines. A standard reference level of 0dBm was defined as the power of 1mW in a 600R sourced and terminated circuit. The "m" in dBm refers to that 1mW reference, and the figure only really makes sense in that 600R context.
0dBm is not 0VU on a VU meter. 1mW in 600R works out at 0.775V rms, which is a bit low for the average levels found in these lines. So the standard for the VU meter was arranged so that 0VU = +4dBm, which is 1.23V rms.
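If you want to check that arithmetic yourself, here's a quick Python scribble (just a sanity check of the maths above, nothing more):

```python
import math

# 0dBm reference: 1mW dissipated in a 600 ohm line
P_REF = 0.001    # watts
R_LINE = 600.0   # ohms

# V = sqrt(P * R) -> the voltage that 1mW develops across 600R
v_ref = math.sqrt(P_REF * R_LINE)
print(f"0dBm into 600R = {v_ref:.3f} V rms")    # ~0.775 V

# +4dBm in the same 600R circuit: V = 0.775 * 10^(dB/20)
v_plus4 = v_ref * 10 ** (4 / 20)
print(f"+4dBm (0VU)    = {v_plus4:.3f} V rms")  # ~1.228 V
```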
Okay, fast-forward to the present day. For those of us into "classic" pro audio gear, we still come across "true" 600R inputs and outputs, which is why we still have gear specified as being able to drive at least a 600R load to >+20dBm. But the reality is that modern practice is concerned with maximum voltage transfer, even in balanced systems. This equates to very low source impedances of <100R and high load impedances of >10k differential.
But the "0VU" measurement on VU meters is still required to make sense, so the measurement of "dBu" has come into practice which means across an unloaded (or negligibly loaded compared to 600R!) line. In practice this means that 0dBm = 0dBu. So 0VU on a VU meter is still 1.23V rms whether seeing a >10k load, or a 600R load. There will be a difference of a few dB with some equipment, depending on its source impedance, but for recording music in a studio it becomes much less scientific.
Here we run into another problem. With tape machines, we could set our operating levels to match our console outputs. By setting the input and output levels of the machine, everything ran smoothly. But with modern digital systems, the A/D and D/A converters have an absolute headroom. The nominal operating level can be switched between "+4dBu" and "-10dBV" for input and output, but these interfaces can handle levels of >+20dBu before clipping. To "make the most" of the digital headroom (i.e. use the most bits) people run very high levels into these converters, which can mean VU meter needle-pegging way beyond traditional line levels.
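To put rough numbers on that (the exact calibration varies from converter to converter; the 0dBFS = +24dBu figure below is just an assumed example, check your own interface's spec):

```python
# Assumed example calibration only: many pro interfaces put 0dBFS
# somewhere in the +18 to +24dBu region, but it is not universal.
DBFS_0_IN_DBU = 24.0   # assumed converter clip point (0dBFS) in dBu
NOMINAL_DBU = 4.0      # 0VU sitting at +4dBu

headroom_db = DBFS_0_IN_DBU - NOMINAL_DBU
print(f"Headroom above 0VU: {headroom_db:.0f} dB")   # 20 dB with these numbers

# Where does a hot +18dBu peak land on the digital scale?
peak_dbu = 18.0
print(f"+{peak_dbu:.0f}dBu peak = {peak_dbu - DBFS_0_IN_DBU:.0f} dBFS")  # -6 dBFS
```

So with that assumed calibration, a signal pinning the VU meter well past 0VU can still be sitting comfortably below digital clipping, which is exactly why people push it.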
Also, we like to drive our old gear to maximise output and input transformer saturation, to bring out cool tone. Again, the VU meter can't make sense of this.
Two solutions:
1. Add a pad to your VU meter. If this is switched in sensible ranges, you can shift the level of "0VU". So a -6dB pad between line and meter allows 0VU to sit at +10dBu (there's a rough pad-maths sketch below). The old Altec 1576a had a meter range shift-switch. This allowed 0VU to be +4dBm, +8dBm or +12dBm (IIRC). Also it had a "meter off" switch if you go really over-range :green: I know Steve Albini has a "VU-kill" switch added to his 350's to stop "needle chatter" when you send a huge level out of those beasts.
2. Adjust the meter drive circuit (assuming an active VU meter) to give you an operating level of 0VU that suits your recording gear and setup. Obviously this is no use if you're running a big commercial facility, where visiting engineers need to know exactly what levels they're dealing with. But most project and home studios can have an operating level "all of their own" to suit your recording technique and recording front end.
I know NYDave posted a neat VU meter pad somewhere. I use various gained-up and padded-down VU meters for my own testing, and this can be useful when you face different unknown level situations.
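For the back-of-envelope maths on a pad, here's a quick Python sketch. It treats the pad as a plain two-resistor divider and ignores the loading of the meter movement itself (a proper VU meter pad has to account for that, so use NYDave's design for the real thing):

```python
import math

# Attenuation of a simple series/shunt resistive divider, in dB.
# Ignores the meter's own impedance, so back-of-envelope only.
def divider_db(r_series, r_shunt):
    ratio = r_shunt / (r_series + r_shunt)
    return 20 * math.log10(ratio)

# Example: equal resistors give roughly -6dB,
# shifting 0VU from +4dBu to about +10dBu.
pad_db = divider_db(10_000, 10_000)   # ~ -6.02 dB
new_0vu = 4.0 - pad_db                # subtracting a negative dB figure raises the level
print(f"Pad: {pad_db:.1f} dB, 0VU now reads at about {new_0vu:+.1f} dBu")
```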
Hope this is useful!
:thumb:
Mark