How do you calibrate a VU meter?

GroupDIY Audio Forum


Ethan (Administrator; joined Jun 3, 2004; 1,602 messages; DC)
This might seem like a funny question, but here goes...

For example: if I have a VU meter at the output of a mic pre, would I calibrate it to read 0VU when the output is 0.775Vrms (0dBu), even if I had, say, 20dB of headroom and the meter was pegging full red at +6? Or would one move the 0VU point so the meter would be more representative of how much headroom remains until clipping?
Thanks.
 
Hi Ethan,

Okay, to get to the root of this problem, you have to roll back a few years...

Traditionally (going back to the telephone industry) all connections between equipment were made with "600R lines". This was based on the theory of supplying maximum power from device to device: the maths shows that maximum power transfer is obtained when the source impedance equals the load impedance. Broadcasters and studios needed a meter to measure the levels present in these lines. A standard reference level of 0dBm was defined as the power of 1mW in a 600R sourced and terminated circuit; the "m" suffix in dBm refers to this 1mW reference.

0dBm is not 0VU on a VU meter. 1mW in 600R works out as 0.775V, which is a bit low for the average levels found in these lines. So the standard for the VU meter was arranged so that 0VU = +4dBm, which is 1.23V rms.

Okay, fast-forward to the present day. Those of us into "classic" pro audio gear still come across "true" 600R inputs and outputs, which is why we still have gear specified as being able to drive a 600R load to at least +20dBm. But the reality is that modern practice is concerned with maximum voltage transfer, even in balanced systems. This equates to very low source impedances of <100R and high load impedances of >10k differential.

But the "0VU" point on VU meters is still required to make sense, so the unit "dBu" has come into practice: the same 0.775V reference, but measured across an unloaded (or negligibly loaded compared to 600R!) line. In practice this means that 0dBm and 0dBu correspond to the same voltage, so 0VU on a VU meter is still 1.23V rms whether the output sees a >10k load or a 600R load. There will be a difference of a few dB with some equipment, depending on its source impedance, but for recording music in a studio it becomes much less scientific.
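The reference levels above reduce to simple arithmetic. A minimal sketch in Python (the helper names are mine, not from any standard library):

```python
import math

REF_DBU_VOLTS = 0.775  # 0 dBu reference: 1 mW into 600 ohms = sqrt(0.001 * 600) V

def dbu_to_volts(dbu):
    """Convert a level in dBu to RMS volts (unloaded line)."""
    return REF_DBU_VOLTS * 10 ** (dbu / 20)

def volts_to_dbu(v_rms):
    """Convert RMS volts to dBu."""
    return 20 * math.log10(v_rms / REF_DBU_VOLTS)

# 0 dBm into 600R: P = V^2 / R, so V = sqrt(0.001 * 600)
print(round(math.sqrt(0.001 * 600), 3))   # 0.775
# Standard VU calibration: 0VU = +4 dBu
print(round(dbu_to_volts(4.0), 3))        # 1.228
```

The same two functions cover dBm as well, since on a 600R terminated line the dBm and dBu figures coincide.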

Here we run into another problem. With tape machines, we could set our operating levels to match our console outputs. By setting the input and output levels of the machine, everything ran smoothly. But with modern digital systems, the A/D and D/A converters have an absolute maximum level. Operating level can often be switched between "+4dBu" and "-10dBV" at the inputs and outputs, but the interfaces themselves can handle levels of >+20dBu. To "make the most" of the digital headroom (i.e. use the most bits) people run very high levels into these converters, which can mean VU meter needles pegging way beyond traditional line levels.
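The mismatch between 0VU and converter full scale is just a subtraction. As an illustration only (the +20dBu ceiling here is a hypothetical spec, not a universal one):

```python
def vu_zero_in_dbfs(converter_max_dbu, vu_zero_dbu=4.0):
    """Where the 0VU calibration point lands relative to digital full scale."""
    return vu_zero_dbu - converter_max_dbu

# Hypothetical converter that clips at +20 dBu, standard 0VU = +4 dBu:
print(vu_zero_in_dbfs(20.0))  # -16.0 dBFS
```

So a signal sitting politely at 0VU leaves 16dB of digital headroom unused, which is exactly why people are tempted to pin the needle.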

Also, we like to drive our old gear to maximise output and input transformer saturation, to bring out cool tone. Again, the VU meter can't make sense of this.

Two solutions:

1. Add a pad to your VU meter. If this is switched in sensible ranges, you can shift the level of "0VU". So a -6dB pad between line and meter allows 0VU to be +10dBu. The old Altec 1576a had a meter range-shift switch; this allowed 0VU to be +4dBm, +8dBm or +12dBm (IIRC). It also had a "meter off" switch if you go really over-range :green: I know Steve Albini has a "VU-kill" switch added to his 350s to stop "needle chatter" when you send a huge level out of those beasts.

2. Adjust the meter drive circuit (assuming an active VU meter) to give you an operating level of 0VU that suits your recording gear and setup. Obviously this is no use if you're running a big commercial facility, where visiting engineers need to know exactly what levels they're dealing with. But most project and home studios can have an operating level "all of their own" to suit your recording technique and recording front end.

I know NYDave posted a neat VU meter pad somewhere. I use various gained-up and padded-down VU meters for my own testing, and this can be useful when you face different unknown level situations.

Hope this is useful!

:thumb:

Mark
 
Ethan,

A standard VU meter is calibrated to read 0VU when fed a level of +4dBu (1.228VRMS) through a series resistor of about 3600 ohms. The nominal impedance of the meter itself is 3900 ohms, so together with the series resistor the total load is 7500 ohms (nominal). This is high enough to be considered a "bridging" load under the old 600-ohm way of thinking.
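Those numbers are easy to verify, including the (small) loading effect that makes 7500 ohms count as "bridging". A sketch, assuming the nominal values quoted above:

```python
import math

SERIES_R = 3600.0   # ohms, external series resistor (nominal)
METER_R = 3900.0    # ohms, nominal VU meter impedance
bridge = SERIES_R + METER_R
print(bridge)  # 7500.0

# Loading effect of bridging a 600R-sourced, 600R-terminated line:
src, term = 600.0, 600.0
loaded = term * bridge / (term + bridge)   # termination in parallel with the meter
before = term / (src + term)               # voltage divider without the meter
after = loaded / (src + loaded)            # voltage divider with the meter
loss_db = 20 * math.log10(after / before)
print(round(loss_db, 2))  # about -0.34 dB
```

A level error of a third of a dB was considered negligible, which is why the meter could simply be strapped across the line.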

The VU meter will indicate 0VU at 0.775VRMS with no series resistor, but it should not be operated this way, because (depending on the output impedance of the source being monitored) the rectifiers in the meter can distort the signal.

The standard way of wiring up a VU meter in the old days, when meters were bridged right across the lines rather than being buffered, was to put a switchable bridged-T attenuator (called a "range extender") between the 3600-ohm series resistor and the meter. The first (no-loss) position on the attenuator would indicate 0VU = +4dBu, and the ranges would go up from there. And yes, the reason for this is so that you could monitor the upper part of your dynamic range. It's all just an approximation anyway, since the VU meter is an average-responding instrument and won't capture peaks. But its response does happen to correlate nicely with how we perceive sound, which is why it continues to be useful after more than 60 years.
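For reference, a symmetric bridged-T pad matched to impedance Z0 uses two series arms of Z0, a bridging resistor of Z0*(N-1) and a shunt of Z0/(N-1), where N = 10^(dB/20) is the voltage ratio. A sketch (illustrative values only; a real range extender would use standardized steps):

```python
def bridged_t_pad(z0, atten_db):
    """Resistor values for a symmetric bridged-T attenuator matched to z0."""
    n = 10 ** (atten_db / 20)       # voltage ratio
    return {
        "series_each": z0,           # the two arms of the T
        "bridge": z0 * (n - 1),      # resistor bridging across the two arms
        "shunt": z0 / (n - 1),       # resistor from the T junction to ground
    }

# A 6 dB step on a 600R line:
vals = bridged_t_pad(600.0, 6.0)
print({k: round(v, 1) for k, v in vals.items()})
```

The nice property of the bridged-T is that the line still sees Z0 in both directions at every attenuation setting, so switching ranges never changes the loading.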

More info in this thread.
 
...bear in mind as well that a mechanical VU meter has quite a slow response time due to mechanical inertia. Compare your VU meter reading to the peak-reading meters in your DAW, especially with a drum kit: the peak of a snare hit, which can push a peak meter to near max, will be missed entirely by a VU meter! This mattered much less in the Analogue Days, because tape was a bit more relaxed about this sort of level situation. A/D's are not so forgiving! So a lot of this discussion becomes level-dependent; you soon learn your system and how to interpret the VU meter's result. In Europe, the PPM was used a lot more than the VU because it allowed broadcasters to see the maximum modulation level they could send to the transmitter. Somewhere there's a really good comparison chart of VU/PPM readings at various operating levels. I'll see if I can find it.
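The slow-meter effect is easy to demonstrate numerically: a short burst that slams a peak meter barely registers on a 300ms averaging window (300ms being the classic VU integration time; the burst itself is made up purely for illustration):

```python
import math

FS = 48000                       # sample rate, Hz
WINDOW = int(0.3 * FS)           # ~300 ms, the classic VU integration time

# A made-up 10 ms decaying 1 kHz burst in an otherwise silent 300 ms window:
signal = [0.0] * WINDOW
for i in range(int(0.01 * FS)):
    t = i / FS
    signal[i] = math.exp(-t / 0.003) * math.sin(2 * math.pi * 1000 * t)

peak = max(abs(s) for s in signal)
avg = sum(abs(s) for s in signal) / len(signal)   # crude average-responding meter

print(round(20 * math.log10(peak), 1))   # near 0 dB: a peak meter slams up
print(round(20 * math.log10(avg), 1))    # tens of dB lower: the VU barely twitches
```

The tens-of-dB gap between the two readings is exactly the snare-hit situation described above.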

Mark
 
I carry a VU meter with a 3k9 resistor and a variety of connectors (1/4", XLR, 1/4" GPO, Bantam, and combining/splitting cables so I can parallel the output of someone's mic pre) in my bag. When I go to someone's house to record a vocal or whatever, I can show them why they don't need to run the input of their computer into the red! I am old school, and grew up using tape machines. I believe in recording most things so that when you bring a tape up on the board, all the faders are set at about 0dB: best fader resolution with automation, and reasonable signal-to-noise. Using a VU with most things (obviously not a hi-hat!), even a cretin cannot overload their computer and, most importantly, cannot over-drive their preamp (the ultimate taboo for me, unless it is for a deliberate effect, of course). Cheap mics and mic pres are best run at conservative levels, and this is most easily demonstrated with a VU meter. Once you understand what is really going on, THEN you can try to bend the meter pins . . .

Andy P

ps - I am tired of bouncing clipped, nasty, sibilant vocals so that they can interface with my analogue world!!!!!
 
> If I have a VU meter at the output of a mic pre would I have the VU meter read an absolute 0dBu when the output is .775Vrms, even if I had say 20dBu of 'headroom' and the meter was pegging full red at +6? Or would one move the 0dBu point so the meter would be more representative of how much headroom until clipping.

What Mark said. The second time.

A VU meter is a slow instrument. The basic idea is that you set it to read "zero" about 10dB or 20dB below clipping level, because it will NOT read those fast transients. The "exact" difference between real peaks and the reading on a slow meter is statistical: you get a lot of peaks 6dB over, and very, very few peaks 25dB over.

In AM radio, keeping signal well above noise was critical, so we used only about 10dB of "lead". With a 100 Watt transmitter, the VU meter would show "zero" at about the 10 Watt level. When running with the VU meter just touching "zero", there was about 1% peak-clipping. This is not very obvious, certainly in an AM radio situation. Note that through most of the studio, headroom was more like 20dB (0VU = +8dBm on +28dBm amplifiers); only the meter on that limited-power transmitter would be worked with only 10dB of lead.

If you are not stuck in the 40dB S/N world of AM reception, and especially if you run through several stages of hard-clip amplifiers and listen over and over, you want more than 10dB lead. 14dB to 20dB are accepted figures. Depends a lot on how hungry you are for S/N, how allergic you are to clipping.

Also, unless you have a specific reason to do otherwise, use Standard Level. In all modern work this is "+4dBu", i.e. 0VU = 1.23VRMS. If your amplifier can make +20dBu (7.75VRMS) on peaks, this gives a nice 16dB lead. A speech/music signal that bops the slow VU meter to 0VU will clip maybe once a day. If you mostly work around -6VU then you will get a clip about once a year (remember, statistical bell-curve: 22dB of lead on a VU meter is way out on the tail of the bell).
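PRR's "statistical bell-curve" point can be sketched with a crude Gaussian model. Real programme material has different statistics, so treat the absolute probabilities as illustrative only; what matters is how steeply the clipping rate falls as lead increases:

```python
import math

def clip_probability(lead_db):
    """P(|sample| exceeds the clip point) for a Gaussian signal whose RMS sits
    at 0VU, with lead_db of headroom above 0VU. A crude model only: real
    programme material is not Gaussian, but the tail behaviour is similar."""
    k = 10 ** (lead_db / 20)              # clip point in units of the RMS level
    return math.erfc(k / math.sqrt(2))    # two-sided Gaussian tail probability

for lead in (10, 16, 22):
    print(lead, clip_probability(lead))
```

With 10dB of lead the model clips a small but audible fraction of samples, consistent with the "about 1%" AM-transmitter figure above; at 22dB the probability is vanishingly small, which is the "once a year" regime.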

If you use one of the older mike-amps rated just +18dBm and not too clean at that, and need as clean as possible, you want to never see the needle touch 0VU. If it is bopping above -10VU, you have good signal, and you need to get more gain further down the line if you need more level.
 
