Simpson VU Ballistics Variation

GroupDIY Audio Forum

Actually, one more thing - attached is a photo of the previous VU buffer circuit and supply. Can you all tell from this what it is? I mean, is the simple answer a PPM “VU” buffer circuit?

Thanks again!
Mark
 

Attachments

  • 0A141F4F-7AF1-4190-B64F-439E0C9BB5B9.jpeg
    132.6 KB
When did the switch from a DC to an AC meter occur, then - when it switched to logarithmic behavior?
No. Although there are AC meter movements (moving-iron type), a VU meter uses a moving-coil direct-current movement and a full-wave rectifier. I use the term 'DC' for those movements that are specifically designed for DC.

The meter's movement is linear, not logarithmic (it moves in direct proportion to the current); the scale is logarithmic, which is why the numbers are bigger and closer together at the bottom.
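To put the linear-movement / logarithmic-scale point in numbers: taking the 0 VU mark as the reference deflection, each scale marking sits at a deflection fraction of 10^(VU/20). A minimal sketch (the reference point is assumed; real scales also run up to +3):

```python
def deflection_fraction(vu: float) -> float:
    """Needle position, as a fraction of the 0 VU deflection, for a scale mark."""
    return 10 ** (vu / 20)

# Lower markings crowd together because equal dB steps shrink geometrically:
for mark in (0, -3, -7, -10, -20):
    print(f"{mark:+d} VU -> {deflection_fraction(mark):.2f} of the 0 VU deflection")
```

-3 VU lands at about 71% of the 0 VU deflection, while -20 VU is down at 10%, which is exactly why the low numbers bunch up at the left of the scale.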
 
EDIT: Meant to say thank you for the previous messages DaftFader! Good to know.

Ok folks, one more series of questions:

I’m trying to calibrate these. Given that I’m running hot and ground from an XLR cable, when I run +4 dBu (1.228 V), it reads low (half signal due to unbalancing). Is this to be expected, and should I thus just calibrate to my halved cal signal? Or am I supposed to connect these balanced somehow?

And on that note, strangely, when I run that 1.228 V level out of my DAW versus out of my NTI Minirator MR2, the MR2 reads even lower than the DAW output… is this an impedance issue? I believe the output impedance of the MR2 is 200 ohms, where perhaps the DAW’s is lower?
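For anyone checking these figures: dBu converts to volts as V = 0.7746 × 10^(dBu/20), so the numbers above are easy to verify. A quick sketch:

```python
import math

def dbu_to_volts(dbu: float) -> float:
    """Level in dBu to RMS volts (0 dBu = 0.7746 V, independent of load)."""
    return 0.774597 * 10 ** (dbu / 20)

def volts_to_dbu(volts: float) -> float:
    """RMS volts back to dBu."""
    return 20 * math.log10(volts / 0.774597)

print(round(dbu_to_volts(4), 3))      # +4 dBu is about 1.228 V
print(round(volts_to_dbu(0.614), 1))  # half of 1.228 V is about -2 dBu
```

A halved signal is a 6 dB drop, so an unbalanced tap reading "half" would land around -2 dBu rather than +4 dBu.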

Thanks all for your help again!
Mark
 
To add to that and cut to the chase: I want 0 VU to be calibratable, so I can make -7 dBm (I think m, maybe u?) read 0 VU. Would I use a potentiometer? Any specifics would be really appreciated.
There's a rectifier in a VU-meter; because of this, signals below a certain level are completely ignored. Even removing the series resistance may not allow full deviation with only -7 dBu.
Tampering with the series resistor results in wrong ballistics.
You must add an electronic gain card with 11dB of gain.
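The 11 dB figure follows directly from the standard 0 VU = +4 dBu operating level; a trivial check of that arithmetic:

```python
# 0 VU on a standard VU meter corresponds to +4 dBu.
# To make a -7 dBu signal read 0 VU, a buffer must make up the difference.
reference_dbu = 4.0   # standard 0 VU operating level
desired_dbu = -7.0    # level that should read 0 VU instead
gain_db = reference_dbu - desired_dbu
print(gain_db)  # 11.0
```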
 
Thank you for this, abbey road d enfer. Understood, and I'll work on figuring that out next.

Just for the sake of clarity: if I'm using this largely to get a feel for overall loudness in a meaningful way, will the effect that changing the resistance has on ballistics be a significant problem? Are you saying it may jump from not registering signal at all to registering it in an unusable way? Apologies for my lack of knowledge.

Best,
MG
 
Jay McKnight was one of the brilliant engineers at Ampex, then founded the MRL company making standard alignment test tapes. He is very smart about standards in audio and wrote the articles I attached. I also have many of the early references that Jay cites in his paper.

Bottom lines in year 2022.....Weston no longer makes VU meters. Simpson, Hoyt...maybe? Sifam....sorta. A "VU" designation on the scale means nothing.

BUT...if you have a "real" olde school meter and are driving it from the output of an opamp buffer, then 3k9 Ohms is the "correct" series resistance to "ensure" correct ballistics.

Good luck....<g>.

Bri
 

Attachments

  • mcknight_q&a-on-the-svi-6.pdf
    45.5 KB
  • McKnight_svimeasurement.pdf
    48.4 KB
Given that I’m running hot and ground from an XLR cable, when I run +4 dBu (1.228 V), it reads low (half signal due to unbalancing). Is this to be expected, and should I thus just calibrate to my halved cal signal? Or am I supposed to connect these balanced somehow?
Never assume that the level between one leg and ground is exactly half of the full balanced voltage.
In particular, a transformer output will deliver very low signal in these conditions. That's a characteristic of a floating output.
Many xfmr-less outputs do not deliver symmetrical voltages on the hot and cold legs.
The only ones that do have a very low common-mode impedance, which results in poor Common-Mode Rejection Ratio, which is not what you want for clean operation.
The more sophisticated balanced line drivers (THAT1046/1646, SSM2142...) are much better in that respect but still may develop unequal voltages on the hot and cold legs.
The VU-meter should be connected across the full signal, between hot and cold.
Be aware that unbuffered VU-meters induce distortion (because of the rectifiers inside). The higher the source impedance, the higher the distortion.
The standard mentions 7.5 kohm for the meter+resistor, applied to a <600 ohm line, which typically resulted in <0.1% THD. That was considered excellent at the time, particularly in comparison with tape distortion, which was often pushed higher than the 3% target defined by standard tape calibration.
https://groupdiy.com/threads/vu-meter-add-distortion.2763/
https://gearspace.com/board/showpost.php?p=12421160&postcount=11
Remember the latter case was with Neve output stages with an impedance of about 40 ohms IIRC. It would be about 10x higher with a 600 ohm source.
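That 7.5 kohm total is also where the usual 3k6 series figure comes from: the standard VU instrument itself presents roughly 3.9 k, leaving about 3.6 k for the external resistor. Sketching the arithmetic (the 3.9 k movement impedance is the book value, not a measurement of any particular meter):

```python
# Per the VU standard, meter plus attenuator should total 7.5 kohm
# bridged across the line.
TOTAL_OHMS = 7500   # meter + series resistor, per the standard
meter_ohms = 3900   # nominal impedance of a classic VU instrument (assumed)
series_ohms = TOTAL_OHMS - meter_ohms
print(series_ohms)  # 3600
```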
 
Excellent info and heard - thank you again, abbey.

So just to confirm, the series 3k6 resistor (Brian Roth said 3k9 but you said 3k6 was the standard, yes?) can be in series on either hot or cold leg? And then ground is not connected? Just to let you know I'm paying attention, would it be more/most accurate to measure the resistance of the meter and then add the necessary resistor to achieve 7k5 ohms as you mention as opposed to this idea of some value being fixed as correct?

Lastly, understood regarding distortion. This is a dedicated output not running on any program lines, so I'm good to go in that regard.

Thanks!
Mark
 
So just to confirm, the series 3k6 resistor (Brian Roth said 3k9 but you said 3k6 was the standard, yes?)
Actually, 3.6k is for a 600r line loaded with 600r (matching), resulting in 300r effective. For a 600r line loaded with 10k (bridging), the resistor recommended by the standard would be 3.6k. Used with a modern solid-state output, with about 50-100r source impedance, the resistor should be 4.1k. The difference is negligible, though.
can be in series on either hot or cold leg?
Yes.
And then ground is not connected?
Correct.
Just to let you know I'm paying attention, would it be more/most accurate to measure the resistance of the meter and then add the necessary resistor to achieve 7k5 ohms as you mention as opposed to this idea of some value being fixed as correct?
That would be OK if you measured the actual impedance under nominal conditions, i.e. with about 1.2 V RMS applied. With a standard multimeter, the rectifier would falsify the measurement.
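One way to read the matching-vs-bridging distinction above is that the movement's ballistics depend on the total resistance around its loop, so a lower effective source impedance calls for a slightly larger series resistor. A rough sketch; the 3.9 k movement and the loop total are illustrative assumptions and won't reproduce the exact 4.1 k quoted above:

```python
def effective_source_ohms(line_ohms: float, load_ohms: float) -> float:
    """Impedance the meter branch sees: the line and its load in parallel."""
    return line_ohms * load_ohms / (line_ohms + load_ohms)

def series_resistor(loop_total_ohms: float, meter_ohms: float,
                    source_ohms: float) -> float:
    """Pick the series resistor that keeps the meter's loop resistance constant."""
    return loop_total_ohms - meter_ohms - source_ohms

# A 600 ohm line terminated in 600 ohm (matched) looks like 300 ohm:
matched = effective_source_ohms(600, 600)
print(matched)  # 300.0

# Keeping the same loop total with a near-zero modern output means the
# series resistor grows by roughly that 300 ohm (values illustrative):
r_matched = series_resistor(7800, 3900, matched)  # -> 3600.0
r_low_z = series_resistor(7800, 3900, 75)         # -> 3825.0
print(r_matched, r_low_z)
```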
 
Huh? The standard is 3.6 kohms, and it should not change whether you have a high-Z or low-Z output. If you change the resistor, you change the calibration of the meter.
 
It's true. However, that's how the standard works.
Thanks again to both of you. I’m wondering if 3.9k (which I have in my shop, though not 3.6k yet) would be a good middle ground between the 3.6k standard and the higher value suggested by a modern low-Z output.

I may just try, but if it seems like bad practice, I can wait for a 3.6k in a few days. Thoughts?
 
I may just try, but if it seems like bad practice, I can wait for a 3.6k in a few days. Thoughts?
According to standards, 3k9 would be more correct than 3k6 for ballistics, in the context of low-Z lines.
However, since meters are not adjustable (their sensitivity is factory-set), the only correct value is that which gives the correct deviation with the reference voltage.
That's the reason why trimmers are often used here.
The ideal system is one where the resistor is adjusted for ballistics and the sensitivity is adjusted with a variable-gain buffer.
Considering that VU-meters are actually very limited in terms of accuracy, except for calibration purposes, I wouldn't sweat it too much; I'd use the resistor that gives the correct indication at the reference level.
 
Thanks again, abbey. I'm now running into something that's really confusing me.

I wired each meter for hot and cold, removed the ground and the potentiometer on the hot leg (set to 3k6), and wired in a 3k9 resistor on only the hot leg. Maybe that's where I'm messing up and it should be a resistor on each leg?

The strange thing is when I run my reference signal and try to calibrate the output of my DAW so that +4dbu/0vu is actually +7dbu/0vu, 1k completely smashes the meter to clipping. So then I lower the output of the DAW something like 12 db to get it to 0 VU. But then playing music through that same calibration level, the meters are registering very low. I hope that makes sense but to be clear:

-9dB FS @1k out of my DAW, checked with testing equipment = +7 dBu at the meters input XLR
The meter then is pegged audibly.
Lower the output of my DAW so that the confirmed +7 dBu signal is registering 0 VU.
Master level program is only at the very lowest level of the meter.
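The arithmetic behind those steps can be sanity-checked: if -9 dBFS measures +7 dBu, this interface's full-scale reference works out to +16 dBu, and every other dBFS level follows by a fixed offset (the +16 dBu figure is only what those two readings imply here, not a universal standard):

```python
def dbfs_to_dbu(dbfs: float, full_scale_dbu: float) -> float:
    """Map a DAW level in dBFS to the analogue level in dBu at the output."""
    return full_scale_dbu + dbfs

# -9 dBFS measuring +7 dBu implies 0 dBFS = +16 dBu at this interface:
full_scale = 7.0 - (-9.0)
print(full_scale)                    # 16.0
print(dbfs_to_dbu(-9, full_scale))   # 7.0
print(dbfs_to_dbu(-12, full_scale))  # 4.0, the usual 0 VU operating level
```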

In the previous iteration, at least things were making some relative sense between calibration. Could you let me know thoughts? Thanks for the patience as I work through this - all really valuable lessons I won't forget.

Best,
MG
 
I wired each meter for hot and cold, removed the ground and the potentiometer on the hot leg (set to 3k6), and wired in a 3k9 resistor on only the hot leg. Maybe that's where I'm messing up and it should be a resistor on each leg?
Definitely no. Only one is needed.
The strange thing is when I run my reference signal and try to calibrate the output of my DAW so that +4dbu/0vu is actually +7dbu/0vu,
I don't get it. It's either +4 or +7, but not both. Are you referring to different meters?
1k completely smashes the meter to clipping.
Actually +7dBu should result in full deviation, but not fully pegging.
So then I lower the output of the DAW something like 12 db to get it to 0 VU. But then playing music through that same calibration level, the meters are registering very low. I hope that makes sense but to be clear:

-9dB FS @1k out of my DAW, checked with testing equipment = +7 dBu at the meters input XLR
The meter then is pegged audibly.
Audibly? What do you hear?
Lower the output of my DAW so that the confirmed +7 dBu signal is registering 0 VU.
How would you do that? +7dBu should put the VU-meter at the +3 mark.
 
Definitely no. Only one is needed.
Ok great, thank you.

I had a lengthy reply written and unfortunately/fortunately found the error of my ways as I was finishing.

To cut to the chase, I was being a bonehead - and my apologies. What had happened was that my test equipment was monitoring two outputs summed together, so I was getting double signal only when calibrating to 1k, and then a much lower signal when passing program material.

Thank you sincerely for your help - all across this awesome board - and I think I may be good to go here though I don't want to jinx it!

Best,
MG
 
Hello abbey and all,

I'm sad to say it, but the saga continues! I have been running these off a 3.9k resistor with what seemed like minimal differences presenting during calibration, which I adjusted with the meter zero on the front. That held until today: I received some nice 3.6k resistors and installed them, thinking they would exactly match the spec. I'm not sure what changed so drastically, but somehow the two meters now read very differently, and I can't even get them into range with the front trims.

For some reason, they're about 3 dB VU off from each other.

So I checked the impedance of the VU meters themselves. One was off by 890 ohms unloaded/disconnected, so I added 890 ohms in series with its 3.6k, confirmed the measurement, and tried again - and still the same difference presents. The higher-reading meter measures about 12.6k across, and the lower-reading meter about 11.7k.
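For what it's worth, that impedance spread alone shouldn't produce a 3 dB gap. With the same drive voltage across each branch, the reading tracks current, which scales as 1/Z; note too that the sign runs backwards here, since the higher-impedance branch is the one reading higher. A quick check on the figures above:

```python
import math

# Same drive voltage across each meter branch: the level difference
# attributable to the impedance mismatch alone is 20*log10(Z_a/Z_b).
z_a = 12_600  # measured across one meter branch (ohms)
z_b = 11_700  # measured across the other (ohms)

diff_db = 20 * math.log10(z_a / z_b)
print(round(diff_db, 2))  # 0.64 dB: nowhere near the observed 3 dB
```

Under 1 dB from the resistance difference, and in the wrong direction, which points at something inside the meter (e.g. the rectifier) rather than the series resistors.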

I'm wondering, first of all, what the f is going on. Secondly, could I have damaged that right meter when I accidentally hit it with "double" (really loud) signal during that last test? As I mentioned, the needle pinned so hard that it made noise, and only on the right meter, which is the one reading the lower impedance across.

Could I/should I potentially take it apart carefully and see if I damaged the rectifier?

Any help would be really appreciated, as at this point I'm extremely frustrated. I'm wondering if I should just begin to backtrack to how I had it previously set up and see if I can get it to work. When I had the one half of the meter connected to ground, they were perfectly matched from the get go.

Thanks very much,
MG
 