No, although there are AC meter movements (moving-iron type), a VU meter uses a moving-coil direct-current movement and a full-wave rectifier. I use the term 'DC' for those movements that are specifically designed for DC.

When did the switch from DC to AC meters occur, then - when it switched to logarithmic behavior?
That's what the standard calls for. Not only for correct scale, but also for correct damping (read: ballistics).

For some reason I used a 3k6 (may have been specified in the Sifam data? - can't remember).
There's a rectifier in a VU-meter; because of this, signals below a certain level are completely ignored. Even removing the series resistance may not allow full deviation with only -7 dBu.

To add to that and cut to the chase, I'm wanting 0 VU to be calibratable, so I can make -7 dBm (I think m, maybe u?) read 0 VU. Would I use a potentiometer? Any specifics would be really appreciated.
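For reference, the level math behind these figures is simple; here's a minimal Python sketch (0 dBu is 0.7746 V RMS; dBm gives the same numbers only when the load is 600 ohms):

```python
import math

DBU_REF = 0.7746  # volts RMS at 0 dBu (the voltage of 1 mW into 600 ohms)

def dbu_to_vrms(dbu: float) -> float:
    """Convert a level in dBu to RMS volts."""
    return DBU_REF * 10 ** (dbu / 20)

def vrms_to_dbu(vrms: float) -> float:
    """Convert RMS volts to a level in dBu."""
    return 20 * math.log10(vrms / DBU_REF)
```

So +4 dBu is about 1.228 V RMS, and -7 dBu is about 0.346 V RMS.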
Thank you for this, abbey road d enfer. Understood, and I'll work on figuring that out next.
Tampering with the series resistor results in wrong ballistics.
You must add an electronic gain card with 11dB of gain.
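The 11 dB figure is just the gap between the -7 dBu source level and the standard 0 VU reference of +4 dBu; a quick Python check:

```python
# 0 VU on a standard VU meter corresponds to +4 dBu; the source here peaks
# at -7 dBu, so the make-up gain needed is the difference.
gain_db = 4 - (-7)                  # 11 dB
gain_ratio = 10 ** (gain_db / 20)   # ~3.55x voltage gain
```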
That's what I mean. And the ballistics may become crazy.

Are you saying it may jump from not registering signal at all to registering signal in an unusable way? Apologies for my lack of knowledge.
Never assume that the level between one leg and ground is exactly half of the full balanced voltage.

Given that I'm running hot and ground from an XLR cable, when I run +4 dBu (1.228 V), it reads low (half signal due to unbalancing). Is this to be expected, and thus I would just calibrate to my halved cal signal? Or am I supposed to connect these balanced somehow?
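A minimal sketch of the arithmetic behind that "half signal" reading, assuming (optimistically) perfectly symmetrical legs:

```python
import math

# Half the voltage is about -6 dB, so a +4 dBu balanced signal measured from
# one leg to ground would read roughly -2 dBu -- and that's the best case,
# since the two legs are often not symmetrical at all.
drop_db = 20 * math.log10(0.5)   # ~ -6.02 dB for half the voltage
single_leg_dbu = 4 + drop_db     # ~ -2.02 dBu
```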
Excellent info and heard - thank you again, abbey.
In particular, a transformer output will result in a very low signal in these conditions. That's the characteristic of a floating output.
Many xfmr-less outputs do not deliver symmetrical voltages on the hot and cold legs.
The only ones that do have a very low common-mode impedance, which results in poor Common Mode Rejection Ratio - not what you want for clean operation.
The more sophisticated balanced line drivers (THAT1046/1646, SSM2142...) are much better in that respect but still may develop unequal voltages on the hot and cold legs.
The VU-meter should be connected across the full signal, between hot and cold.
Be aware that unbuffered VU-meters induce distortion (because of the rectifiers inside). The higher the source impedance, the higher the distortion.
The standard specifies 7.5 kohm for the meter + resistor, applied to a <600 ohm line, which typically resulted in <0.1% THD. That was considered excellent at the time, particularly in comparison with tape distortion, which was often pushed higher than the 3% target defined by standard tape calibration.
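The familiar 3k6 falls out of that 7.5 kohm figure once you subtract the movement's own impedance; a sketch, assuming a typical 3.9 kohm VU movement (check your meter's datasheet - this value varies):

```python
TOTAL_LOAD = 7500  # ohms the meter chain should present to the line (VU standard)
METER_Z = 3900     # ohms, assumed internal impedance of the movement
series_r = TOTAL_LOAD - METER_Z   # 3600 ohms -> the familiar 3k6
```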
https://groupdiy.com/threads/vu-meter-add-distortion.2763/
https://gearspace.com/board/showpost.php?p=12421160&postcount=11

Remember the latter case was with Neve output stages with an impedance of about 40 ohms IIRC. Would be about 10x higher with a 600 ohm source.
Actually 3.6k is for a 600r line loaded with 600r (matching), resulting in 300r effective. For a 600r line loaded with 10k (bridging), the resistor recommended by the standard would be 3.6k. Used with a modern solid-state output, with about 50-100r source impedance, the resistor should be 4.1k. The difference is negligible, though.

So just to confirm, the series 3k6 resistor (Brian Roth said 3k9, but you said 3k6 was the standard, yes?)
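The "effective" source impedances mentioned above are just the source and load resistances in parallel, as seen from the meter; a quick Python check:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistances in parallel."""
    return r1 * r2 / (r1 + r2)

matched = parallel(600, 600)      # 300 ohms (600r line terminated in 600r)
bridging = parallel(600, 10_000)  # ~566 ohms (600r line bridged by 10k)
```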
Yes.

Can it be in series on either hot or cold leg?
Correct.

And then ground is not connected?
That would be OK if you measured the actual impedance under nominal conditions, i.e. with about 1.2 V rms. With a standard multimeter, the rectifier would falsify the measurement.

Just to let you know I'm paying attention: would it be more/most accurate to measure the resistance of the meter and then add the necessary resistor to reach the 7k5 ohms you mention, as opposed to some fixed value being correct?
It's true. However, that's how the standard works.

Huh? The standard is 3.6 kohms, and should not change whether you have a high-Z or low-Z output. If you change the resistor, you change the calibration of the meter.
Thanks again to both of you. I'm wondering if 3.9k (which I have in my shop, but not 3.6k yet) would be a good middle ground between the 3.6k standard and the modern low-Z output.
According to standards, 3k9 would be more correct than 3k6 for ballistics, in the context of low-Z lines.

I may just try, but if it seems like bad practice, I can wait for a 3.6k in a few days. Thoughts?
Thanks again, abbey. I'm now running into something that's really confusing me.
However, since meters are not adjustable (their sensitivity is factory-set), the only correct value is that which gives the correct deviation with the reference voltage.
That's the reason why trimmers are often used here.
The ideal system is one where the resistor is adjusted for ballistics and sensitivity is adjusted with a variable-gain buffer.
Considering that VU-meters are actually very limited in terms of accuracy except for calibration purposes, I wouldn't sweat it too much; just use the resistor that gives the correct indication for the reference level.
Definitely no. Only one is needed.

I wired each meter for hot and cold, removed the ground and the potentiometer on the hot leg (set to 3k6), and wired in a 3k9 resistor on only the hot leg. Maybe that's where I'm messing up and it should be a resistor on each leg?
I don't get it. It's either +4 or +7, but not both. Are you referring to different meters?

The strange thing is when I run my reference signal and try to calibrate the output of my DAW so that +4 dBu/0 VU is actually +7 dBu/0 VU,
Actually +7 dBu should result in full deviation, but not fully pegging.

a 1 kHz tone completely smashes the meter to clipping.
Audibly? What do you hear?

So then I lower the output of the DAW something like 12 dB to get it to 0 VU. But then, playing music through that same calibration level, the meters are registering very low. I hope that makes sense, but to be clear:
-9 dBFS @ 1 kHz out of my DAW, checked with test equipment = +7 dBu at the meter's input XLR
The meter then is pegged audibly.
How would you do that? +7 dBu should put the VU-meter at the +3 mark.

Lower the output of my DAW so that the confirmed +7 dBu signal is registering 0 VU.
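The level arithmetic in this exchange, sketched in Python (assuming the standard 0 VU = +4 dBu reference):

```python
# Measured: a -9 dBFS tone from the DAW arrives at the meter as +7 dBu,
# so digital full scale corresponds to +16 dBu on the line.
dbfs_tone = -9
dbu_at_meter = 7
dbu_at_full_scale = dbu_at_meter - dbfs_tone   # +16 dBu at 0 dBFS

# On a meter referenced to 0 VU = +4 dBu, that +7 dBu tone should sit at
# the +3 mark -- near full deviation, but not pinned against the stop.
vu_reading = dbu_at_meter - 4                  # +3 VU
```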
Ok great, thank you.