I just finished testing and settling on a scale and resistor value for my Behringer meter. Since the scale that came with the B* meter has 12 divisions on it, I decided to make each division theoretically represent -1dB of GR. As a result, the -6dB point is straight up... this is the point to which I calibrated my meter. With no GR, my output with a 1kHz input signal was 1VAC RMS (measured with my Fluke 111). Then I went through each division, recorded the voltage, and calculated the amount of GR in dB using the formula 20log(VwithGR/VnoGR).
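The voltage-to-dB conversion above is easy to sketch in a few lines of Python, assuming the same 1VAC RMS no-GR reference used here:

```python
import math

def gain_reduction_db(v_with_gr, v_no_gr=1.0):
    """dB of gain reduction relative to the no-GR output (1 VAC RMS here)."""
    return 20 * math.log10(v_with_gr / v_no_gr)

# Example: the tick-6 reading from the table below
print(round(gain_reduction_db(0.499), 2))  # -6.04
```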

As previously stated, this is not a linear meter, but the results do provide me with a very good idea where the meter is accurate and where it is not.

I'll provide results in both output voltage and dB.

Tick 1: 0.859VAC, -1.32dB
Tick 2: 0.748VAC, -2.52dB
Tick 3: 0.675VAC, -3.41dB
Tick 4: 0.607VAC, -4.34dB
Tick 5: 0.548VAC, -5.22dB
Tick 6: 0.499VAC, -6.04dB
Tick 7: 0.445VAC, -7.03dB
Tick 8: 0.390VAC, -8.18dB
Tick 9: 0.333VAC, -9.55dB
Tick 10: 0.270VAC, -11.37dB
Tick 11: 0.217VAC, -13.27dB
Tick 12: 0.161VAC, -15.86dB

From these results, you can see where the meter behaves most linearly and where it comes closest to the theoretical values. Notice that once the meter deflects to tick 9 and beyond, it's very non-linear. It appears to be most linear between ticks 3 and 8, and most accurate between ticks 5 and 8, which is where I will probably use the meter most of the time.
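A quick way to see this is to compare each tick's measured GR against the ideal -1dB-per-division scale. A short sketch using the dB values from the table above:

```python
# Deviation of each tick's measured GR from the ideal -1 dB/division scale.
measured_db = [-1.32, -2.52, -3.41, -4.34, -5.22, -6.04,
               -7.03, -8.18, -9.55, -11.37, -13.27, -15.86]

for tick, db in enumerate(measured_db, start=1):
    error = db - (-1.0 * tick)  # negative = meter under-reads the actual GR
    print(f"tick {tick:2d}: measured {db:6.2f} dB, error {error:+.2f} dB")
```

Running this shows the error staying within about 0.2dB from tick 5 through tick 8, then blowing up past tick 9 (almost -4dB off by tick 12), which matches what I saw.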

So the resistor value for a (theoretical) 12dB scale, with 6dB of GR accurately deflecting to 50%, is 3.3k total: I had the 2k in the Main PCB, plus a pot trimmed to 1.3k in series with the meter's V-.
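Just to spell out the arithmetic on the series string (values as described above):

```python
# Series resistance feeding the meter: 2k on the Main PCB plus a pot
# trimmed to 1.3k in series with the meter's V-.
r_total = 2_000 + 1_300
print(r_total)  # 3300 ohms, i.e. the 3.3k quoted for the 12dB scale
```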

I hope this is useful and might clear up some confusion.