I wouldn't use that meter, since it relies on the linearity of the FET to measure accurately. At least, that isn't the usual way to do it: it's just a dB converter measuring the bias of the FET, and how do you do zero calibration there? The usual way is to duplicate the gain cell and have a DC pilot passed through the same divider as the signal; adjusting the gain of that DC path gives you tracking, and adjusting the DC level gives you the 0 dB calibration.
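In other words, the duplicated-cell approach boils down to two trims: a gain trim so the DC pilot tracks the signal path, and a DC trim that sets which input level reads 0 dB. A minimal numeric sketch of what that 0 dB calibration means (the function and the 0.775 V reference are my illustration, not taken from any particular meter IC):

```python
import math

def level_db(v_in, v_ref):
    """Meter reading in dB relative to the calibrated 0 dB reference."""
    return 20.0 * math.log10(v_in / v_ref)

# "Zero calibration" just means trimming v_ref so a known 0 dB input
# (here 0.775 V RMS, i.e. 0 dBu, purely as an example) reads 0 dB.
v_ref = 0.775

print(level_db(0.775, v_ref))  # 0.0 dB at the reference level
print(level_db(1.55, v_ref))   # ~ +6 dB for a doubling of voltage
```

Without that trim, any shift in the FET's bias point shifts where "0 dB" lands on the scale, which is exactly the precision problem below.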
Here you don't have a 0 dB calibration, so a different bias at the FET will give you different 0 dB readings. That's probably OK, but I wouldn't trust it if I'm looking for precision in the meter. It also doesn't let you use it as a level meter for I/O, as it's used there, which would probably be very useful.

I don't know if I really want meters at all; if I did have them I would add a switch to disable them. It's well proven that our eyes trick our ears: maybe you like how it sounds, but you see there's too much compression and you touch it just because. I'm fine with clip indicators, maybe a gain-reduction indicator showing the threshold has been reached, but I wouldn't go further, or at least I'd have the option to disable them. Measurements can be useful from time to time, to check that your signal levels aren't all over the place and you're not giving away all of your dynamic range, or that you're not about to clip and mess up your mix, but that's a philosophical question way outside this topic.
You could probably replace the meter IC with another IC; I don't see a problem there. Just make sure the main specs are the same — you can often swap ICs to get your desired range and scale.
JS