> would it work....
What? Force the output to zero VDC? Sure. Or at least to mV, which is close enough.
> would it work reliably.
It "fails" miserably on cold-start. The tubes have no control when cold, so for the first 10 seconds strange things happen. I think in this ploy the FET will pull full-scale DC current through the core.
The chip injects noise. At this low gain, who cares? But in the general case, you want attenuation between the servo output and the input. The worst-case tube offset is maybe 1V, maybe 0.1V aged and trimmed. The opamp can swing +/-10V, so you can afford 10:1 or even 50:1 attenuation between the sand's flaws and the grid: 10V through 10:1 still leaves 1V of correction at the grid, the full worst case, and through 50:1 leaves 0.2V, twice the trimmed case. (You have 6:1, a good start, but be bold.) That's loss in the servo loop, but even dime opamps have DC gain of a million, and you only need total loop gain near 1,000 to knock tube errors down to chip errors and far below the tolerance of the output iron.
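Back-of-envelope, using the round numbers above (the swing, offsets, and open-loop gain are the assumptions from that paragraph, not measurements):

```python
# Servo headroom and loop-gain arithmetic from the paragraph above.
opamp_swing = 10.0    # V, assumed +/-10 V op-amp swing
offset_worst = 1.0    # V, worst-case tube offset
offset_trimmed = 0.1  # V, aged and trimmed

for atten in (6, 10, 50):
    correction = opamp_swing / atten  # correction range left at the grid
    print(f"{atten:>2}:1 pad -> +/-{correction:.2f}V at the grid "
          f"({correction / offset_worst:.1f}x worst case, "
          f"{correction / offset_trimmed:.0f}x trimmed)")

A_OL = 1e6  # "dime opamp" DC gain
print(f"loop gain left after a 50:1 pad: {A_OL / 50:,.0f} (need ~1,000)")
```

Even behind the 50:1 pad you keep 20,000 of gain, twenty times what you need.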
Note that you still have a cap. The question is whether a 1uFd tastes better than the 10uFd-50uFd you would need to just AC-couple the transformer or direct output. (You don't really want "DC performance", and you won't get it with the servo.)
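For scale, the corner frequencies each route buys you. The 10k load on the coupling cap and the 1 Meg integrator resistor in the servo are assumptions, not values from the schematic:

```python
from math import pi

def corner_hz(r_ohms: float, c_farads: float) -> float:
    """First-order RC corner: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2 * pi * r_ohms * c_farads)

# Straight AC coupling: big cap into an assumed 10k load.
for c_uf in (10, 50):
    print(f"{c_uf} uFd into 10k: corner ~{corner_hz(10e3, c_uf * 1e-6):.2f} Hz")

# Servo route: the 1 uFd lives in the integrator, against an assumed 1 Meg resistor.
print(f"1 uFd with 1 Meg integrator R: corner ~{corner_hz(1e6, 1e-6):.2f} Hz")
```

Either way a cap sets your low-frequency behavior; the servo just lets one small, good cap do the job of a big electrolytic.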