A circuit that automatically cancels out the DC offset voltage of an op amp?

abbey road d enfer

Well-known member
Jan 22, 2008
Dualflip said:

What's the benefit VS a conventional DC servo?

What about stability?
That's an interesting trick. Stability is unquestionable since it involves only one opamp.
The main drawback is limited capability: this wouldn't work with a power amp. Here it works because the opamp only needs to deliver a small current (1 mA to correct 47 mV of offset), but in a power amp it would need to be capable of delivering amps.
Even using it to servo a DOA wouldn't work. The typical output Z is about 0.01 ohm, so correcting 1 mV of offset would require injecting 100 mA. The output impedance would have to be artificially increased by adding a series resistor, which is not what you want when driving a transformer.
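The arithmetic behind that claim is just Ohm's law applied to the output node (a quick sanity check, using the figures from the post):

```python
# Correction current = offset voltage / output impedance.
z_out = 0.01       # ohms: typical DOA output impedance, per the post
v_offset = 1e-3    # volts: 1 mV of offset to cancel

i_required = v_offset / z_out
print(i_required)  # → 0.1 (amps), i.e. 100 mA
```

Which is why the low output impedance that makes a DOA good at driving loads is exactly what makes it hard to servo this way.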


Well-known member
Jun 3, 2004
The Netherlands
JohnRoberts said:
Operating in the digital domain with some rudimentary logic (maybe a small microprocessor), you can use the output of the ADC itself to perform a self-calibration with the input shorted. Using digital pots you can correct a decent range of DC offset errors. This will be fine for relatively short-term measurements, but long-term stability has other variables, such as the temperature coefficients of components and even devices.

Not necessarily with shorted inputs: a digital implementation (at least as in my patent) can operate in the presence of signal.
As said, in essence it's a digital implementation of a DC servo. (Whether that's suited to the TS's circuit is another matter.)
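For readers unfamiliar with the shorted-input variant John describes, here is a minimal sketch of such a trim loop. Everything here is hypothetical: `read_adc()` and `set_pot()` stand in for whatever hardware interface the real design uses, and the offset and step values are made up for the simulation.

```python
# Hypothetical auto-zero loop: with the input shorted, step a digital
# pot until the ADC's residual offset reading changes sign.
TRUE_OFFSET_MV = 4.7   # the unknown offset we want to null (simulated)
POT_STEP_MV = 0.1      # correction per digital-pot code step (assumed)

pot_code = 0

def set_pot(code):
    global pot_code
    pot_code = code

def read_adc():
    # Input shorted, so the ADC sees only the uncorrected residual.
    return TRUE_OFFSET_MV - pot_code * POT_STEP_MV

for _ in range(256):
    residual = read_adc()
    if abs(residual) <= POT_STEP_MV / 2:
        break  # nulled to within half a pot step
    set_pot(pot_code + (1 if residual > 0 else -1))

print(abs(read_adc()) <= POT_STEP_MV / 2)  # → True
```

As John notes, a trim like this only holds for as long as the temperature-driven drift stays small relative to one pot step.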

FWIW, auto-zeroing (for the balanced circuit as was the DUT for the patent) can elegantly be done by a cross-switch before the ADC, and 'correcting' in the SigmaDelta-bitstream by simply inverting the bitstream. (Think of it as reversing up/down-counting when the analog cross-switch is in cross-position.)
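The bit-inversion part of that trick can be shown with a toy first-order modulator (my sketch, not the patented circuit): putting the analog cross-switch in the cross position negates the signal at the ADC, and the resulting sigma-delta bitstream is exactly the bitwise complement of the straight-through stream, so it can be "un-crossed" digitally.

```python
import math

def sigma_delta(samples):
    """First-order sigma-delta modulator producing a 1-bit stream."""
    integ, fb, bits = 0.0, 0.0, []
    for x in samples:            # x expected in [-1, 1]
        integ += x - fb          # integrate error vs. the 1-bit DAC
        bit = 1 if integ >= 0 else 0
        fb = 1.0 if bit else -1.0
        bits.append(bit)
    return bits

signal = [0.3 * math.sin(0.7 * n + 0.1) for n in range(200)]
straight = sigma_delta(signal)
crossed = sigma_delta([-x for x in signal])  # cross-switch flips polarity

# Correcting in the bitstream: flip every bit of the crossed stream.
print([1 - b for b in crossed] == straight)  # → True
```

Flipping the bits is the same operation as reversing the up/down counting direction in a decimating counter, which is the way the post describes it.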


Matt Syson

Well-known member
Dec 17, 2005
Please excuse the digression, but I wonder how many 'fancy' audio A/D converter implementations actually use the auto-zero calibrate function after the whole unit has reached its 'normal' operating temperature. To my mind, failing to do this somewhat nullifies the usual advertised 'high quality' performance of the unit, if the calibration carried out at switch-on (presumably cold) is then subject to drift as the unit warms up by a good 30 °C.
Are the offset and drift of the circuit posted by Jakob going to be better than the inherent drift of a 'better'-designed amplifier stage, unless you use a higher-precision servo amplifier?