My simulations in SPICE show just the opposite: noise and distortion are far lower with a simple step-up transformer (using a non-linear SPICE model of the transformer) followed by a discrete buffer than with an active I/V stage copied from the DAC datasheet.
First of all, I think you're misunderstanding the problem. What we are concerned with is the interface between the DAC and the I/V converter. You cannot simulate or measure the I/V converter alone and conclude that all is well. The output terminals of the DAC must not see any appreciable voltage swing, or excess distortion will be generated, because the output impedance of the DAC is both word- and output-voltage-dependent.
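To make the mechanism concrete, here is a small numeric sketch (all component values and the conductance law are my own illustrative assumptions, not measurements of any real DAC): the current-output DAC is modelled as an ideal current source shunted by a voltage-dependent output conductance G(v) = G0*(1 + alpha*v). If the I/V stage holds the output node near 0 V, the nonlinearity is never exercised; if the node is allowed to swing, G(v) modulates the signal and harmonics appear.

```python
import numpy as np

# Hypothetical model (assumed values): 1 kOhm nominal DAC output
# impedance with a voltage-dependent term, driven by a 1 mA peak
# sine. Compare the distortion with a swinging load vs. a load that
# approximates a virtual ground.

fs = 1_000_000                    # sample rate, Hz
f0 = 1_000                        # test tone, Hz
n = (fs // f0) * 100              # integer number of cycles for a clean FFT
t = np.arange(n) / fs
i_src = 1e-3 * np.sin(2 * np.pi * f0 * t)   # signal current, A

G0 = 1e-3                         # nominal output conductance (1 kOhm)
alpha = 0.5                       # assumed nonlinearity coefficient, 1/V

def node_voltage(r_load):
    """Solve v = R_L*(i_src - G0*(1 + alpha*v)*v) per sample (quadratic in v)."""
    a = G0 * alpha * r_load
    b = 1.0 + G0 * r_load
    c = -r_load * i_src
    return (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

def thd_db(v):
    """Energy in harmonics 2..5 relative to the fundamental, in dB."""
    spec = np.abs(np.fft.rfft(v * np.hanning(n)))
    k = round(f0 * n / fs)                       # fundamental FFT bin
    fund = spec[k - 2:k + 3].max()
    harm2 = sum(spec[m * k - 2:m * k + 3].max() ** 2 for m in range(2, 6))
    return 20 * np.log10(np.sqrt(harm2) / fund)

thd_swing = thd_db(node_voltage(600.0))  # e.g. a 600 Ohm transformer primary
thd_vgnd = thd_db(node_voltage(10.0))    # near-virtual-ground summing node

print(f"600 Ohm load: {thd_swing:6.1f} dB THD")
print(f" 10 Ohm load: {thd_vgnd:6.1f} dB THD")
```

With these assumed numbers the swinging node distorts tens of dB worse than the low-impedance one, even though the DAC's signal current is identical in both cases. That is the whole point: the distortion lives at the DAC/I-V interface, not inside the I/V stage itself.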
Second, it must be appreciated to what extent distortion is actually captured by which models. Standard opamp macromodels do not include nonlinear behaviour beyond basic input/output limitations such as clipping, output current limiting and slew rate. I have no experience with LTspice's non-linear transformer model, but unless you have compared simulated and measured distortion figures for a specific transformer, you have no idea whether you're on the right track. How did you determine the parameters of this model?
This means that you simply cannot simulate the performance difference between an active I/V converter and a transformer implementation. You have to build both and measure them.
Samuel