Can I accurately measure gain by sending a sine wave from my interface through a DI to the preamp and measuring the input and output of the preamp with an oscope?
Accurately is a big word, and most 'scopes aren't too good at showing mV-level signals, but for low to moderate gains this procedure should work fine. For higher gains you can build a simple U-pad (like Example 1 from
this page), connect it between the DI and the pre, and have the scope measure the difference in level between the signal before the pad and the signal after the pre. The mic pre's gain is then the measured difference plus the pad's attenuation. Oh, and when measuring with scopes, ground levels and probe grounds may matter (unless you happen to have a few differential probes lying around, or you are sure your scope is one of the few models with truly independent floating inputs).
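The arithmetic is just decibel addition. A minimal sketch (Python, purely illustrative; the voltage readings and the 40dB pad value are made-up numbers, not measurements):

```python
import math

def db(ratio):
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

# Hypothetical scope readings (RMS volts) and pad attenuation.
v_before_pad = 1.0       # level measured before the pad
v_after_pre = 0.5        # level measured at the preamp output
pad_attenuation_db = 40  # attenuation of the U-pad, in dB

# Mic pre gain = measured level difference + pad attenuation.
measured_difference_db = db(v_after_pre / v_before_pad)  # -6 dB here
preamp_gain_db = measured_difference_db + pad_attenuation_db

print(f"Preamp gain: {preamp_gain_db:.1f} dB")  # -> 34.0 dB
```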
(A simpler setup may be to work the other way around. Build a few pads, say 20dB, 40dB, and 60dB, plus a Y-cable. Put one of the pads before the pre, and connect the Y-cable to your signal source, the pad, and a VU meter. Connect the output of the pre to another VU meter. Adjust the pre's gain until both meters show the same level; at that setting the gain of the pre equals the attenuation of the pad. You can substitute metered DAW input channels for the VU meters, of course.)
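If you're building those pads yourself, the attenuation of a plain two-resistor divider follows directly from the divider ratio. A rough sketch, ignoring source and load impedance (which a proper pad design should account for); the 100-ohm shunt value is just an example:

```python
import math

def divider_attenuation_db(r_series, r_shunt):
    """Attenuation of a resistive divider: Vout/Vin = Rshunt / (Rseries + Rshunt)."""
    return -20 * math.log10(r_shunt / (r_series + r_shunt))

def series_for_attenuation(r_shunt, atten_db):
    """Series resistor needed for a given attenuation with a chosen shunt resistor."""
    ratio = 10 ** (atten_db / 20)  # Vin / Vout
    return r_shunt * (ratio - 1)

for target_db in (20, 40, 60):
    r_shunt = 100.0  # ohms, illustrative
    r_series = series_for_attenuation(r_shunt, target_db)
    print(f"{target_db} dB pad: Rseries = {r_series:.0f} ohms, Rshunt = {r_shunt:.0f} ohms "
          f"(actual {divider_attenuation_db(r_series, r_shunt):.2f} dB)")
```

For 20dB this gives a 900-ohm series resistor against the 100-ohm shunt (divider ratio 10:1), 9900 ohms for 40dB, and 99900 ohms for 60dB.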
At min gain I got nothing on the graph when I ran the analysis, which tells me that my method is not correct, because according to you and everyone else online, noise should be worse at min gain...
No, that's to be expected. Noise is referred to the input of the device. Take the THAT1510: according to its data sheet it has 17dB more input-referred noise at a gain of 20dB than at a gain of 60dB. However, with no signal present its output will show 23dB more noise at 60dB gain than at 20dB gain, because output noise is input-referred noise plus gain, and the extra 40dB of gain outweighs the 17dB drop in input noise. This may sound weird, but consider that (barring clipping) an input signal will appear 40dB stronger at the output at 60dB gain than at 20dB gain, while the noise only rises by 23dB, so the signal-to-noise ratio improves by 17dB at the higher gain, as expected.
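To make that arithmetic explicit, a small sketch (the absolute EIN figures below are placeholders; only the 17dB spread between them comes from the text above, not from exact data-sheet numbers):

```python
# Input-referred noise (EIN) at two gain settings. Absolute values are
# assumptions for illustration; the 17 dB spread is from the discussion above.
ein_at_20db_gain = -111.0  # dBu, assumed EIN at 20 dB gain
ein_at_60db_gain = -128.0  # dBu, assumed EIN at 60 dB gain (17 dB quieter)

# Output noise = input-referred noise + gain.
out_noise_20 = ein_at_20db_gain + 20  # -91 dBu
out_noise_60 = ein_at_60db_gain + 60  # -68 dBu
noise_rise = out_noise_60 - out_noise_20
print(f"Output noise rises by {noise_rise:.0f} dB from 20 dB to 60 dB gain")  # 23 dB

# The signal rises by the full 40 dB of extra gain, so SNR improves.
snr_improvement = 40 - noise_rise
print(f"SNR improves by {snr_improvement:.0f} dB at the higher gain")  # 17 dB
```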
JDB.