I am a little confused on the procedure, and I do apologize for my lack of understanding of something that is probably very simple. The Mix Pre 3 would be generating the tone, and then I would measure the resistance between + and shield on the output connection while signal is present, is this correct? My thought was to use a 1k multi-turn trimmer between pin 3 and pin 1 and adjust it while watching pink noise on an analyzer until the level and response were optimal. Is this kind of what you are saying?
Measuring the output impedance of a single-ended output is a fairly simple voltage divider calculation, where R1 is the output impedance of the device and R2 is the load impedance. If you start with R2 relatively large, like the 10K input of your typical audio interface, there will be little to no drop in voltage, so that reading gives you the input voltage of the divider. Since you know the input voltage and you control R2, you can measure the voltage drop as you make R2 smaller. So just add a resistor across the output to ground until the voltage drops by ~20-50%. Then take the input voltage, R2 and the output voltage and put them into an online voltage divider calculator like this one:
https://www.rapidtables.com/calc/electric/voltage-divider-calculator.html
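If you would rather skip the web calculator, the same divider math is only a couple of lines of Python. Here is a minimal sketch; the 1.00 V / 0.70 V / 470 ohm numbers are made-up example readings, not measurements from your unit:

def output_impedance(v_in, v_out, r2):
    # Voltage divider: v_out = v_in * r2 / (r1 + r2), solved for r1 (the source's output impedance)
    return r2 * (v_in - v_out) / v_out

# Hypothetical readings: 1.00 V unloaded, 0.70 V with a 470 ohm resistor across the output
print(output_impedance(1.00, 0.70, 470))  # about 201 ohms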
However, there are a few pitfalls here.

One is that you want to use a fairly high-level, steady signal like a sine wave at, say, 100Hz. If the level is too low, the level measurements will be less accurate. If you use something like a noise source, as you suggested, the level will not be steady. And if you use a DMM, the AC range on your typical meter is only accurate at low frequencies.
Two is that, if you don't have a DMM or oscilloscope, you might only be able to view dB levels in your DAW or similar. The solution is to convert the dB values to a voltage ratio using a different online calculator like this one:
http://www.sengpielaudio.com/calculator-amplification.htm

For example, let's say you set up your device to put out a high-level sine at 100Hz and adjust the input level in your DAW to make it 0dB. Then you add an R2 across the device output of, say, 680 ohms and you see the level in the DAW drop to -4 dB. Now you go to the above calculator, put in 1 for the "Input", -4 for "Level change" and click on "calculate value 2". That gives you a ratio of 1 / 0.63. Now you can go to the voltage divider calculator and use the ratio as the "voltages". Specifically, put in Vin = 1, R2 = 680, Vout = 0.63 and then click on "Calculate" to populate R1, which results in 399 ohms. So 399 ohms would be the output impedance of this imaginary device.
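For what it's worth, here is the same worked example done in Python instead of the two web calculators, so you can sanity-check your own numbers (the -4 dB and 680 ohm figures are the made-up values from above):

def db_to_ratio(db_change):
    # A level change in dB expressed as a voltage ratio (-4 dB -> about 0.63)
    return 10 ** (db_change / 20)

def output_impedance(ratio, r2):
    # From v_out / v_in = r2 / (r1 + r2):  r1 = r2 * (1 / ratio - 1)
    return r2 * (1 / ratio - 1)

ratio = db_to_ratio(-4)
print(round(ratio, 2))                      # 0.63
print(round(output_impedance(ratio, 680)))  # about 398 ohms; the calculator's 399 comes from rounding to 0.63 first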
However, I somewhat doubt that the output impedance of the Sound Devices stereo out is really "500 Ohms". That's probably just the minimum load it can safely drive. So I would make R2 470 ohms and see if you get a drop of 20% or so. It's very likely that you will not, and that you will need to go down to 200 ohms or lower, and that you will find the output impedance is actually more like 100 ohms or lower still.
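To give a feel for what to expect with that 470 ohm load, here is a quick sketch of the drop you would see for a few hypothetical output impedances (not measured Sound Devices values):

import math

def drop_db(r1, r2=470):
    # Level drop when a source with output impedance r1 is loaded by r2
    return 20 * math.log10(r2 / (r1 + r2))

for r1 in (500, 200, 100):  # hypothetical output impedances, in ohms
    print(r1, round(drop_db(r1), 1))
# 500 -> about -6.3 dB, 200 -> about -3.1 dB, 100 -> about -1.7 dB

So if the output impedance really were 500 ohms you would see roughly a 6 dB drop, while something around 100 ohms would only drop by a bit under 2 dB.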
There is a third pitfall here, which is that all of the above does not work if the output is already impedance balanced. Clearly the stereo out on that Sound Devices unit is not already impedance balanced, because the sleeve of the jack is grounded to the chassis and shared by both outputs, so impedance balancing could not work there; you need separate resistors to ground (which is precisely what you're trying to do with your special cable in this case). But in most other cases, the outputs of modern devices are already impedance balanced, and so the above procedure is insufficient.