Measuring distortion with a soundcard and software

GroupDIY Audio Forum


RuudNL

Well-known member
Joined
Apr 26, 2009
Messages
3,274
Location
Haule / The Netherlands
I noticed that some people measure distortion with a soundcard and software (REW etc.).
The question is: what are you measuring? In fact you measure the combination of the device under test and the distortion of the soundcard! Although most manufacturers specify a THD value in the order of hundredths of a percent, it seems that in reality these values can be much higher. To get a better impression of the THD, I have tested a cheap (Behringer) soundcard and a better one.
I injected a 1 kHz sine wave with low distortion (<<0.01%) and analyzed the spectrum. The Behringer produced distortion (many harmonics of 1 kHz) with a total value of 0.8%. The Focusrite Scarlett 8i6 was a lot better, with a THD of 0.02%, its highest self-generated side product at -97 dB. Interesting...

Attachments: 120-b.png, 120-s.png (spectrum plots)
 
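For anyone who wants to reproduce this kind of check outside REW, here is a minimal sketch of the same idea in Python (NumPy assumed): take an FFT of a captured 1 kHz tone and sum the harmonic levels relative to the fundamental. The small distortion terms are synthesised here purely so the script runs on its own; in practice you would analyse a recording made through the soundcard input.

```python
import numpy as np

fs = 48000                       # sample rate, Hz
f0 = 1000                        # test-tone fundamental, Hz
t = np.arange(fs) / fs           # one second of samples

# Stand-in for a captured signal: a sine with tiny 2nd and 3rd harmonics added.
x = (np.sin(2 * np.pi * f0 * t)
     + 1e-4 * np.sin(2 * np.pi * 2 * f0 * t)
     + 5e-5 * np.sin(2 * np.pi * 3 * f0 * t))

# Window to reduce spectral leakage, then take the magnitude spectrum.
win = np.hanning(len(x))
spec = np.abs(np.fft.rfft(x * win))
freqs = np.fft.rfftfreq(len(x), 1 / fs)

def band_amp(f, width=5.0):
    """RMS-sum the bins within +/- width Hz of f (crude, but fine for a sketch)."""
    idx = (freqs > f - width) & (freqs < f + width)
    return np.sqrt(np.sum(spec[idx] ** 2))

fundamental = band_amp(f0)
harmonics = [band_amp(n * f0) for n in range(2, 10)]
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

print(f"THD = {100 * thd:.4f} %  ({20 * np.log10(thd):.1f} dB)")
```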
Ideally, one would first take a loopback measurement and create a calibration file; that would then be subtracted from subsequent measurements. Of course, that's more interesting to do when gain or attenuation is involved in the device under test...
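A rough sketch of that subtraction, assuming the loopback and DUT responses are already available as dB-versus-frequency data (the numbers below are invented; REW does this internally when you load a soundcard calibration file). As discussed further down the thread, this corrects frequency response only, not the interface's own distortion.

```python
import numpy as np

freqs = np.array([20.0, 100.0, 1000.0, 10000.0, 20000.0])   # Hz
loopback_db = np.array([-1.2, -0.1, 0.0, -0.05, -0.8])      # interface alone
measured_db = np.array([-4.0, -0.6, 0.0, -0.30, -3.5])      # interface + DUT

corrected_db = measured_db - loopback_db                     # DUT contribution only
for f, c in zip(freqs, corrected_db):
    print(f"{f:8.0f} Hz  {c:+.2f} dB")
```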
 
For the measurements I've posted (here) I'm using an iPhone SE's headphone output, with an app called 'f Generator Pro', as the source, and a Focusrite Clarett 8Pre Thunderbolt for capture.

At 100 mV RMS, 1 kHz, REW reports 0.00073% THD. Take that with a pinch of salt, but it's believably in the 0.001% (-100 dB) range. The spectrum analyser plot is attached below.

The -97 dB peak in @RuudNL's plot looks like a harmonic of the mains frequency, not a distortion product. REW doesn't include these in its THD figures (it gives THD+N separately), so it would be interesting to see what it would make of that spectrum.
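For reference, the percentage REW reports and the dB figure quoted above are related by 20*log10 of the ratio; a quick sanity check of those numbers:

```python
import math

def thd_percent_to_db(thd_percent: float) -> float:
    # THD expressed as a fraction of the fundamental, converted to dB
    return 20 * math.log10(thd_percent / 100.0)

print(thd_percent_to_db(0.00073))   # about -102.7 dB
print(thd_percent_to_db(0.001))     # exactly -100.0 dB
```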
 

Attachments: Signal source.png
I noticed that some people measure distortion with a soundcard and software (REW etc.).
The question is: what are you measuring? In fact you measure the combination of the device under test and the distortion of the soundcard!
Well, I see no issues using a decent audio interface and REW to measure THD. But indeed, not all audio interfaces perform well enough to be used for THD measurements. The example you provide (UMC202HD) isn't even half decent in stock form: close to 0 dBFS, THD rises sharply. It's full of hardware bugs, most notably the Vdd on the CS4272 codec chip, which is just 4.5 V where at least 4.75 V is required. Having fixed that, and some other bugs, you'll have a very good audio interface for audio measurements. I use one in my DIY Audio Analyzer. Loopback THD at 0 dBFS is 0.0016% (-95.1 dB) and at -3 dBFS just 0.00048% (-106.32 dB). Very usable, I would say. By contrast, a stock UMC202HD would produce a whopping 1.39% THD at 0 dBFS and still 0.0061% at -3 dBFS.

In this thread, post #19, you can download a zip file with a description of how I built my DIY Audio Analyzer and how I modified the UMC202HD.

Jan
 
I watched some videos from DIY Recording Equipment before starting to use REW for frequency response measurements of my mic build attempts. If the audio interface's THD is below -100 dB (calibrate it in REW first), it should be good enough even to properly calibrate levels for S/N with a simple voltmeter: a 1 kHz sine at 0 dBu is 775 mV RMS, about 1.096 V peak.
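Spelling out the level arithmetic above (0 dBu is defined as 0.775 V RMS, and a sine's peak is the RMS value times the square root of two):

```python
import math

V_REF_DBU = 0.775                         # 0 dBu reference level, volts RMS

def dbu_to_vrms(level_dbu: float) -> float:
    return V_REF_DBU * 10 ** (level_dbu / 20)

vrms = dbu_to_vrms(0.0)
vpeak = vrms * math.sqrt(2)
print(f"0 dBu = {vrms:.3f} V RMS, about {vpeak:.3f} V peak")   # 0.775 V RMS, ~1.096 V peak
```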
For simply measuring THD >= 0.5% and voltages above 6 V or so, I discovered my USB oscilloscope, a Digilent AD2, is good enough. It is supposed to be 14-bit.

To find the proper drain bias voltage in my mic build, the fastest way was to generate a 1 kHz sine wave at 100-300 mV, watch the RTA graph and turn the trim pot for the lowest 2nd harmonic (give the graph 1-2 seconds to update after each turn). Even though I didn't push the input voltage to the point where the output signal was clipping, the bias voltage found at the lower level seems to be just about right.
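If you would rather script that "watch the 2nd harmonic while turning the trimmer" step than stare at the RTA, here is a rough sketch. It assumes the python-sounddevice package, the interface's default input device, and a 1 kHz tone already driving the circuit; REW's RTA does the same job interactively.

```python
import numpy as np
import sounddevice as sd

fs, f0 = 48000, 1000

while True:                                        # stop with Ctrl-C
    rec = sd.rec(fs, samplerate=fs, channels=1)    # capture one second
    sd.wait()
    x = rec[:, 0]

    spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    fund = spec[np.argmin(np.abs(freqs - f0))]
    h2 = spec[np.argmin(np.abs(freqs - 2 * f0))]

    # Turn the trim pot for the most negative number printed here.
    print(f"H2 relative to fundamental: {20 * np.log10(h2 / fund):.1f} dB")
```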
 
For simply measuring THD >= 0.5% and voltages above 6 V or so, I discovered my USB oscilloscope, a Digilent AD2, is good enough. It is supposed to be 14-bit.

... And pretty much all decent USB audio interfaces are 24-bit. Granted, you might need to do some measuring of your own to know how dBFS translates to analog voltages, but still...
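One way to do that measuring, sketched out: generate a sine at a known dBFS level, read the interface's output with a true-RMS meter, and scale back to the equivalent 0 dBFS voltage. The meter reading below is a placeholder, not a measured value.

```python
test_level_dbfs = -6.0     # level of the generated tone, dBFS
measured_vrms = 1.23       # what the true-RMS meter reads (placeholder)

v_at_0dbfs = measured_vrms / (10 ** (test_level_dbfs / 20))
print(f"0 dBFS corresponds to roughly {v_at_0dbfs:.2f} V RMS at this output setting")
```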
 
The question is: what are you measuring? In fact you measure the combination of the device under test and the distortion of the soundcard!

No, you don't really do that if you use the facility to calibrate against the audio interface: essentially, you measure the interface in loopback and compensate for that in the result. It's a built-in function in REW. It's not perfect and you may need to adjust the sampling/windowing parameters to get an acceptable result - I've discussed this on here, as I was basically compensating for the primary LF roll-off but still had some ripple from the interface response, reduced if not eliminated by altering window sizes. This is stuff happening well below 20 Hz.
It's not an AP2 or a Prism Sound setup, but it's very useful for hobby and basic characterisation use.
 
Virtins Multi-Instrument software and a Focusrite Scarlett Solo can be used to get very accurate readings of distortion, with a floor of 0.0009%. Virtins has published test results with this and other soundcards. They have various A-to-D/D-to-A interfaces available that can function as oscilloscope and analyzer front ends, including DC. Many distortion and noise tests are included in the software, as well as automated programmable/selectable testing. https://virtins.com/applications.html
 
Ideally, one would first take a loopback measurement and create a calibration file; that would then be subtracted from subsequent measurements.

The two complications I think come along with that are:

1: loopback measurements show the combination of output and input non-linearity. One way to remove the output non-linearity is with passive filters after the output to remove the harmonic distortion of the output stage, but as far as I can see that would require using a limited number of test frequencies, and having switchable filters.

2: since you are trying to calibrate a non-linearity, the calibration will only be valid at one frequency and one level. There is probably a range where the calibration is close to accurate, but if you have to validate anyway, might as well just record all that data and have a collection of calibration files for various levels and frequencies. That seems cumbersome.
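To put a rough number on point 1: if the DAC and ADC distortion products were uncorrelated, the loopback THD would combine roughly as the root sum of squares of the two stages. In practice individual harmonics can add or partially cancel, so treat this as an order-of-magnitude picture only; the THD values below are hypothetical.

```python
import math

thd_dac = 0.0015 / 100     # hypothetical DAC (output) THD, as a fraction
thd_adc = 0.0010 / 100     # hypothetical ADC (input) THD, as a fraction

thd_loopback = math.sqrt(thd_dac ** 2 + thd_adc ** 2)
print(f"expected loopback THD on the order of {100 * thd_loopback:.4f} %")
```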
 
1: loopback measurements show the combination of output and input non-linearity. One way to remove the output non-linearity is with passive filters after the output to remove the harmonic distortion of the output stage, but as far as I can see that would require using a limited number of test frequencies, and having switchable filters.

wrt point 1 - I'm not understanding your point there. The Calibration file is derived from a frequency sweep.
 
The Calibration file is derived from a frequency sweep.
Perhaps I was incorrectly lumping the posts from Kron and Ruud together. Calibrating from a frequency sweep lets you correct uneven frequency response, but you will still be limited in noise and distortion measurements by the distortion performance of the audio interface.
Perhaps I was mistaken, but since Kron's post came right after Ruud's statement that "In fact you measure the combination of the device under test and the distortion of the soundcard!", I interpreted it as saying you could create a calibration file to subtract out the distortion of the audio interface, in an attempt to display only the distortion of the device under test.
 
Perhaps I was incorrectly lumping the posts from Kron and Ruud together. Calibrating from a frequency sweep lets you correct uneven frequency response, but you will still be limited in noise and distortion measurements by the distortion performance of the audio interface.
Perhaps I was mistaken, but since Kron's post came right after Ruud's statement that "In fact you measure the combination of the device under test and the distortion of the soundcard!", I interpreted it as saying you could create a calibration file to subtract out the distortion of the audio interface, in an attempt to display only the distortion of the device under test.

Pretty sure the calibration file in REW only deals with the frequency response, not the THD as well. You can make a random one and open it in a text editor, it's a plain-text list of frequencies and deviations from "flat".
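Based on that description, reading such a file might look like the sketch below. The real REW calibration file may carry a comment header and extra columns (phase, for instance), so treat this parser as illustrative only.

```python
def read_cal_file(path: str) -> list[tuple[float, float]]:
    """Return (frequency in Hz, deviation from flat in dB) pairs from a plain-text file."""
    pairs = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith(("*", "#", ";")):   # skip blank/comment lines
                continue
            freq_hz, deviation_db = (float(v) for v in line.split()[:2])
            pairs.append((freq_hz, deviation_db))
    return pairs
```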
 
Virtins Multi-Instrument software and a Focusrite Scarlett Solo can be used to get very accurate readings of distortion, with a floor of 0.0009%. Virtins has published test results with this and other soundcards.
I am surprised they used the headphone output rather than the line output for their tests. I could not find an explanation for this.

Cheers

Ian
 
I am surprised they used the headphone output rather than the line output for their tests. I could not find an explanation for this.
I suspect the headphone output has either lower distortion or lower noise because it is an unbalanced output. Or it could be as simple an explanation as they didn't have a female XLR on hand and couldn't be bothered buying one.

RH
 
I suspect the headphone output has either lower distortion or lower noise because it is an unbalanced output. Or it could be as simple an explanation as they didn't have a female XLR on hand and couldn't be bothered buying one.

It would be bizarre to have lower noise/distortion on a headphone output compared to a line output, IMO. And I'd say that regardless of whatever flavour of balanced output a soundcard has, the line output(s) are still the outputs that matter.
And given the kit they sell it would also seem bizarre to do all the measurements but not bother to obtain the appropriate connectors.
Interesting kit though. Thanks for the heads up.
 