Parameterization of DAC measurements using REW RTA and their impact on the results

johny_2000

New Member
Thread Starter
Joined
May 15, 2024
Posts
21
Hi all,

I need help understanding the parameterization of DAC measurements using REW RTA.
I ran a series of measurements on a test DAC. In this case,
DAC = RME ADI-2/4 Pro SE, XLR output +19 dBu = 6.904 Vrms.
ADC = E1DA Cosmos ADCiso, Mono mode, 3.500 Vrms input = 13.1 dBu.
AES17 Notch = E1DA Cosmos APU, +0 dB gain.

I settled on the following measurement method in REW RTA (latest stable version):
- Generator 1 kHz, RTA FFT (64k), 0 dBFS, "FS Sine Vrms" = 6.904 V.
- Distortion settings: 'High pass' / 'Low pass' 20 Hz ... 20,000 Hz.
- Checkbox "Manual fundamental" = 6.904 V.
- Checkbox "Use AES17-2015 Standard notch".
- RTA window, top right side: "FS Sine Vrms" = 3.500 V.
- RTA input "Cal Data" has the APU notch frequency response calibration file selected.

So here are the results:
ADI-2_4.jpg


Does everything look correct in this measurement?
I am confused by the over-optimistic "THD+N" value (-128 dB) from this DAC.
 

johny_2000

Next step: I'll take a True RMS multimeter and measure the following voltages:
- DAC output voltage,
- AES17 Notch output voltage = ADC input voltage,
- Actual ADC 0 dBFS voltage

I have a feeling that there are some (minor) differences in the I/O characteristics of the devices.
Here's why I think so:
Notch.jpg

Signal path amplitudes: +19 dBu (DAC) - 30 dB + 20 dB (notch) = +9 dBu at the ADC input.

ADC: +13.1 dBu (3.5 Vrms input) = 0 dBFS; a +9 dBu signal should therefore read -4.1 dBFS.
The RTA actually shows -5.28 dBFS, which is 1.18 dB lower than the device specs predict.
So either the DAC output voltage differs, the AES17 notch filter has more or less gain than expected, or the ADC's actual 0 dBFS voltage is different.
Or a combination of all of these factors.
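The level budget above can be sanity-checked numerically. A minimal sketch, assuming the standard pro-audio reference 0 dBu = 0.7746 Vrms (the function names are mine, not REW's):

```python
import math

# Assumed reference: 0 dBu = 0.7746 Vrms (sqrt(0.6) V, standard pro-audio convention).
def dbu_to_vrms(dbu):
    return 0.7746 * 10 ** (dbu / 20)

def vrms_to_dbfs(vrms, fs_vrms):
    return 20 * math.log10(vrms / fs_vrms)

dac_out = dbu_to_vrms(19.0)                # RME +19 dBu setting, approx. 6.904 Vrms
adc_fs = dbu_to_vrms(13.1)                 # Cosmos 0 dBFS point, approx. 3.50 Vrms
after_notch = dbu_to_vrms(19.0 - 30 + 20)  # +9 dBu nominal at the ADC input

expected_dbfs = vrms_to_dbfs(after_notch, adc_fs)
print(round(dac_out, 3), round(adc_fs, 2), round(expected_dbfs, 2))
# prints 6.904 3.5 -4.1
```

The -4.1 dBFS result matches the expectation from the specs, so the 1.18 dB shortfall must come from the hardware, not the arithmetic.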
 

johny_2000

Also, I'm curious about these two settings in the Analysis Preferences tab:
"Limit cal data boost to 20 dB" and "Apply cal files to distortion"
My AES17 notch calibration file contains values as high as 30 dB at certain frequencies.

I don't have a PC nearby to see what they are currently set to, but how do I set them up correctly in my case?
Should I enable both of them?
 

John Mulcahy

REW Author
Joined
Apr 3, 2017
Posts
7,996
My AES17 Notch calibration file reaches 30 dB values at certain frequencies.
Then you should not select the option to limit the cal data boost. Applying cal files to distortion is appropriate if your notch has any significant attenuation of harmonics.
 

johny_2000

Then you should not select the option to limit the cal data boost. Applying cal files to distortion is appropriate if your notch has any significant attenuation of harmonics.
Thank you, John.

I unchecked the option to limit the cal data boost and checked the option to apply cal data to distortion. I also calibrated the output levels of the signal generator and the ADC. None of these made any difference to the RTA's distortion report. The only option that significantly changed it was "Manual fundamental", which I ended up unchecking because I don't know how to enter the value correctly.

I have a notch filter in my circuit that attenuates the signal generator's fundamental by 10 dB:
+19 dBu (~6.91 Vrms) in, +9 dBu (~1.96 Vrms measured) out.

Could you help me with it, please? What value should I enter in the "Manual fundamental" field?
 

johny_2000

I also have a calibration file applied to the input device channel that contains the following information:
12 70 0
18 35 0
20 33 0
22 31 0
25 30 0
30 30 0
40 30 0
60 30 0
100 30 0
150 30 0
250 30 0
500 30 0
700 30 0
800 30 0
900 25 0
950 20 0
995 0 0
1000 0 0
1005 0 0
1100 25 0
1200 30 0
1300 30 0
1500 30 0
2000 30 0
2500 30 0
5000 30 0
10000 30 0
11000 30 0
15000 30 0
19000 32 0
20000 33 0
21000 34 0
24000 36 0
35000 70 0
60000 80 0
100000 100 0
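The three columns above follow the REW calibration-file layout: frequency (Hz), gain (dB), phase (degrees). A hypothetical sketch for inspecting such a file, using a subset of the values above and linear interpolation to read off the correction applied at the notch centre and at a harmonic (the function names are my own):

```python
# Parse "frequency gain phase" lines and interpolate the gain curve.
def load_cal(lines):
    pts = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 2:
            pts.append((float(parts[0]), float(parts[1])))
    return pts

def gain_at(pts, f):
    # Linear interpolation between adjacent calibration points.
    for (f0, g0), (f1, g1) in zip(pts, pts[1:]):
        if f0 <= f <= f1:
            return g0 + (g1 - g0) * (f - f0) / (f1 - f0)
    raise ValueError("frequency outside cal range")

cal = load_cal(["995 0 0", "1000 0 0", "1005 0 0",
                "1100 25 0", "1200 30 0", "2000 30 0", "2500 30 0"])
print(gain_at(cal, 1000))   # 0.0 dB at the notch centre
print(gain_at(cal, 2000))   # 30.0 dB correction at the 2nd harmonic
```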
 

John Mulcahy

I also have a calibration file applied to the input device channel that contains the following information:
That is bizarre. The cal file should contain the gain response of the notch filter. See the help under Manual fundamental:

A value for the fundamental level may be entered for use in harmonic distortion calculations when the fundamental is attenuated by a notch filter. The level is in Vrms and should take account of any gain introduced after the notch. That ensures REW uses the correct fundamental level when calculating distortion. To compensate for the effect of the notch filter on harmonic levels its response can be loaded as a calibration file. For example, make a sweep measurement of a loopback connection firstly with the notch filter in place, then another without it, then use the Trace arithmetic feature of the All SPL graph to generate (notch response)/(no notch response) and export that result as a text file to be loaded as a mic cal file. Alternatively just measure the notch response, offset the measurement so that the dB values correctly reflect the notch filter loss at the harmonics (the fundamental is not critical since Manual fundamental deals with that) and export that offset notch response as text and load it as the mic cal file. The harmonic levels will then be corrected to allow for the filter's attenuation. The fundamental levels shown by the graph trace will typically be in error due to shifts in the centre frequency of the notch, but they will not be used if Manual fundamental is selected.
 

johny_2000

That is bizarre. The cal file should contain the gain response of the notch filter.
This is essentially the gain response of an active notch filter. I also made my own frequency response file from a REW sweep, with the 1 kHz point adjusted to 0 dB; it had many more frequency points, but the difference was negligible (a fraction of a dB). I tried both as the RTA input calibration file, and the distortion panel readings differed by only a fraction of a dB.

See the help under Manual fundamental:
Thank you. I have read this section very carefully and also made True RMS DMM measurements of the signal path.

Can you please tell me which voltage I should enter in the "Manual fundamental" field?
Before notch (DAC output) = 6.904 Vrms @ 1kHz
After notch (ADC input) = 1.956 Vrms @ 1kHz
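A quick cross-check of those DMM readings: with the ADC's 0 dBFS point at 3.500 Vrms and no gain stage after the notch, the after-notch voltage can be converted to a dBFS level and compared against what the RTA displays. A sketch, assuming those two measured values:

```python
import math

adc_fs_vrms = 3.500          # ADC 0 dBFS point
after_notch_vrms = 1.956     # DMM reading at the ADC input

dbfs = 20 * math.log10(after_notch_vrms / adc_fs_vrms)
print(round(dbfs, 2))        # prints -5.05, close to the RTA's -5.28 dBFS
```

The measured 1.956 Vrms lands near the observed -5.28 dBFS reading, whereas a nominal +9 dBu (about 2.18 Vrms) would predict -4.1 dBFS, which points at the notch path delivering slightly less level than its nominal spec.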
 

Attachments

  • APU+0dB_Notch.txt
    3.5 MB · Views: 11,018

johny_2000

And one more noob question:
1725220181995.png

If I can't get a 0.0 dBFS reading at my ADC input from the signal path gain, how do I use the -5.28 dBFS fundamental reading in the THD+N calculation?
For example, N+D (A-wt): -124.5 dBFS - (-5.28 dBFS) = -119.22 dB relative to the fundamental, or am I wrong?
And what about THD+N (no-wt) = -118.1 dB; do I subtract/add that -5.28 dB there too?
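For what it's worth, the first part of that arithmetic checks out: a figure reported in dBFS is referred to the fundamental by subtracting the fundamental's own dBFS level. A minimal sketch using the numbers from the screenshot:

```python
# Refer a dBFS noise+distortion figure to the fundamental level.
fundamental_dbfs = -5.28     # RTA fundamental reading
nd_a_wt_dbfs = -124.5        # N+D (A-wt), reported in dBFS

nd_rel = nd_a_wt_dbfs - fundamental_dbfs
print(round(nd_rel, 2))      # prints -119.22 (dB relative to the fundamental)
```

This subtraction only applies to figures stated in dBFS; a figure already stated in dB (relative) would not be adjusted again.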
 

John Mulcahy

If the RTA has been calibrated for the ADC levels the manual fundamental should be the 1.956 V figure.
 