SPL V/V graph help

luigi57 (Thread Starter)
I'm using REW 5.31.3 on Windows 11 to measure the frequency response of a preamp. On the SPL graph I read a gain of 7 dB (V/V), but the difference in level between the reference input and the input is more than 20 dB: why does the graph show this 13 dB discrepancy?
It seems that the graph uses the generator output signal, rather than the reference input measurement, to calculate the gain.
Is there something wrong in my REW setup?
Thank you for your help.
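(To put numbers on it: on the dB scale, gain = 20·log10(Vout/Vin), so the 7 dB shown on the graph would correspond to a voltage ratio of 10^(7/20) ≈ 2.24, while the 20 dB level difference I see corresponds to a ratio of 10.)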
 
Did you calibrate the full scale voltages for the generator and RTA?
Hi John,

I calibrated the generator but I hadn't calibrated the RTA; after the RTA calibration the gain at 1 kHz increased by a few dB. Now the graph shows 21.2 dB, but the difference between reference and headroom during the measurement is 27.7 dB, the same difference I read between In and Ref In in the Levels tool. I don't understand what causes this 6.5 dB difference; my suspicion is again that the graph value is calculated from the generator output signal setting rather than from the Ref In value (my sound card has an output impedance of 600 Ω, and since I'm measuring a preamp with a 600 Ω input impedance, about 6 dB is lost across the generator output impedance).
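(Working that out: with a 600 Ω source driving a 600 Ω load, Vload = Vopen × 600/(600 + 600) = Vopen/2, i.e. 20·log10(0.5) ≈ −6.0 dB lost across the generator output impedance.)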
 
The dBFS differences depend on the relative voltage scalings of the output and input.
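For example, a reading of −20 dBFS corresponds to V = VFS × 10^(−20/20): on an input with a 2 V full-scale voltage that is 0.2 V, while the same −20 dBFS on an output with a 1 V full scale is 0.1 V, so equal dBFS figures do not imply equal voltages (the 2 V and 1 V values are just illustrative, not your settings).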
I fully agree with you, but in this case why is the difference exactly the gain I'm measuring, while the graph shows a different value?
I checked the gain with a calibrated AC millivoltmeter and it's 27.7 dB.
 
Too little information to comment. What are the generator and RTA settings and the full scale input and output voltages? Which graph are you looking at, what frequency? Why would the level of the ref input have anything to do with the gain? If you are referring to the results of a sweep measurement, attach it.
 
I made a sweep from 0 Hz to 48 kHz to get the frequency response and checked the gain at 1 kHz.
Gain is defined as Vout/Vin, and since Vin is connected to the Ref input and Vout to the In input, I was expecting to get the result of that ratio in dB.
I'm not measuring distortion at the moment, so the RTA isn't used yet.
See the attached .mdat file for the measurement I'm talking about.
 

Attachments

Have you compared the voltage at the input to the device you are measuring with the voltage at the soundcard output? REW's gain is based on input signal level versus stimulus signal level, taking into account the full scale voltages for output and input.
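Put as a formula, that works out to roughly: gain (dB) = (input level in dBFS + 20·log10(input full-scale volts)) − (stimulus level in dBFS + 20·log10(output full-scale volts)).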
 
Also worth comparing the no-load output voltage at the chosen sweep level with the output voltage when loaded by the device, which presumably accounts for most of the difference between REW's figure and your expectation/loopback ref vs input levels.
 
Yes, I made this check: with a 30 mV output level set for the generator, the voltage at the input of the device I'm measuring is 14.4 mV (−6.37 dB), and this accounts for the difference I find.
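(Indeed, 20·log10(14.4/30) ≈ −6.4 dB, essentially the ideal −6.02 dB for a matched 600 Ω/600 Ω divider.)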
Why do you use the stimulus signal rather than the reference-input measurement of the sound card output? The sound card generator's output impedance is not taken into account in your approach when no power amplifier is used: I suppose REW was created to measure loudspeakers, where a power amplifier with near-zero output impedance is always used, so there is no issue.
You use the sound card's reference input only during impedance measurements: why not extend REW to use the reference input during ordinary measurements as well? It would improve accuracy for loudspeaker measurements too.
I was using REW the wrong way!
 
There is an option to use the reference input as a cal source. The V/V scaling is independent of any timing or cal reference, since those may not be used and the ref input, when used, may not be connected to the output but can also come from some intermediate point along the measurement path. The easiest solution for your case is to calibrate the generator full scale voltage when attached to the load.
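(Calibrating with the load connected means the full-scale output voltage is captured as FSloaded = FSunloaded × Zin/(Zin + Zout); with 600 Ω into 600 Ω that is half the unloaded value, so the roughly 6 dB divider loss is then already folded into the V/V scaling.)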
 
John,
your hint is good, but it can only be used if the input impedance of the device to be measured is constant over frequency; otherwise the calibration will not help.
 
John,
I have a question on impedance measurement: is it possible to measure impedance with the DUT in a series configuration instead of the parallel configuration (see attachment)?
I want to measure audio transformers that have a very high impedance (150 H), and a series configuration seems better suited to measuring high impedances than a parallel one. In this configuration a low-value resistor (e.g. 1 Ω) is used to sense the DUT current: using a two-channel measurement with differential inputs, the impedance is the ratio of the voltage across the DUT to the voltage across the current-sensing resistor, scaled by its value, Z = Rref × (Vz/Vr).
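To make the arithmetic concrete, here is a minimal sketch (not REW code, just how I picture the ratio), assuming two time-aligned capture buffers, ch_z across the DUT and ch_r across the sense resistor, and a 48 kHz sample rate:

    import numpy as np

    R_REF = 1.0    # assumed current-sense resistor value in ohms
    FS = 48000     # assumed sample rate in Hz

    def impedance(ch_z, ch_r, r_ref=R_REF, fs=FS):
        """Complex impedance from two simultaneously captured channels."""
        Vz = np.fft.rfft(ch_z)                        # voltage across the DUT
        Vr = np.fft.rfft(ch_r)                        # voltage across the sense resistor
        freqs = np.fft.rfftfreq(len(ch_z), d=1.0 / fs)
        Z = r_ref * Vz / Vr                           # Z = Rref * (Vz / Vr), bin by bin
        # Only bins where the stimulus actually has energy are meaningful.
        return freqs, np.abs(Z), np.angle(Z, deg=True)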
 

Attachments

  • Z serie.PNG
There's no functional difference between that and what REW does. REW measures the drop across the reference resistor, which is in series with the load.
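(In the usual single-ended, common-ground arrangement that would mean one channel sees the generator side of the reference resistor, V1, and the other the DUT side, V2: the drop across the resistor is V1 − V2, and Z = Rref × V2 / (V1 − V2).)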
 
OK, but how is the drop across the reference resistor measured? As the difference between the voltages on the two channels of the audio interface, or in some other way?
In my suggested configuration the drop across the reference resistor would be measured directly with a single channel, since it is differential, and the two measuring channels would not share a common ground, so the drop across the reference resistor could not be obtained as the difference between the two. I will try this configuration anyway and see whether the result is as expected and correct.
In my configuration no calibration should be required for the impedance measurement, since it is the ratio of two voltages.
Thank you.
 
Calibration compensates for input impedance, lead impedance and channel gain and response differences.
 
John,
I'm measuring the input impedance of an audio transformer and I noticed some strange behaviour: when I measure it over different frequency ranges, the impedance and phase values are not the same when I check them at the same frequencies. What could be the issue?
I used a 10 kΩ sensing resistor with a 1 Vrms generator signal for all the measurements.
What am I doing wrong?
Thank you.
 

Attachments

Transformers are typically not time invariant, their behaviour depends on the history of the signals that have passed through them. Different frequency range sweeps have a correspondingly different frequency vs time profile which may affect the core's behaviour and hence the measurement. I would speculate that is the cause of the differences you see.
 
True, but I don't think that is the situation here, because if I repeat the same measurement several times the result is always the same. I use a 4M sweep length with one repetition, and the measurement time is over 30 seconds. To check my hunch I can try reducing the length so that the different sweep ranges have more or less the same frequency-vs-time profile. I will let you know.
 
How can I set a log sweep for an impedance measurement?
When I make the measurements, the text in the overlapping panels on the left side says log sweep, but when I export the measurement all the frequencies have a fixed step.
 
The measurement sweeps are logarithmic. You can choose the export resolution (points per octave) on the export dialog.
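(With a points-per-octave export the frequencies are spaced by a constant factor, f(k+1) = f(k) × 2^(1/PPO); at 96 PPO, for instance, adjacent points differ by a factor of about 1.0072, i.e. a fixed step on a log axis rather than a fixed step in Hz.)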
 
Thank you for the hint, John.
Why have you limited Rsense to 10 kΩ? When measuring transformers with high inductance (in my case 160 H), the voltage drop across the sense resistor is only about 1% of the applied signal, which limits the resolution; allowing Rsense up to 1 MΩ, for example, would improve it. Rinput is also limited to 1 MΩ, so when a 10 MΩ probe is used it cannot be set correctly, even if its influence is small.
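(For example, at 1 kHz a 160 H winding presents |Z| ≈ 2π × 1000 × 160 ≈ 1.0 MΩ, so a 10 kΩ sense resistor in series with it carries only about 1% of the applied voltage.)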
 
Accuracy will be reduced at higher impedances, even with calibration, but I'll expand the accepted range of values for the next build.
 
Thank you!!
Sensitivity is maximum where the measured voltage is half of the generated voltage, so the best option is to use an Rsense between one tenth of and ten times the magnitude of the measured impedance.
I'm sure it will extend the measurement range and will be appreciated by the users.
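(With a simple divider Vz/Vgen = |Z|/(|Z| + Rsense), which equals 0.5 when Rsense = |Z|; that is the point where the measured ratio is most sensitive to changes in Z, hence the factor-of-ten rule of thumb on either side.)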
 

Attachments

  • Sensitivity.png