Latency with ASIO drivers

twelti76

New Member
Thread Starter
Joined
May 30, 2019
Posts
38
Wondering if REW would get correct phase values (latency) without using a loopthrough, just the ASIO drivers? I did some experimenting and noticed, for example, that the delay to the IR peak did not change in a loopthrough measurement when I changed the soundcard (RME) buffer size. So it seems that the ASIO driver knows about and accounts for the buffer size. Can ASIO also account for OS latencies and ADC/DAC latencies? If so, we would only need a physical loopthrough for non-ASIO sound interfaces (?)

I have a set of measurements which were made using ASIO/RME but without loopback turned on. I exported them using the suggested method: go to IR Windows, set the Reference to 0 time, apply to all measurements, then export. I later made a loopthrough measurement and had thought to maybe use that to correct the original measurements, almost as if I had used the loopthrough in the first place. Now I'm not sure the original measurements even need correcting.
 
It is REW that accounts for the latencies. The "Undo t=0 changes" action resets t=0 to the as-measured state.
 
So, if I make a measurement using the ASIO driver, the default is that REW puts t=0 at the IR peak, but if I undo that, do I get the true delay of the system under test? Or, if I set t=0 in IR Windows and export, do I also get the true system-under-test delays?
 
No. There are no absolute time markers associated with any of the audio data going out or coming in, it is all just a stream of samples. Without an acoustic or loopback timing reference connection there is no way to attach a reference time to the incoming data.
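The role the loopback reference plays can be sketched in Python (a simplified illustration with synthetic noise standing in for REW's sweep; the function and variable names here are made up, not REW internals). The driver/converter latency appears identically on both channels, so subtracting the loopback channel's delay from the measurement channel's delay isolates the delay of the system under test:

```python
import numpy as np

rng = np.random.default_rng(0)

def find_delay_samples(stimulus, recording):
    """Estimate the delay of `recording` relative to `stimulus` in samples,
    using the peak of the full cross-correlation."""
    corr = np.correlate(recording, stimulus, mode="full")
    return int(np.argmax(np.abs(corr))) - (len(stimulus) - 1)

stimulus = rng.standard_normal(2048)       # stand-in for a measurement sweep
loop_delay, system_delay = 150, 400        # hypothetical driver and DUT delays

# The loopback channel sees only the shared driver/converter latency;
# the measurement channel sees that latency plus the system under test.
loopback = np.concatenate([np.zeros(loop_delay), stimulus])
measurement = np.concatenate([np.zeros(loop_delay + system_delay), stimulus])

true_delay = (find_delay_samples(stimulus, measurement)
              - find_delay_samples(stimulus, loopback))
print(true_delay)  # 400: the shared latency cancels out
```

Without the loopback channel, only the combined delay (550 samples here) is observable, and there is no way to know how much of it belongs to the system under test.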
 
I guess I'm wondering if it is possible to measure the loopthrough later and apply the measured delay to correct a previously made measurement, made without loopthrough? Same soundcard, same sample rate, same buffer size (maybe).
 
Interesting: I tried doing a loopback measurement of a loopback. I used ch 5 for the loopback and ch 6 for the measurement. Seems redundant, maybe, but I actually got an acausal IR:
[attached screenshots: impulse response plots]
 
I saw that, but I'm not sure where it came from. I simply turned on the loopback, set the correct channels to make a loopback measurement of a loopback, and measured. I assume the timing offset was whatever it came up with for the "loopback" and was trying to correct for it. Was it possibly something left over from a previous measurement?
 
It's the timing offset figure from the Measure dialog. It stays as you set it until you change it.
 
Hmmm, I don't remember having ever changed it in the first place. I was playing around with some of the options to set or reset the 0 time reference. Maybe it happened then?
 
It's either set manually or by using the Estimate IR delay option to move IR and update the timing offset.
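A simplified sketch of what an IR-delay estimate amounts to (illustrative only; REW's actual implementation may differ): locate the IR peak and treat its sample index as the delay, then shift the response so t=0 lands at the peak.

```python
import numpy as np

def estimate_ir_delay(ir):
    """Return the index of the IR's largest-magnitude sample,
    i.e. a simple delay estimate in samples."""
    return int(np.argmax(np.abs(ir)))

def shift_t0_to_peak(ir):
    """Roll the IR so its peak lands at sample 0 (t = 0)."""
    return np.roll(ir, -estimate_ir_delay(ir))

ir = np.zeros(1024)
ir[300] = 1.0                      # idealized impulse arriving 300 samples late
shifted = shift_t0_to_peak(ir)
print(estimate_ir_delay(ir))       # 300
print(int(np.argmax(np.abs(shifted))))  # 0
```

Updating the timing offset with this estimate, as opposed to just moving t=0, keeps the correction attached to the measurement settings, which is the extra flexibility discussed below.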
 
OK, I did use that at some point a couple of days ago, so maybe it got set then. But for a normal loopback measurement, I just set it to loopback mode, set the correct loopback channel and measure, with the "timing offset" set to 0, then?

Seems like there would be some similarities between estimating the delay and setting the "timing offset", or just setting t=0 at the IR start or IR peak, then? I guess using the "timing offset" would allow more flexibility.
 
OK, so now most of that delay is gone, but it is still a couple of samples acausal (?). The delay is small, at around 10 µs, possibly 2 samples at 192 kHz, if we assume that t=0 should be just before the start of the IR.
[attached screenshot: impulse response after correction]
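The samples-to-microseconds arithmetic above checks out; a quick sketch of the conversion:

```python
fs = 192_000                       # sample rate in Hz
sample_period_us = 1e6 / fs        # one sample period in microseconds
delay_samples = 2
delay_us = delay_samples * sample_period_us
print(round(delay_us, 2))          # 10.42: about 10 µs, as observed
```

So a residual of roughly 10 µs at 192 kHz does indeed correspond to about 2 samples.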
 