Hi Ken
As a design goal, it would be helpful to the OD process if the delay could be made consistent within about +/-5 microseconds -- i.e., 1500 meters (smaller is better). It is also important to quantify the standard deviation of that bias, since that is also factored into the OD process.
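(For scale, that equivalence is just range error = c x delay error: 3.0e8 m/s x 5e-6 s = 1500 m, so roughly 300 m of apparent range per microsecond of uncompensated delay.)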
I may be wrong, but I can't think of a reason why it wouldn't be well under this figure. I was thinking more in terms of hundreds, if not tens, of nanoseconds.
The SDX (a) takes a batch of samples, (b) does its processing, then (c) outputs the processed batch of samples.
Each of the three stages takes the same amount of time, and all three run in parallel, production-line fashion: DMA handles (a) and (c), and the DSP CPU handles (b). The stage time is dictated by the CPU's crystal oscillator.
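To make the production-line timing concrete, here is a rough C sketch of the idea as I understand it, using a rotating triple buffer; the names dma_rx, dma_tx and process_batch are hypothetical stand-ins, not the actual SDX firmware:

/* Sketch of the batch pipeline (hypothetical names): while the DSP
 * processes batch N, DMA is simultaneously capturing batch N+1 and
 * playing out batch N-1.  The three buffers swap roles once per batch
 * period, so the input-to-output delay is a constant whole number of
 * batch periods, set by the sample clock (the CPU crystal). */
#include <stdio.h>
#include <string.h>

#define BATCH_LEN 256

static short buf[3][BATCH_LEN];          /* rotating batch buffers        */

/* Stand-ins for the DMA engines and the DSP work; in real firmware
 * these would run concurrently, each completing once per batch period. */
static void dma_rx(short *dst) { memset(dst, 0, sizeof(short) * BATCH_LEN); }
static void dma_tx(const short *src) { (void)src; /* out to the DAC */ }
static void process_batch(short *s)
{
    for (int i = 0; i < BATCH_LEN; i++)
        s[i] = (short)(-s[i]);           /* placeholder DSP work          */
}

int main(void)
{
    int in = 0, work = 1, out = 2;       /* which buffer plays which role */

    for (int tick = 0; tick < 8; tick++) {
        dma_rx(buf[in]);                 /* (a) capture the next batch    */
        process_batch(buf[work]);        /* (b) process the previous one  */
        dma_tx(buf[out]);                /* (c) output the one before that*/

        /* Rotate roles; each batch exits exactly two ticks after entry. */
        int t = out; out = work; work = in; in = t;
    }
    printf("pipeline delay is a fixed number of batch periods\n");
    return 0;
}

The point of the sketch is that the digital path's delay is always the same whole number of batch periods, so any jitter has to come from somewhere other than this pipeline.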
I am guessing any variation would come predominantly from the analog circuitry (R's and C's changing value with temperature?), but Bob may have other suggestions I haven't thought of.
73, Howard G6LVB