On 26/03/2025 06:13, je.amg...@gmail.com wrote:

I'm using timed commands to set the RX gain at a precise moment with the following command:

set_command_time(md.time_spec + uhd::time_spec_t(0.02), 0);

However, I noticed that there is a delay between the specified time and the actual time at which the gain is applied. This delay is significantly larger than the latency of the component responsible for changing the gain, and it appears to depend on the sampling rate: it is approximately 20 samples.

I’m trying to understand why this delay is much greater than the expected component latency and why it scales with the sampling frequency. Any insights on this behavior?
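For context, here is a minimal sketch of the surrounding timed-command sequence; the handle name (usrp), the gain value, and channel/mboard 0 are assumptions for illustration, not the exact code I am running:

#include <uhd/usrp/multi_usrp.hpp>
#include <uhd/types/metadata.hpp>

// Schedule a gain change 20 ms after the timestamp of the last received
// packet instead of applying it immediately.
void timed_gain_change(uhd::usrp::multi_usrp::sptr usrp,
                       const uhd::rx_metadata_t &md)
{
    usrp->set_command_time(md.time_spec + uhd::time_spec_t(0.02), 0);
    usrp->set_rx_gain(30.0, 0);   // executed at the scheduled time
    usrp->clear_command_time(0);  // return to immediate execution
}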

Regards.
Jamaleddine


A change in the signal presented to the head of the DDC chain will take some number of sample periods to propagate through the finite-length filters in the DDC. They don't (and, indeed, cannot) have zero group delay.
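To put rough numbers on it: a linear-phase FIR with N taps has a group delay of (N - 1)/2 samples at its input rate, i.e. (N - 1)/(2 * f_s) seconds. A back-of-the-envelope sketch (the tap count and sample rate below are purely illustrative, not the actual DDC coefficients):

#include <cstdio>

int main()
{
    const double sample_rate = 1e6; // example rate in Hz, not a DDC value
    const int    num_taps    = 41;  // hypothetical symmetric FIR length

    // A linear-phase FIR delays the signal by (N - 1) / 2 samples.
    const double delay_samples = (num_taps - 1) / 2.0;
    const double delay_seconds = delay_samples / sample_rate;

    std::printf("group delay: %.1f samples = %.1f us at %.1f Msps\n",
                delay_samples, delay_seconds * 1e6, sample_rate / 1e6);
    return 0;
}

Since that delay is a fixed number of samples of the filter's input rate, the equivalent delay in seconds shrinks as the sample rate goes up, which is consistent with a delay that tracks the sampling frequency.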

