I'm using timed commands to set the RX gain at a precise moment with the 
following command:

set_command_time(md.time_spec + uhd::time_spec_t(0.02), 0);
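
For context, the surrounding sequence looks roughly like this (usrp is my 
multi_usrp handle; the 30 dB gain value is just a placeholder):

// Queue the gain change for 20 ms after the timestamp of the last received buffer.
const uhd::time_spec_t cmd_time = md.time_spec + uhd::time_spec_t(0.02);
usrp->set_command_time(cmd_time, 0);
usrp->set_rx_gain(30.0, 0);   // should take effect at cmd_time
usrp->clear_command_time(0);  // later commands execute immediately again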

However, I noticed that there is a delay between the specified time and the 
actual time at which the gain is applied. This delay is significantly larger 
than the latency of the gain-change component itself, and it appears to depend 
on the sampling frequency: it amounts to approximately 20 samples.
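
In terms of samples, the offset corresponds to the difference between the index 
where the gain step actually appears in the received buffer and the index that 
matches the commanded time. Sketched below with illustrative names only 
(observed_idx is where the amplitude step shows up):

const double rate = usrp->get_rx_rate(0);
// Sample index, relative to this buffer, at which the gain change was scheduled.
const size_t expected_idx = static_cast<size_t>(
    (cmd_time - md.time_spec).get_real_secs() * rate);
// The offset is then observed_idx - expected_idx, roughly 20 samples here.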

I’m trying to understand why this delay is much greater than the expected 
component latency and why it scales with the sampling frequency. Any insights 
on this behavior?

Regards,
Jamaleddine