Hello,

I'm implementing a TDMA mode that relies on stream tags for timekeeping and 
other purposes.
I've noticed some strange things about how tags are propagated by resampling 
blocks.

For example, the interpolating FIR filter requires declaring a sample delay 
equal to the filter delay divided by the interpolation rate. This has been 
discussed here before.
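For concreteness, the value I'm declaring is computed like this (a plain-Python 
sketch of the arithmetic only; the function name is mine, not GNU Radio API):

```python
def declared_delay(ntaps: int, interpolation: int) -> int:
    """Sample delay to declare on an interpolating FIR filter:
    the group delay of a symmetric (linear-phase) FIR, (ntaps - 1) / 2,
    divided by the interpolation rate. Illustrative helper only."""
    filter_delay = (ntaps - 1) // 2       # group delay of a symmetric FIR
    return filter_delay // interpolation  # scaled down by the interp rate

# e.g. a 129-tap filter interpolating by 4:
print(declared_delay(129, 4))  # 16
```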

The strange thing is how the rational resampler handles tags. By default, 
without calling declare_sample_delay(), tags appear downstream *delayed* 
relative to the samples by what seems to be exactly the filter delay. Since 
the samples themselves are already delayed by the filter, the tags end up 
delayed by twice that value relative to the bursts, i.e. too late. Calling 
declare_sample_delay() makes the problem worse: the tags fall even further 
behind the sample group.
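My mental model of the propagation, which is almost certainly where I'm going 
wrong, is something like the following plain-Python sketch (illustrative only, 
not the actual scheduler code: scale the tag's absolute offset by the relative 
rate, shifted back by any declared sample delay):

```python
from fractions import Fraction

def propagated_offset(in_offset: int, interp: int, decim: int,
                      declared_delay: int = 0) -> int:
    """How I *expect* a rate-changing block to map a tag's absolute
    offset downstream: subtract the declared sample delay, then scale
    by the relative rate interp/decim. Assumption, not GNU Radio code."""
    rate = Fraction(interp, decim)
    return int((in_offset - declared_delay) * rate)

# A tag at input sample 1000 through a 3/2 resampler, no declared delay:
print(propagated_offset(1000, 3, 2))      # 1500
# Declaring a delay should shift the tag *earlier*, not later:
print(propagated_offset(1000, 3, 2, 32))  # 1452
```

What I observe downstream doesn't match this model, which is why I suspect the 
delay is being applied in the wrong direction or applied twice.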

I'm trying to understand what is happening and how to stop the rational 
resampler from delaying tags by too large a value, if anybody can provide 
any insight. I'd also be interested to know whether the polyphase channelizer 
and synthesizer blocks behave the same way or follow some other tag policy.

Thanks,
Adrian
