Hi Matis,

the time stamp of the first buffer after a disruption should indeed be
accurate and describe the time of the first sample in that packet.
So, yes, you can use it to work out how many samples you have lost.
That said, may I ask what led you to conclude that it is not the time
stamp of the first sample?
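
In case it is useful, here is a rough, untested sketch of that idea
(the empty device arguments, the fc32 format, and the minimal error
handling are just assumptions for illustration): keep track of where
the next packet should start, and when a gap appears convert the jump
in time_spec back into a sample count using the sample rate.

#include <uhd/usrp/multi_usrp.hpp>
#include <cmath>
#include <complex>
#include <iostream>
#include <vector>

int main()
{
    // Placeholder setup: empty device args and fc32 samples are assumptions.
    auto usrp = uhd::usrp::multi_usrp::make("");
    usrp->set_rx_rate(362319);
    const double rate = usrp->get_rx_rate();

    uhd::stream_args_t stream_args("fc32");
    uhd::rx_streamer::sptr rx_stream = usrp->get_rx_stream(stream_args);

    uhd::stream_cmd_t cmd(uhd::stream_cmd_t::STREAM_MODE_START_CONTINUOUS);
    cmd.stream_now = true;
    rx_stream->issue_stream_cmd(cmd);

    std::vector<std::complex<float>> buff(16384);
    uhd::rx_metadata_t md;
    uhd::time_spec_t expected;   // time at which the next packet *should* start
    bool have_expected = false;

    while (true) {
        const size_t n = rx_stream->recv(&buff.front(), buff.size(), md, 1.0);

        if (md.error_code == uhd::rx_metadata_t::ERROR_CODE_OVERFLOW) {
            // The next buffer carries a fresh, accurate time_spec,
            // so just keep receiving and measure the gap there.
            continue;
        }
        if (md.error_code != uhd::rx_metadata_t::ERROR_CODE_NONE)
            break;

        if (have_expected && md.has_time_spec) {
            const double gap = (md.time_spec - expected).get_real_secs();
            const long long lost = std::llround(gap * rate);
            if (lost > 0)
                std::cout << "approximately " << lost << " samples lost" << std::endl;
        }

        // Timestamp of the sample that should follow this buffer.
        expected = md.time_spec + uhd::time_spec_t::from_ticks(n, rate);
        have_expected = md.has_time_spec;
    }
    return 0;
}

The gap-to-ticks conversion is rounded, but for a plain overflow it
should land on the exact number of missing samples.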

Best regards,
Marcus

On Sat, 2018-05-05 at 01:34 +0200, Matis Alun via USRP-users wrote:
> Hi,
> 
> I noticed that the "time_spec" returned by the "recv" method of an
> rx_streamer does not relate exactly to the length of the received
> vectors. For example, when receiving buffers of 16384 samples at
> Fs = 362319 Hz, each buffer should span (16384+1)/362319 = 0.0452225
> seconds, yet the time_spec difference between two successive calls
> to recv is 0.0483315 s and sometimes 0.033402 s (with a mean value
> of 0.0452225 s).
> 
> My first idea was that the time_spec returned by the recv method is
> the time stamp of the first sample in the buffer, but that does not
> seem to be the case. Is that correct?
> 
> As a consequence, how can I know the number of lost samples when an
> overflow occurs?
> 
> Thanks,
> 
> Matis
> 

_______________________________________________
USRP-users mailing list
USRP-users@lists.ettus.com
http://lists.ettus.com/mailman/listinfo/usrp-users_lists.ettus.com
