Hi,
I'm trying to explain an odd (and reproducible) observation in a
dataset I collected. The setup is as follows:
- A USRP-2974 is externally clocked with a 10 MHz sine reference from an
SRS FS725 atomic frequency standard. The reference power is within the
recommended bounds.
- The same 10 MHz signal is fed into one of the RX ports and recorded
via the rx_samples_to_file utility (nsamps = 5,000,000,000;
rate = 50,000,000; freq = 0; channel = 0; ref = external; I added
commas for clarity here). A sketch of the invocation follows this list.
- In software I then decimate by a factor of 1,000,000 by keeping only
one of every million samples (see the second sketch after this list).
The new dataset is then essentially the 10 MHz sine sampled at 50 SPS,
with the sampling clock derived from that same signal. Since 10 MHz is
an exact multiple of 50 Hz, the tone aliases to DC in the decimated
record.
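
For reference, here is roughly the invocation (a sketch assuming the
stock UHD rx_samples_to_file example, its default short-integer sample
type, and a placeholder device address and file name):

    ./rx_samples_to_file --args "addr=192.168.40.2" --file usrp_samples.dat \
        --type short --nsamps 5000000000 --rate 50000000 \
        --freq 0 --channel 0 --ref external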
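
And a sketch of the decimation step (Python/NumPy for illustration; my
actual code may differ, and the interleaved int16 I/Q layout below
assumes the --type short recording above):

    import numpy as np

    # rx_samples_to_file with --type short writes interleaved 16-bit I/Q.
    raw = np.fromfile("usrp_samples.dat", dtype=np.int16)
    iq = raw[0::2].astype(np.float32) + 1j * raw[1::2].astype(np.float32)

    # Keep one of every million samples: 50 MS/s -> 50 SPS.
    # The 10 MHz tone aliases to DC in the decimated record.
    decimated = iq[::1000000]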
Now for my issue:
I don't expect the recorded IQ samples to necessarily be real, since I
assume there would need to be some calibration between the RF
front-end upconversion mixer and the downstream IQ mixers. However,
given the presumptive phase and frequency determinism among the various
mixers, I do expect a constant sample-to-sample phase for the duration
of the sample collection window (i.e. there should be no phase drift
among successive samples). Attached is an image of the 50 SPS dataset.
The magnitude remains constant (as expected), but the phase varies very
slowly, on the order of one cycle per minute. Could someone explain the
origin of this phase drift? From the block diagram of the 2974 it
appears that the mixers are driven by two PLLs that share the same
reference frequency input, so I find this effect puzzling.
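
In case it helps, here is how I quantify the drift rate (a sketch
continuing from the decimation snippet above; it fits the slope of the
unwrapped phase):

    # Slope of the unwrapped phase vs. time gives the drift rate.
    phase = np.unwrap(np.angle(decimated))
    t = np.arange(len(decimated)) / 50.0        # seconds, at 50 SPS
    slope = np.polyfit(t, phase, 1)[0]          # rad/s
    print(f"drift: {slope * 60 / (2 * np.pi):.3f} cycles/min")

One cycle per minute corresponds to roughly a 17 mHz offset at 10 MHz,
i.e. a fractional frequency difference of about 1.7e-9.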
Thanks,
Dominic