Hi,
 
I've been using a waveform generator with known amplitude to calibrate the counts 
reading on the USRP/LFRX, using the usrp_cfile_rx.py script with a 
decimation of 32 (2 MHz sampling).
 
I performed the calibration at 0 dB and 10 dB gain on the USRP, for 
various frequencies from 10 to 150 kHz.  The ratio of measured voltages 
between the 10 dB and 0 dB settings was about 3.04 at all frequencies.
 
I know that a 10 dB gain setting should ideally give a voltage factor of about 3.2.  Does 
anyone know whether that figure holds only in the ideal case?  I'm guessing it does, 
and that my setup is working properly?
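For reference, the nominal voltage ratio for a gain step in dB follows from 10^(dB/20); a quick sketch comparing that to the measured 3.04 from above (the deviation in dB is my own addition, not part of the original measurement):

```python
import math

def db_to_voltage_ratio(gain_db: float) -> float:
    """Convert a power gain in dB to the corresponding voltage ratio."""
    return 10 ** (gain_db / 20.0)

expected = db_to_voltage_ratio(10.0)  # ideal voltage ratio for a 10 dB step, ~3.162
measured = 3.04                       # ratio observed between the 10 dB and 0 dB runs
deviation_db = 20 * math.log10(measured / expected)
print(f"expected {expected:.3f}, measured {measured:.2f}, "
      f"deviation {deviation_db:+.2f} dB")
```

So the measured 3.04 is roughly 0.3 dB below the ideal 3.16, which seems like a plausible tolerance for a real gain stage.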
 
Thanks,
Dan
 
_______________________________________________
Discuss-gnuradio mailing list
Discuss-gnuradio@gnu.org
http://lists.gnu.org/mailman/listinfo/discuss-gnuradio
