On Fri, Sep 3, 2010 at 5:07 PM, Suleja, Lukasz <lukasz.sul...@roke.co.uk> wrote:
> I am using the following code snippet to view a 100kHz sinewave on a virtual
> oscilloscope:
>
> self.interface_rate = 1e9
> self.src = gr.sig_source_c(self.interface_rate, gr.GR_SIN_WAVE, 1e3, 1)
> self.scope = scopesink2.scope_sink_c(panel, sample_rate=self.interface_rate)
> self.connect(self.src, self.scope)
>
> As I increase the frequency of the sine wave to 100kHz, the resolution of
> the plotted points deteriorates.
>
> With a sampling rate of 1MHz and a sine wave frequency of 100kHz, I would
> expect the time interval between plotted points to always be 1us. However,
> if I reduce the time-base and select the "Dot Large" marker, I can clearly
> see that the spacing between points is not the expected 1us.
>
> Could someone please shed some light on why there appears to be a
> discrepancy, or are my expectations simply incorrect?

You use a sample rate of 1e9 = 1 Gsps - I think that's well above what
is possible on a standard PC. Maybe that's just a typo?
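For the spacing the poster expects, the source and the scope sink must share the same sample rate, matching the 1 MHz / 100 kHz figures described in the text (not the 1e9 / 1e3 values in the snippet). A minimal sketch of the arithmetic in plain Python (no GNU Radio needed; the names `samp_rate` and `freq` are just illustrative):

```python
# Assumed values from the poster's description, not from the code snippet.
samp_rate = 1e6   # shared sample rate: 1 MHz, used by both source and scope
freq = 100e3      # sine wave frequency: 100 kHz

# With matching rates, consecutive plotted points are 1/samp_rate apart.
sample_spacing_us = 1e6 / samp_rate   # microseconds between plotted points
samples_per_period = samp_rate / freq # how many points trace one sine cycle

print(sample_spacing_us)    # 1.0 us between points, as expected
print(samples_per_period)   # 10 points per 100 kHz period
```

If the scope sink is constructed with a `sample_rate` that differs from the rate the source actually produces, the time axis is scaled wrongly and the apparent spacing will not be 1 us, which would explain the observed discrepancy.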

Alex

_______________________________________________
Discuss-gnuradio mailing list
Discuss-gnuradio@gnu.org
http://lists.gnu.org/mailman/listinfo/discuss-gnuradio