On 09/03/2023 10:14, Wolfgang Wilde wrote:
In fact, in the real world and with real measurement devices, the units
are not related to "instantaneous voltages", let alone to "voltages
seen at the antenna". Different forms of antenna deliver different
voltages. Is it a Yagi antenna? Or is the active element a closed
loop? The two differ completely in "antenna voltage", and both
transform the "antenna signal" via impedance transformation onto, for
example, a 75 Ohm cable for consumer products or a 50 Ohm cable when
you have a TRX system. Real RF _measurements_ are all related to
RF power. Even the often-used dBµV only corresponds to a definite
power at a given impedance, so it carries valid information only when
you also state the impedance of your RF system. This means: a signal
of 40 dBµV could be either enough for FM reception or too little,
depending on which impedance (50 Ohms, 75 Ohms, 240 Ohms?) you are
referring to!
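For illustration, a tiny Python calculation of that point. It assumes an
ideal, perfectly matched measurement, and the 40 dBµV level is just the
example figure from above:

import math

def dbuv_to_dbm(level_dbuv, impedance_ohms):
    # dBµV is a voltage ratio (dB re 1 µV), so the corresponding power
    # depends on the impedance: P = V^2 / R, then referenced to 1 mW.
    v_rms = 1e-6 * 10 ** (level_dbuv / 20.0)
    p_watts = v_rms ** 2 / impedance_ohms
    return 10 * math.log10(p_watts / 1e-3)

for z in (50, 75):
    print(f"40 dBµV into {z} Ohms = {dbuv_to_dbm(40, z):.1f} dBm")
# -> about -67.0 dBm at 50 Ohms, -68.8 dBm at 75 Ohms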
Nevertheless, there are no SDR sticks out there with stated sensitivity
numbers. Even the more expensive devices like the HackRF and all the
Ettus Research devices are not "measuring" any RF field strength. All
you get is, in effect, a numeric factor describing how well the SDR can
detect a signal. The sensitivity varies widely over the frequency
ranges of these devices, and it is by no means directly proportional
to RF power, or even to a voltage at the antenna port or at your
antenna. You may, to some degree, use SDR sticks or devices like the
HackRF, USRPs and so on for qualitative information about a radio
signal, but they won't tell you anything about the real power of a
signal, in contrast to proper RF measurement systems!

With all my devices (ranging from RTL2832-based sticks to the HackRF
and others) you get jumps in signal strength when you do a full-band
power scan. This happens, for example, when the SDR hardware switches
to another oscillator setting. All of my devices can do a full-band
scan of at least 1 GHz or more, but none of them can do it by just
stepping up the synthesizer PLL without, for example, using harmonics
of the base PLL frequency to mix down to the IF band or baseband. Each
time the hardware switches to another harmonic you get a jump in
signal strength, and with each different PLL frequency you may get
different mixing products, ghosting from other frequencies, and so on.
So the kind of SDR we are talking about here cannot give any numeric
factor tied to RF power or RF voltage. It is just a numeric value,
somehow more or less proportional to how well the SDR can receive the
signal. No units, no absolute values at all.
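To illustrate the jumps I mentioned above: a rough sketch of the kind of
per-segment alignment one ends up doing when stitching such a scan. It
assumes adjacent tuning segments overlap by a fixed number of FFT bins;
the data layout (list of per-segment dB arrays) is just an example, not
any particular tool's output format:

import numpy as np

def stitch_segments(segments_db, overlap_bins):
    # Offset each segment so its overlap region matches the previous one,
    # so gain steps at re-tune/harmonic boundaries don't appear as jumps.
    aligned = [np.asarray(segments_db[0], dtype=float)]
    for seg in segments_db[1:]:
        seg = np.asarray(seg, dtype=float)
        # Median difference over the shared bins gives the segment offset.
        offset = np.median(aligned[-1][-overlap_bins:] - seg[:overlap_bins])
        aligned.append(seg + offset)
    # Drop the duplicated overlap bins when joining.
    return np.concatenate([aligned[0]] + [a[overlap_bins:] for a in aligned[1:]])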
Regards
I'll repeat this again, because you clearly didn't "get" it. The
numbers coming out of the signal-processing chain are
*linearly proportional* to the instantaneous (as in at the moment the
sample was taken by the ADC) voltage as seen
at the antenna input to the receiver. This says *NOTHING* about the
antenna itself, nor about whatever cabling and other
bits-and-pieces sit between the receiver input port and the antenna.
YES, the calibration constants will, absolutely, vary over the
tuning/bandwidth/sample-rate capabilities of the receiver.
But if the samples *ARE NOT* linearly proportional (or mostly so) to the
instantaneous antenna-*PORT* input voltage, then we
might as well all go home.
The samples that arrive into your flow-graph aren't some hand-wavy
random thing that kinda-sorta represents the
real world. They are very-definitely a *linear proxy* for the
instantaneous voltage as seen at the antenna input port
at the time that it was sampled. Again, if they aren't very close to
this, then we might all just as well go home and
take up basket-weaving.
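To put that claim in concrete terms, a minimal sketch (not any particular
driver's API): a single, hypothetical calibration constant
k_volts_per_count scales the raw complex samples back to volts at the
port, and from there to power:

import numpy as np

def port_power_dbm(samples, k_volts_per_count, impedance_ohms=50.0):
    # samples: complex baseband samples from the flow-graph.
    # k_volts_per_count: hypothetical calibration constant, valid only for
    # one frequency/gain/sample-rate setting; any fixed envelope-vs-RMS
    # convention can be folded into it as well.
    v = k_volts_per_count * np.asarray(samples)
    p_watts = np.mean(np.abs(v) ** 2) / impedance_ohms
    return 10.0 * np.log10(p_watts / 1e-3)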
What IS true about actual laboratory measurement instrumentation is that
such instrumentation is *calibrated* over
its operating range (in as many steps as seems reasonable) to
produce results that are directly-tied to physical units.
You can do EXACTLY THE SAME THING with even a cheap receiver like the
RTL-SDR, HackRF, USRPs, LimeSDRs, etc, etc.
In fact, USRPs (some of them) now have a *CALIBRATION INTERFACE AND
API* that allows you to use them in the
same way as you can use a laboratory instrument.
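A minimal sketch of what such a calibration amounts to (generic, not the
USRP calibration API itself), assuming you step a signal generator of
known level across the band and record the raw reading at each point; the
calibration points below are invented placeholders:

import numpy as np

cal_freqs_hz = np.array([100e6, 400e6, 700e6, 1000e6])   # where we measured
cal_raw_db   = np.array([-42.0, -45.5, -44.0, -48.5])    # raw reading there
ref_level_dbm = -60.0                                     # generator setting

offsets_db = ref_level_dbm - cal_raw_db                   # raw + offset = dBm

def raw_to_dbm(raw_db, freq_hz):
    # Convert an uncalibrated dB reading to dBm using the offset
    # interpolated in frequency; valid only for the gain setting used
    # during the calibration run.
    return raw_db + np.interp(freq_hz, cal_freqs_hz, offsets_db)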
In MANY actual cases, you'd like your radio to be calibrated over some
much-smaller range of its operating parameters --
you'll be using it for perhaps a single application, where
understanding what is appearing at the antenna input ports
in terms of power, or (by a bit of simple math) voltage, may be important.
In MY case, I calibrate in degrees-K of noise power, because that's
relevant to my usage of these types of radios for
radio astronomy.
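For anyone curious, the arithmetic behind that kind of kelvin calibration
is the usual hot/cold (Y-factor) measurement; a minimal sketch, with
example load temperatures:

def y_factor_tsys(p_hot, p_cold, t_hot_k=295.0, t_cold_k=77.0):
    # System noise temperature from power readings (linear units) taken
    # on loads of known temperature:
    #   Y = Phot / Pcold,  Tsys = (Thot - Y*Tcold) / (Y - 1)
    y = p_hot / p_cold
    return (t_hot_k - y * t_cold_k) / (y - 1.0)

def reading_to_kelvin(p_meas, p_cold, t_sys_k, t_cold_k=77.0):
    # A later reading at the same gain, referred to the receiver input,
    # using the cold-load reading as the anchor point.
    return (t_sys_k + t_cold_k) * p_meas / p_cold - t_sys_k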
Your post makes it seem like SDRs are delivering samples that bear only
the weakest relationship to the physical world,
and that just isn't true. IT CAN'T BE.