But what's the "real" value? There is some uncertainty in reading it
back with an analog input channel. The floating-point number used to
program the device is converted to a 12-, 14-, or 16-bit number
(whatever the resolution of the DAQ card is) and then output, so the
output may not be exactly what it was set to. Reading it back through
the analog input involves another conversion, so is that reading any
more accurate? If the board is correctly calibrated, I assume the
output voltage is what I set it to, plus or minus the resolution of
the D/A converter.
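To put a number on that quantization uncertainty, here is a minimal sketch. It assumes a hypothetical bipolar ±10 V output range (your card's range may differ): one code step (LSB) is span / 2^bits, and an ideal, calibrated DAC lands within half an LSB of the requested voltage.

```python
def lsb(span_volts, bits):
    """Size of one DAC code step (LSB) in volts for the given range span."""
    return span_volts / (2 ** bits)

def quantize(setpoint, v_min, v_max, bits):
    """Nearest DAC code for a setpoint, and the voltage that code actually produces."""
    step = lsb(v_max - v_min, bits)
    code = round((setpoint - v_min) / step)
    code = max(0, min(code, 2 ** bits - 1))  # clamp to the valid code range
    return code, v_min + code * step

# Example: requesting 1.234 V on an assumed +/-10 V range at various resolutions.
for bits in (12, 14, 16):
    code, actual = quantize(1.234, -10.0, 10.0, bits)
    print(f"{bits}-bit: code={code}, actual={actual:.6f} V, "
          f"error={abs(actual - 1.234) * 1e3:.3f} mV (<= {lsb(20.0, bits) / 2 * 1e3:.3f} mV)")
```

So at 12 bits over ±10 V one LSB is about 4.9 mV; at 16 bits it drops to about 0.3 mV. Reading the value back adds the ADC's own quantization and noise on top of this, which is why the read-back isn't inherently more accurate.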
