Hello,
You mention that you send a string to the microprocessor that tells it
how many bytes to send. Instead of requesting 512 bytes, try reading
10 times and only requesting about 50 bytes at a time.
If that doesn't help, try communicating with your
microprocessor directly through HyperTerminal.
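The chunking suggestion above can be sketched in plain Python. This is illustrative only: `read_fn` stands in for whatever serial read call you use (a LabVIEW serial read VI in your case), and `fake_read` simulates the device so the sketch runs without hardware.

```python
def read_in_chunks(read_fn, total, chunk_size=50):
    """Request `total` bytes in small chunks instead of one large read.

    `read_fn(n)` is a stand-in for the serial read: any function that
    returns up to n bytes.
    """
    data = bytearray()
    while len(data) < total:
        want = min(chunk_size, total - len(data))
        chunk = read_fn(want)
        if not chunk:  # nothing arrived: bail out instead of looping forever
            break
        data.extend(chunk)
    return bytes(data)

# Simulated device buffer standing in for the microprocessor's EEPROM dump
buffer = bytes(range(256)) * 2  # 512 bytes
pos = 0

def fake_read(n):
    global pos
    chunk = buffer[pos:pos + n]
    pos += len(chunk)
    return chunk

payload = read_in_chunks(fake_read, 512)
```

With a 50-byte chunk size, the 512-byte transfer becomes eleven small requests, which keeps each individual read well under any per-packet limit the firmware might have.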
I would check the instrument manual to see if there is a limit on the
maximum number of bytes it will send in one packet. Still, try
getting it to send 255 bytes and see if that works. If it does, then
try 256 bytes to see if the limit is there (a one-byte counter for the
number of bytes sent).
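The one-byte-counter idea is easy to demonstrate. This is a hypothetical illustration of why 255 would work while 256 fails, assuming the firmware stores the requested count in a single byte:

```python
import struct

def encode_count(n):
    """Pack a requested byte count into a one-byte field, as firmware
    with a single-byte length counter would have to."""
    return struct.pack("B", n)  # raises struct.error for n > 255

print(encode_count(255))  # fits: b'\xff'
print(256 & 0xFF)         # a C uint8_t would silently wrap 256 to 0
```

So a request for 256 bytes either gets rejected outright or wraps to a request for 0 bytes, which would look exactly like the hang described: the read waits for data that never comes.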
Hello,
Can you elaborate on what you mean by "the system is lost"?
Ben's suggestion of posting the code in question is a good one. It
would also be helpful to know which version of LabVIEW you are using
and which serial VIs you are using. His clarification of the timeout
for the serial read is also a good point.
Hi Deeply Annoyed,
Please post your code or an image of same.
We are pretty good about getting issues like this nailed down if we
have visual aids.
Otherwise we have to keep guessing.
Re: Timeout
I think Joe was talking about the "Timeout" for the serial read, not a
wait timer.
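The distinction matters: a serial read timeout bounds how long the read call itself can block, whereas a wait timer only pauses the loop between reads. A minimal sketch (simulated; `get_byte` stands in for polling the port, and a device that never responds is modeled by a function returning `None`):

```python
import time

def read_with_timeout(get_byte, n, timeout):
    """Return up to n bytes, giving up after `timeout` seconds.

    `get_byte()` is a stand-in for polling the serial port: it returns
    one byte value, or None if nothing has arrived yet.
    """
    deadline = time.monotonic() + timeout
    data = bytearray()
    while len(data) < n and time.monotonic() < deadline:
        b = get_byte()
        if b is not None:
            data.append(b)
    return bytes(data)

# Device that never responds: the call returns empty instead of hanging
result = read_with_timeout(lambda: None, 512, timeout=0.05)
print(len(result))  # 0 -- timed out, but the program is not stuck
```

With a timeout configured on the read itself, the 512-byte request that gets no reply returns control to your program instead of leaving the "read" spinning forever.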
Like I said,
I pull fifty-four bytes of data from the microprocessor's EEPROM using
the serial port. It works fine. I then send a request for 512 bytes and
my "read" goes into a loop condition: no bytes are delivered and the
system is lost.