Lai, Ann wrote:
> Hi,
>
> I get this message sometimes when I run my program:  System is low on
> virtual memory....  My program has two loops, and I use shift
> registers on these two loops.  I have a loop for data array and a
> loop for graphing the data array.  My data when put into excel is
> more than 1500 rows.  Is it giving me an error because the data arrays
> are big, so it runs low on virtual memory?  Is there a way that I
> can dump this memory to a physical memory space, then read the
> memory back and then write to Excel?  Right now, the program keeps
> concatenating the array, then puts all that data into Excel at the end.

I believe that the "concatenate array" function makes a new, slightly
larger copy of the original array, adds your new data on to the end, then
deletes the original. It's a little smarter than that in practice, in that it
allocates more space than you immediately need to reduce how often it has to
reallocate, but it's still slow and memory-hungry, and it may well be one of
your problems. Initialise the array to its full size at the start and replace
the null entries with your data as you go. You could also use a more elegant
design that doesn't buffer all the data before starting to write to Excel,
for instance a queue feeding a separate Excel-writer loop, if the extra
overhead is acceptable.
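Since LabVIEW is graphical, here's a rough textual sketch (in Python) of the two ideas above: preallocating instead of repeatedly concatenating, and a queue feeding a separate writer. The function and variable names are illustrative only, and `out.append` stands in for whatever actually writes a row to Excel.

```python
import queue
import threading

# Repeated concatenation: each step builds a new, slightly larger copy
# of the whole buffer, so accumulating N rows costs O(N^2) copying.
def build_by_concatenation(rows):
    data = []
    for row in rows:
        data = data + [row]   # fresh copy every time, like "concatenate array"
    return data

# Preallocation: initialise the full-size buffer once at the start,
# then replace the null entries in place as the data arrives.
def build_by_preallocation(rows, n):
    data = [None] * n
    for i, row in enumerate(rows):
        data[i] = row
    return data

# Queue to a separate writer: the acquisition loop just enqueues rows,
# and a second loop dequeues and writes them out incrementally, so the
# whole data set is never buffered in memory at once.
def acquisition_loop(q, rows):
    for row in rows:
        q.put(row)
    q.put(None)               # sentinel: no more data coming

def writer_loop(q, out):
    while True:
        row = q.get()
        if row is None:
            break
        out.append(row)       # stand-in for writing one row to Excel
```

The same split is the classic LabVIEW producer/consumer pattern: two parallel loops with a queue between them, so the slow writer never stalls the acquisition loop.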

Another problem may be that Excel isn't very good at handling data sets this
size. There are better packages for data manipulation: Origin, for example,
or DIAdem if you want to stay with NI.

-- 
Dr. Craig Graham, Software Engineer
Advanced Analysis and Integration Limited, UK. http://www.aail.co.uk/




