Roland Huettmann wrote:

> And a more generalized question also discussed before: What exactly
> happens when LC tries to read from a very, very large file? Maybe it
> is a gigabyte or even a terabyte file? It could just be too big to
> read, and it should then not return empty or become unresponsive,
> but return some error message.

The responsiveness has to do with the tight loop not surrendering enough time for Windows' current liking. He covered that well in his earlier post:
<http://lists.runrev.com/pipermail/use-livecode/2016-April/225896.html>

This is independent of the size of the file, and really independent of any file I/O operations at all. Other tasks in tight loops can also trigger Windows to consider an app "unresponsive" even though it's running.
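Until then, the usual workaround is to yield periodically inside the loop so the engine can service OS events. A minimal sketch (untested; the iteration count and yield interval are just illustrative):

   on doLongTask
      repeat with i = 1 to 1000000
         -- ...one slice of the work goes here...
         if i mod 1000 = 0 then
            -- yield so the engine can service OS events; note that
            -- "with messages" also lets other pending handlers run
            -- during the wait
            wait 0 milliseconds with messages
         end if
      end repeat
   end doLongTask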

Hopefully they'll be able to massage the event loop to handle the latest Win API expectations, which have apparently changed in recent versions.

As for large files, I've had very good experiences parsing files larger than 6 GB. Given that this is well beyond any internal memory-addressing limit, I'm assuming LC would work equally well on any size of file the local file system can handle.

This requires, of course, that we write our apps the way most apps that handle very large files are written: rather than reading the whole thing into RAM and expecting one giant memcopy, we read in chunks and process each chunk separately, as your script does.
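Something like this sketch, say with a 1 MB chunk size (the names are illustrative, and processChunk stands in for whatever per-chunk work your script does):

   on processBigFile pPath
      local tChunk, tAtEOF
      open file pPath for binary read
      repeat
         read from file pPath for 1048576        -- 1 MB per pass, into "it"
         put it into tChunk
         put (the result is "eof") into tAtEOF   -- save before they change
         if tChunk is not empty then processChunk tChunk
         if tAtEOF then exit repeat
      end repeat
      close file pPath
   end processBigFile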

LC's internal addressing allows a single block of memory to be up to about 1 GB, IIRC (Mark, please correct me if that's not right), which is far larger than most operations could handle efficiently anyway.

Which leads us to:

> And what happens when very large amounts of data are read into memory,
> processed there, and placed into a field? Is there anything preventing
> unresponsiveness?

Usually yes: an out-of-memory error should be thrown. But low-memory situations are very tricky: if there isn't enough memory to complete execution of the script, there may not be enough memory to report the error either.

Mark can probably provide more details on that, but I've seen iffy handling of low-memory situations with a wide range of applications and even operating systems. It's just a hard problem to solve with consistent grace.
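About all a script can do is wrap the risky allocation and hope the engine still has room to raise the error. A sketch, assuming the error is catchable at all in the situation at hand:

   try
      put URL ("binfile:" & tHugePath) into tData   -- may exhaust memory
   catch tError
      answer "Could not load file:" && tError
   end try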

But fortunately, low-memory situations are rare on modern systems if we just remain mindful of how to use memory efficiently: read data in chunks, process only what you need, and if an aggregate operation produces a large amount of data, consider flushing it to disk periodically as you go. "open...for append" on the output file is very efficient for that sort of thing.
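For example, a sketch of that flush-as-you-go pattern (the 1 MB threshold, field name, and transformLine function are just illustrative):

   on exportResults pOutPath
      local tBuffer
      open file pOutPath for append
      repeat for each line tLine in field "Source"
         put transformLine(tLine) & return after tBuffer
         if length(tBuffer) > 1048576 then
            write tBuffer to file pOutPath   -- flush the aggregate to disk
            put empty into tBuffer
         end if
      end repeat
      write tBuffer to file pOutPath         -- flush whatever remains
      close file pOutPath
   end exportResults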

--
 Richard Gaskin
 Fourth World Systems
 Software Design and Development for the Desktop, Mobile, and the Web
 ____________________________________________________________________
 ambassa...@fourthworld.com                http://www.FourthWorld.com

