On Mar 11, 2008, at 17:54, Carl E. McIntosh wrote:

Can you please give advice about memory-management techniques for handling large data files? I am attempting to read three large files (1 GB, 208 MB, 725 MB) sequentially and place the data into arrays for processing. Here is my pseudocode:

1) Import the file into an NSString.
NSString *aFileString = [NSString stringWithContentsOfFile: fileLocation]; // load the entire file at fileLocation into aFileString

2) Use NSScanner to pull out the integers and floats.
NSScanner *aFileScanner = [[NSScanner alloc] initWithString: aFileString];

3) Store values into arrays.
     float myFloats [100000][2000]; // or
     int   myInts   [100000][2000];

4) Repeat three times, once for each file.
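[A note on the arithmetic behind step 3 — a minimal sketch in plain C, since the sizes are language-neutral; ROWS/COLS mirror the dimensions above, and a 4-byte float is assumed, as on Mac OS X:]

```c
#include <stdlib.h>

enum { ROWS = 100000, COLS = 2000 };

/* One ROWS x COLS float array is ROWS * COLS * sizeof(float)
   = 800,000,000 bytes (~762 MiB). Three of them approach 2.4 GB,
   so they must come from the heap -- a local array of this size
   would overflow the default stack all by itself -- and every
   allocation must be checked for failure. */
static float (*alloc_grid(void))[COLS] {
    return malloc(sizeof(float[ROWS][COLS]));
}
```

A caller would do `float (*grid)[COLS] = alloc_grid();`, check for NULL, index it as `grid[i][j]`, and `free(grid)` when done. Note also that if the process is 32-bit, as most Cocoa apps were in 2008, three such arrays barely fit the 4 GB address space even before the NSString copy of the file is counted.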

This algorithm works for smaller files but chokes on the larger ones with malloc errors. I've tried using NSZones, with the same failure.

Can you please give advice about handling large data files with memory management techniques?

I have 4 GB of RAM and can devote 2-3 GB to the process. I don't know how to explicitly allocate real memory, and I'd rather not rely on virtual memory. Any references or examples would be appreciated.

The first piece of advice I can give you is: do not load the whole file into memory. Use a read stream to read chunks of data and process them as you go (see NSInputStream or NSFileHandle).
Maybe other people on this list will have further advice, too.
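[To make the streaming advice concrete, here is a minimal sketch of the same idea in plain C — the Cocoa equivalent would use NSInputStream's -read:maxLength: or NSFileHandle's -readDataOfLength: in place of fread. The chunk size and helper names are illustrative. The one subtlety is that a number can straddle a chunk boundary, so any partial trailing token is carried over to the next read:]

```c
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <ctype.h>

#define CHUNK 65536  /* illustrative; any fixed size works */

/* Parse every complete whitespace-separated float token in buf,
   appending to out (up to cap values).  Returns the number of bytes
   consumed; a trailing partial token is left for the next chunk
   unless eof is set.  Tokens are assumed shorter than CHUNK. */
static size_t parse_floats(const char *buf, size_t len, int eof,
                           float *out, size_t cap, size_t *count) {
    size_t pos = 0;
    while (pos < len) {
        while (pos < len && isspace((unsigned char)buf[pos])) pos++;
        size_t start = pos;
        while (pos < len && !isspace((unsigned char)buf[pos])) pos++;
        if (pos == len && !eof)
            return start;               /* token may continue next chunk */
        if (pos > start && *count < cap) {
            char tok[64];
            size_t n = pos - start < 63 ? pos - start : 63;
            memcpy(tok, buf + start, n);
            tok[n] = '\0';
            out[(*count)++] = strtof(tok, NULL);
        }
    }
    return len;
}

/* Stream the file at path through a fixed-size buffer instead of
   loading it whole; returns the number of floats parsed into out. */
static size_t stream_file(const char *path, float *out, size_t cap) {
    FILE *fp = fopen(path, "r");
    if (fp == NULL) return 0;

    char buf[CHUNK];
    size_t count = 0, held = 0, got;
    while ((got = fread(buf + held, 1, CHUNK - held, fp)) > 0 || held > 0) {
        size_t len = held + got;
        int eof = feof(fp) != 0;
        size_t used = parse_floats(buf, len, eof, out, cap, &count);
        held = len - used;              /* carry the partial token over */
        memmove(buf, buf + used, held);
        if (eof && got == 0) break;
    }
    fclose(fp);
    return count;
}
```

With this shape, peak memory is one 64 KB buffer plus the output arrays, regardless of whether the input file is 1 MB or 1 GB.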


_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com
