tom arnall wrote:
> i am trying to use a perl application that handles large files (.5GB) in
> scalar variables. when i try to 'slurp' one of these files into a variable
> (e.g., '$_ = `cat filename`') i get an out of memory error.
You should consider alternative solutions, such as processing the file line by line. It depends on what you are trying to achieve, but loading half a gigabyte into memory is not something you should do without a very good reason.

> is there not some way to get linux to use swap space when physical
> memory is exceeded?

It does that by default, assuming you have swap partitions or swap files available. Note that this can be slow, and I assume this is what you described as a "freeze".

There are many tunable Linux virtual memory parameters you can set via the sysctl interface. You can get a list of them with "sysctl -a | grep vm". The most useful one is vm.swappiness. It takes a number from 0 to 100: 0 means "try not to use swap at all", 100 means "swap as much as possible", and the other values fall in between. You can set it (and other sysctl options) like this:

    # sysctl -w vm.swappiness=60

To preserve the setting across reboots, edit "/etc/sysctl.conf".
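A minimal sketch of the line-by-line approach, for comparison with the `cat` slurp. The filename and the line-counting "processing" are placeholders; in practice you would put your own logic inside the loop. The key point is that only one line is held in memory at a time, so memory use stays roughly constant no matter how large the file is:

```perl
use strict;
use warnings;

# Create a small sample file so the example is self-contained;
# in practice $filename would be your half-gigabyte file.
my $filename = 'sample.txt';
open my $out, '>', $filename or die "cannot write $filename: $!";
print $out "line $_\n" for 1 .. 5;
close $out;

# Read one line at a time instead of slurping the whole file into
# a scalar with `cat`: each pass through the loop replaces $line,
# so memory use does not grow with file size.
open my $fh, '<', $filename or die "cannot open $filename: $!";
my $count = 0;
while (my $line = <$fh>) {
    chomp $line;
    $count++;    # process $line here; we just count lines
}
close $fh;
unlink $filename;

print "$count lines\n";
```

If you really do need the whole file in one scalar, at least avoid the extra copy made by the `cat` backticks and read it directly in Perl, but for half a gigabyte the loop above is almost always the better choice.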