I am trying to use a Perl application that handles large files (~0.5 GB) in scalar variables. When I try to 'slurp' one of these files into a variable (e.g., '$_ = `cat filename`') I get an out-of-memory error (same for '@var = <FH>').
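For reference, here is a minimal sketch of the kind of slurp I mean, written with a plain filehandle instead of backticks ('bigfile.dat' is just a placeholder name):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read the whole file into one scalar by locally undefining
    # the input record separator $/ for the duration of the read.
    my $data = do {
        open my $fh, '<', 'bigfile.dat' or die "open bigfile.dat: $!";
        local $/;
        <$fh>;
    };

    printf "slurped %d bytes\n", length $data;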
Initially, I ran the script without any memory limits. That was fine, except the process would freeze the rest of the system. I dealt with this problem by setting limits for myself in /etc/security/limits.conf as follows:

    kloro hard rss     256000
    kloro hard memlock 256000
    kloro hard as      768000

Now, though, I get 'out of memory' errors. Is there not some way to get Linux to use swap space when physical memory is exceeded?

Thanks,

Tom Arnall
North Spit, CA