My name is Chris -- hi! I'm working on a backup application, and I'm using your Archive::Tar Perl module (version 0.22). My question is: how can I "flush" the in-memory archive once it reaches a certain condition, say a certain size?
My problem is this: I'm trying to tar a directory of unknown size (with subdirectories, sub-subdirectories, etc.) that can be anywhere from under one megabyte to almost a gig, and I don't have anywhere near that much RAM, if that even matters. After only five seconds or so of processing a directory of about 250 MB, perl is already eating HUGE amounts of CPU (99%), though strangely it seems to be using only about 3-5% of available memory (I only have 64 MB installed). Why? I guess I'm not understanding how the module handles the tar in memory. (I'm getting the CPU and memory stats from the Unix 'top' utility.)

I've tried to flush the tar object by calling the following when the 'buffer' reaches a certain size:

    $tar->write('foo.tar', 0);
    $tar->remove($tar->list_files());

...then adding the remaining files to the emptied $tar object (and flushing again if the buffer exceeds the maximum size again). That makes everything go REALLY fast, as I want, BUT it actually removes the contents of the tar *file*, not just from memory. That doesn't make sense to me, because I thought the write() method wrote what was in memory to the specified file. So as I see it, my 'flush' above should first update the tar file with the contents in memory, then remove the files from memory, then add the new files to memory, and flush again if need be.

Should I flush the 'buffer' into multiple spanning archives and combine them at the end of the whole process? Do you have any ideas?

Thanks a bundle!

-Chris

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
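The spanning-archive idea above can be sketched like this. This is only a sketch against the current Archive::Tar API (add_files, write, new), so behavior on a version as old as 0.22 may differ; the tiny threshold, the temp-directory demo tree, and the `foo-partNNN.tar` naming are all illustrative assumptions, not anything from the original mail:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Archive::Tar;
use File::Find;
use File::Temp qw(tempdir);

# --- demo setup (assumption: replace with your real source directory) ---
my $src_dir = tempdir(CLEANUP => 1);
for my $i (1 .. 3) {
    open my $fh, '>', "$src_dir/file$i.txt" or die "open: $!";
    print $fh "demo data $i\n" x 100;    # ~1200 bytes each
    close $fh;
}

my $max_bytes = 2000;   # flush threshold; tiny here to force several parts
my $part      = 0;      # counter for the spanning archives

my $tar  = Archive::Tar->new();
my $size = 0;

# Write the current in-memory entries to a new, numbered part, then
# start over with a *fresh* object, so a part already on disk is
# never rewritten by a later write().
sub flush_part {
    return unless $size;
    $part++;
    $tar->write(sprintf('foo-part%03d.tar', $part))
        or die "write failed: " . $tar->error;
    $tar  = Archive::Tar->new();
    $size = 0;
}

find({ no_chdir => 1, wanted => sub {
    return unless -f $_;
    my $fsize = -s _;                    # reuse the stat from -f above
    flush_part() if $size + $fsize > $max_bytes;
    $tar->add_files($_);
    $size += $fsize;
} }, $src_dir);

flush_part();   # write whatever is left over
print "wrote $part part(s)\n";
```

This also suggests what the original flush was probably hitting: write('foo.tar') appears to recreate the file from whatever is currently in memory rather than appending, so the second flush clobbers what the first one wrote. Giving each flush its own part file sidesteps that, and GNU tar can combine the parts afterwards (e.g. with --concatenate) if a single archive is required.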