------------------------------------------------
On Wed, 27 Aug 2003 16:18:00 +0200, Shahar Evron <[EMAIL PROTECTED]> wrote:

> hi...
> I'm working on a CGI program that creates some user-specific files on the
> server when accessed. Is there a good way to make sure these files are
> cleared when they're no longer needed - i.e. if a file in a specific
> directory was not accessed for 5 minutes, delete it?
> Right now I'm thinking of a cron'd script - but it would have to be run
> quite often - won't that have a bad effect on my system?
> I'd love to hear some ideas.

This depends heavily on how frequently you run the script, the number of
files, possibly the size of the files, etc. cron itself is (most likely)
already running anyway, and having it fire up a single process to remove some
files isn't terribly slow unless you are talking about large numbers of files
or you run it very often, say every couple of seconds. So the question becomes
how long a file "may" stay out there, because that determines how frequently
you must run the script: if a file that has not been accessed in five minutes
must be deleted before it is stale for six minutes, then you have to run the
script every 59 seconds, etc.

If you must take this approach and are on a unix system, I would avoid a Perl
script; install a recent version of 'find' and use it together with 'rm'
instead. That should speed up locating and removing the files without the
overhead of firing up the perl interpreter.
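
As a sketch of such a command, assuming GNU find and a hypothetical
/var/www/myapp/tmp directory holding the generated files ('-amin +5' matches
files last accessed more than five minutes ago):

    # remove regular files not accessed within the last 5 minutes
    find /var/www/myapp/tmp -type f -amin +5 -exec rm -f {} \;

Some versions of find also provide a '-delete' action that avoids spawning rm
for each file; check your local find(1) man page before relying on it.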

http://danconia.org

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]