But why fork more processes? The CGI program can check which files need to be deleted, create a temporary lock file, and then fork a process that deletes those files. The next visitor who executes the same CGI script will see the lock file and won't delete any files.
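A minimal sketch of that idea, in Python rather than the Perl typical of this list; the paths, `MAX_AGE`, and the function name are all hypothetical:

```python
import os
import time

LOCK_FILE = "/tmp/cleanup.lock"   # hypothetical lock-file path
CACHE_DIR = "/tmp/cache"          # hypothetical directory of expendable files
MAX_AGE = 3600                    # delete files older than one hour

def try_cleanup():
    """Fork a background cleaner unless another request holds the lock.

    Returns the child's pid, or None if the lock file already existed."""
    try:
        # O_EXCL makes creation atomic: only one request wins the lock.
        fd = os.open(LOCK_FILE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return None               # another request is already cleaning up
    os.close(fd)
    pid = os.fork()
    if pid == 0:
        # Child: do the slow deletions, then release the lock and exit.
        try:
            now = time.time()
            for name in os.listdir(CACHE_DIR):
                path = os.path.join(CACHE_DIR, name)
                if now - os.path.getmtime(path) > MAX_AGE:
                    os.unlink(path)
        finally:
            os.unlink(LOCK_FILE)
            os._exit(0)           # never fall back into the server code
    return pid                    # parent: respond to the visitor immediately
```

The atomic `O_CREAT | O_EXCL` open is what prevents two simultaneous visitors from both forking cleaners; a plain "does the file exist?" check followed by a create would leave a race window.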
In fact, if the web site has many visitors, the cleanup could be attached to a page that is visited less often. Or the script could check and start deleting files only after a set interval has passed: say 10 minutes, an hour, etc.

teddy.fcc.ro
[EMAIL PROTECTED]

----- Original Message -----
From: "drieux" <[EMAIL PROTECTED]>
To: "cgi cgi-list" <[EMAIL PROTECTED]>
Sent: Thursday, August 28, 2003 6:58 PM
Subject: Re: automated file removal / cache clearing

On Wednesday, Aug 27, 2003, at 14:22 US/Pacific, Octavian Rasnita wrote:

> Or if you don't want to depend on Unix's cron and want your program to
> do everything, you can set it so each time a new visitor comes to your
> site, it checks which files are not needed, and deletes them.
> You can use fork to avoid making the visitors wait while the program
> is doing its background job.

[..]

At first blush that CAN seem to be an interesting idea - but in the
worst case one can have N connections, each of which has generated N
forked children to walk through M possible files... and one starts
asking one's self: is this an order N squared or N factorial solution?
While in the worst case the cron-job-based solution is merely an
order N problem...

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
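The time-interval throttle suggested at the top of this reply could be sketched as follows (again in Python; the stamp-file path and `INTERVAL` are hypothetical). The lock file's own modification time doubles as the "last cleaned" timestamp:

```python
import os
import time

STAMP_FILE = "/tmp/cleanup.stamp"  # hypothetical timestamp file
INTERVAL = 600                     # run the cleanup at most every 10 minutes

def cleanup_due():
    """Return True and refresh the stamp if INTERVAL has elapsed since
    the last cleanup; otherwise return False so the visitor is not delayed."""
    now = time.time()
    try:
        if now - os.path.getmtime(STAMP_FILE) < INTERVAL:
            return False           # cleaned recently: skip this visitor
    except FileNotFoundError:
        pass                       # no stamp yet: treat the cleanup as overdue
    with open(STAMP_FILE, "w"):    # touch the stamp for the next check
        pass
    return True
```

Note there is still a small race window between reading the stamp's mtime and rewriting it; for strict once-only behavior this check would be combined with the atomic lock-file creation shown earlier, which also addresses the worst-case fan-out of forked children that drieux warns about.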