At various places around our system we want to clean up files older than a given age, and sometimes prune empty directories as well. Naturally we have to be careful doing this, lest we accidentally blow away the wrong files.

I'm thinking about a Perl module and accompanying script with this interface:

cleanup_files.pl --age=age --dir=dir --name=name [--dry-run] [--prune-empty-dirs]

where age can be specified as "1h", "2day", etc.; name is a required glob pattern; and dir is checked to make sure it is sufficiently deep (e.g. you can't use /). --dry-run only reports what would be deleted, and --prune-empty-dirs additionally removes directories left empty by the cleanup. At the end the script reports how many files and directories were removed.
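
Roughly what I have in mind, as a sketch only (Getopt::Long plus File::Find; the age units, the depth threshold and the hand-rolled glob handling below are placeholder choices, not settled details):

#!/usr/bin/env perl
use strict;
use warnings;
use Getopt::Long;
use File::Find;
use File::Spec;

GetOptions(
    'age=s'            => \my $age,
    'dir=s'            => \my $dir,
    'name=s'           => \my $name,
    'dry-run'          => \my $dry_run,
    'prune-empty-dirs' => \my $prune,
) or die "bad options\n";

die "need --age, --dir and --name\n"
    unless defined $age && defined $dir && defined $name;

# "1h", "2day", etc. -> seconds; the unit table is just a first guess
my %unit = (s => 1, m => 60, h => 3600, hour => 3600, d => 86400, day => 86400);
my ($n, $u) = $age =~ /^(\d+)\s*([a-z]+)$/i
    or die "can't parse --age '$age'\n";
my $secs = $n * ($unit{lc $u} // die "unknown age unit '$u'\n");

# safety check: refuse to run on / or anything nearly that shallow
my $abs = File::Spec->rel2abs($dir);
die "'$abs' is not a directory\n" unless -d $abs;
my @parts = grep { length } File::Spec->splitdir($abs);
die "'$abs' looks too shallow to clean safely\n" if @parts < 2;

# tiny glob -> regex translation for the basename; Text::Glob would be more thorough
(my $re = quotemeta $name) =~ s/\\\*/.*/g;
$re =~ s/\\\?/./g;

my $cutoff = time - $secs;
my ($files_removed, $dirs_removed) = (0, 0);

# finddepth visits a directory's contents before the directory itself,
# so directories emptied by the deletes can be pruned on the way back up
finddepth(sub {
    if (-f $_ && $_ =~ /^$re$/ && (stat _)[9] < $cutoff) {
        print "would remove file $File::Find::name\n" if $dry_run;
        $files_removed++ if $dry_run || unlink $_;
    }
    elsif ($prune && -d $_ && $File::Find::name ne $abs) {
        opendir my $dh, $_ or return;
        my @left = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
        closedir $dh;
        # (a dry run won't list dirs that would only become empty after the deletes)
        return if @left;
        print "would remove dir $File::Find::name\n" if $dry_run;
        $dirs_removed++ if $dry_run || rmdir $_;
    }
}, $abs);

printf "%d file(s) and %d dir(s) removed%s\n",
    $files_removed, $dirs_removed, $dry_run ? " (dry run)" : "";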

The idea is to have a convenient, but safe, one-liner to put in a cron job for each directory that needs periodic cleaning.
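
For example, the crontab entry for one directory might look something like this (the path and schedule are made up purely for illustration):

0 3 * * * /usr/local/bin/cleanup_files.pl --age=7day --dir=/var/spool/ourapp/work --name='*.tmp' --prune-empty-dirs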

In the past we've used the old "find ... | xargs rm -f" approach, but it doesn't have the safety checks, directory pruning, or reporting.

Does anyone else think this is (mildly) valuable? Am I reinventing the wheel, in terms of Perl libraries or other Unix utilities besides basic find?

Thanks
Jon
