You can use fslint to identify duplicate files and merge them (i.e., hardlink them all to a single file).
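
For example, a cautious workflow is to do a dry run first and only then merge. This is a minimal sketch: the ~/photos path is just a placeholder, and it assumes findup is on your PATH (otherwise invoke it as /usr/share/fslint/fslint/findup); the flags are explained in the help text below.

    findup -t -m ~/photos    # -t: only report what -m would merge
    findup -m ~/photos       # actually hardlink the duplicates together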

[vbr...@volker-two ~]$ /usr/share/fslint/fslint/findup --help
find dUPlicate files.
Usage: findup [[[-t [-m|-d]] | [--summary]] [-r] [-f] path(s) ...]

If no path(s) specified then the current directory is assumed.


When -m is specified any found duplicates will be merged (using hardlinks).
When -d is specified any found duplicates will be deleted (leaving just 1).
When -t is specified, only report what -m or -d would do.

When --summary is specified change output format to include file sizes.
You can also pipe this summary format to /usr/share/fslint/fslint/fstool/dupwaste to get a total of the wastage due to duplicates.

Examples:

search for duplicates in current directory and below
    findup or findup .
search for duplicates in all linux source directories and merge using 
hardlinks
    findup -m /usr/src/linux*
same as above but don't look in subdirectories
    findup -r .
search for duplicates in /usr/bin
    findup /usr/bin
search in multiple directories but not their subdirectories
    findup -r /usr/bin /bin /usr/sbin /sbin
search for duplicates in $PATH
    findup $(/usr/share/fslint/fslint/supprt/getffp)
search system for duplicate files over 100K in size
    findup / -size +100k
search only my files (that I own and are in my home dir)
    findup ~ -user $(id -u)
search system for duplicate files belonging to roger
    findup / -user $(id -u roger)
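
As the help text mentions, you can get a total of the space wasted by duplicates by piping the --summary output to dupwaste. A minimal sketch (scanning your home directory is just an example):

    findup --summary ~ | /usr/share/fslint/fslint/fstool/dupwaste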
