Dave Dykstra wrote:
> Note that his case is rather pathological because he's got over a million
> files in only 400 directories, so he must have an average of over 2500
> files per directory, which are very large directories. He's got about 65%
> of the files explicitly listed in his --include-from file.
I have over a million files that I rsync to about a dozen locations every day. I'm pretty sure I have more directories than that, but not tons more. Most of the locations are remote offices, but even when I rsync over a local 100Mbit segment it still takes about 2 hours just to verify the files/dirs on both sides when no data has changed.

In other words, I'm interested in these different optimizations as well. My clients and servers are a Solaris/Linux mix. I don't have the 2G of RAM on all boxes needed to support a single rsync run, so the job gets broken down into a for-loop across some top-level dirs.

eric
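A minimal sketch of the per-directory for-loop workaround described above, assuming a hypothetical /data tree and remote host; the paths, host name, and option choices are illustrative and not Eric's actual setup:

    #!/bin/sh
    # Hypothetical source tree and destination -- adjust to taste.
    SRC=/data
    DEST=remote-office:/data

    # One rsync per top-level directory keeps each run's file list,
    # and therefore its memory footprint, small.
    for d in "$SRC"/*/ ; do
        name=$(basename "$d")
        rsync -a --delete "$d" "$DEST/$name/"
    done

The trade-off is that each invocation pays connection and directory-scan overhead again, so total wall-clock time can grow even though peak memory per process stays low.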