Hello, I'm wondering if there are any effective tactics for reducing the memory consumption of s3cmd sync jobs on directories containing large numbers of files? I find that these jobs use an order of magnitude more memory than rsync, for instance. I realize that a different set of parameters is involved in syncing to an S3 bucket as opposed to a more traditional file system, but if you have any suggestions I'd like to hear them :)
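In case it helps to see the kind of tactic I mean, the only workaround I've come up with so far is to break the job up by subdirectory so that s3cmd only has to build a file list for part of the tree on each run (the bucket name and paths below are just placeholders):

    # Sync one top-level subdirectory at a time instead of the
    # whole tree, so each s3cmd run only holds a smaller file
    # list in memory. Bucket and paths are placeholders.
    for dir in /data/media/*/ ; do
        name=$(basename "$dir")
        s3cmd sync "$dir" "s3://my-bucket/media/$name/"
    done

That keeps the per-run listing smaller, but it's clumsier than a single sync, so I'd welcome better ideas or any s3cmd options that address this more directly.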
--
Joe Auty, NetMusician
www.netmusician.org
j...@netmusician.org