Hi Nigel,

> Perhaps it should be sleeping for .001 seconds upon first retry, instead
> of .01 seconds? Even better might be to just retry once with no sleep
> and then sleep for .001 seconds per chunk on the second retry.
Give the attached patch a try - it makes the first retry without throttle;
only subsequent retries will be throttled.

Ideally these times should be configurable in the .s3cfg file, for instance
something like:

    throttle_retries = [0, 0.001, 0.01, 0.05]

I may add such a config option in one of the upcoming versions.

Michal
Index: S3/S3.py
===================================================================
--- S3/S3.py	(revision 395)
+++ S3/S3.py	(working copy)
@@ -504,7 +504,8 @@
             if self.config.progress_meter:
                 progress.done("failed")
             if retries:
-                throttle = throttle and throttle * 5 or 0.01
+                if retries < _max_retries:
+                    throttle = throttle and throttle * 5 or 0.01
                 warning("Upload failed: %s (%s)" % (resource['uri'], e))
                 warning("Retrying on lower speed (throttle=%0.2f)" % throttle)
                 warning("Waiting %d sec..." % self._fail_wait(retries))
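For illustration only, here is a rough sketch of how such a throttle_retries
schedule could be mapped onto the existing retry countdown. The option does
not exist yet and the helper name is made up - this is not part of the
attached patch:

    # Sketch only - assumes a throttle_retries list read from .s3cfg and a
    # 'retries' counter that counts down from _max_retries, as in send_file().
    _max_retries = 5
    throttle_retries = [0, 0.001, 0.01, 0.05]   # seconds to sleep per chunk

    def throttle_for_retry(retries_left, schedule=throttle_retries,
                           max_retries=_max_retries):
        # attempt is 0 on the first retry, 1 on the second, and so on
        attempt = max_retries - retries_left
        # clamp to the last entry so late retries reuse the slowest setting
        return schedule[min(attempt, len(schedule) - 1)]

With the default list above, the first retry would run unthrottled and later
retries would sleep 0.001, 0.01 and finally 0.05 seconds per chunk.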