Michal Ludvig wrote:
> Jeff Ross wrote:
> 
>> /usr/local/bin/s3cmd --config=/home/jross/.s3cfg  put $i\
>>      s3://$LEVEL.dukkha.wykids.org/$i
>> [...]
>>
>> ERROR: Upload of 'profiles.gz.agf' failed too many times. Skipping
>> that file.
>>
>> Not only did it skip that file, it exited the shell loop and there
>> were about another hundred 10 MB files to go.
>>
>> Any ideas?  Are the return codes documented?
> 
> Hi Jeff,
> 
> since you're uploading just one file in each s3cmd call and that one
> file failed - s3cmd exited with an error.
> 
> Why don't you use "s3cmd put --recursive" (and del --recursive) to make
> your script simpler?
> 

Hi Michal,

Thanks for the prompt response as always!

You know, I had a reason for doing it that way, but since I didn't write it 
into a comment in the script, I don't remember what it was!

Will --recursive keep trying until the upload succeeds?  If not, I'd still 
like to know the return codes so I can detect the failure and either log it 
for manual intervention or come back to the failed files after the rest are 
uploaded.

To keep the file sizes manageable, I'm using gzip | split to chunk the large 
dump files into 10 MB sections.  If I ever missed getting one of them up to 
S3, the whole dump file would be pretty useless.
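For reference, the chunk-and-reassemble round trip looks something like this 
(a sketch only: "dump.sql" and the sample data are made-up stand-ins, and a 
small chunk size is used here so it runs on toy input; the real script uses 
split -b 10m for the 10 MB sections):

```shell
# Stand-in for the real database dump:
seq 1 100000 > dump.sql

# Compress and chunk, as described above (real script: split -b 10m):
gzip -c dump.sql | split -b 64k - dump.gz.
# -> dump.gz.aa, dump.gz.ab, ...  Each piece gets uploaded separately.

# Restoring needs every chunk, concatenated in order -- which is why a
# single missed upload makes the whole dump useless:
cat dump.gz.* | gunzip > dump.sql.restored
cmp dump.sql dump.sql.restored
```

The glob dump.gz.* expands in lexicographic order, which matches the order 
split created the chunks in, so plain cat is enough to reassemble them.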

Jeff

_______________________________________________
S3tools-general mailing list
S3tools-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/s3tools-general
