Were you able to see where to fix that?
On Sat, Jan 8, 2011 at 10:28 PM, Jim Popovitch wrote:
> On Mon, Jan 3, 2011 at 23:59, Jim Popovitch wrote:
> >> From: Jobe Bittman [mailto:j...@opencandy.com]
> >> Sent: Thursday, November 04, 2010 10:01 PM
> >> To: s3tools
I generally take my file list and feed it into xargs. Then I split the
resulting command list into chunks and copy them to a few servers to run in parallel.
files:
/images/10/100.jpg
/images/10/101.jpg
/images/10/102.jpg
cat files | xargs -I XXX echo ./s3cmd cp --acl-public XXX s3://mybucketXXX >> outfile1
split -l 1
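Putting those steps together, a minimal sketch (the file names `files` and
`outfile1`, the sample paths, the bucket name `mybucket`, and the 1000-line
chunk size are all just examples; the original message's `split -l` count was
truncated, so 1000 here is an assumption):

```shell
# Example file list; each path starts with "/", so appending it to
# "s3://mybucket" yields a full S3 URI like s3://mybucket/images/10/100.jpg.
printf '%s\n' /images/10/100.jpg /images/10/101.jpg /images/10/102.jpg > files

# Build one s3cmd command per file; xargs substitutes each path for XXX.
# Note: this only *echoes* the commands into outfile1, it does not run them.
cat files | xargs -I XXX echo ./s3cmd cp --acl-public XXX s3://mybucketXXX >> outfile1

# Cut the command list into 1000-line chunks (xaa, xab, ...) so each
# server can run its own chunk, e.g.:  sh xaa
split -l 1000 outfile1
```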
http://media.aws.com/test.png
s3cmd replace --acl-public --add-header "Cache-control: max-age=400" s3://media.aws.com/test.png
wget -S http://media.aws.com/test.png
You should see that the headers update without your having to re-upload the
file. It copies an S3 object to a new S3 object.