I generally take my file list and feed it into xargs. Then I split the
resulting command file into chunks and copy the chunks across a few servers
to run in parallel.

files:
/images/10/100.jpg
/images/10/101.jpg
/images/10/102.jpg

cat files | xargs -I XXX echo ./s3cmd put --acl-public XXX s3://mybucketXXX >> outfile1
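
Each line of outfile1 is then a ready-to-run command, for example:

./s3cmd put --acl-public /images/10/100.jpg s3://mybucket/images/10/100.jpg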
split -l 10000 outfile1

sh xaa > xaa.out&
sh xab > xab.out&
....
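
If there are a lot of chunks, a small loop does the same thing (just a
sketch; the xa? names are split's default output names):

for f in xa?; do
  sh "$f" > "$f.out" &
done
wait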

It depends on your file structure. You could just as easily start from a
comma-separated file and use awk to create outfile1, e.g. the sketch below.
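
Assuming a hypothetical files.csv whose first column is the local path and
second column is the destination key (the file name, column layout, and
bucket are made up for illustration):

awk -F, '{print "./s3cmd put --acl-public " $1 " s3://mybucket" $2}' files.csv > outfile1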

On Mon, Nov 8, 2010 at 5:26 PM, dani <dani...@rogers.com> wrote:

> I haven't found this in the documentation:
>
> Is there a way to supply the list of files to upload in a text file? I've
> been doing this with a while loop, but each line starts a new process, so
> it's not very efficient.
>
> Thanks
>
> Dani