I was actually looking for a way to do this today, too. We want to sync a
number of different directories to the bucket we've created for backups,
keeping the file structure intact, while still being able to exclude
certain subdirectories where appropriate.

For example, I want to be able to send the contents of the /etc dir on
hostname "myserver" to s3://bucketwherewekeepbackups/servers/myserver/etc/,
but I want to exclude two dirs: /etc/selinux and /etc/webmin. Then we want
to sync the /home directory on myserver (and all subdirs) to
s3://bucketwherewekeepbackups/servers/myserver/home/ and so on.

At the moment, it seems the only way to do that is one s3cmd invocation per
directory we want to sync, which is how we're doing it now.
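For reference, that currently looks something like this (same bucket and
paths as the example above; the exact --exclude glob semantics may vary
between s3cmd versions):

s3cmd sync --skip-existing --delete-removed \
    --exclude 'selinux/*' --exclude 'webmin/*' \
    /etc/ s3://bucketwherewekeepbackups/servers/myserver/etc/

s3cmd sync --skip-existing --delete-removed \
    --exclude 'bob/*' \
    /home/ s3://bucketwherewekeepbackups/servers/myserver/home/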

It would be great to be able to do something like:

s3cmd sync --skip-existing --delete-removed --keep-local-paths -f
inputargs.txt s3://s3bucketname/

Then have inputargs.txt be something like:

include: /etc/
exclude: /etc/selinux
exclude: /etc/webmin
include: /home
exclude: /home/bob

The -f would tell s3cmd which input file to use, and the --keep-local-paths
option would tell s3cmd to append each local path from inputargs.txt to the
destination bucket URL.

When using something like --keep-local-paths, the user would have to make
sure they're in the correct directory before launching the command (or add
a cd to their shell script) and use appropriate paths in their input file.
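In the meantime, a wrapper along these lines could approximate it. This is
a minimal, untested sketch, assuming the inputargs.txt format above and a
fixed bucket URL; the --exclude glob handling in particular may need
adjusting for your s3cmd version:

#!/bin/sh
# Sketch: emulate the proposed --keep-local-paths. For each "include:"
# line, run one s3cmd sync that appends the local path to the bucket
# URL, passing that include's "exclude:" lines as --exclude globs.
BUCKET="s3://s3bucketname"

flush() {
    [ -n "$src" ] || return 0
    s3cmd sync --skip-existing --delete-removed "$@" "$src/" "$BUCKET$src/"
}

src=""
set --                            # positional params hold the excludes
while read -r key path; do
    case "$key" in
        include:)
            flush "$@"            # sync the previous include, if any
            src="${path%/}"       # strip trailing slash: /etc/ -> /etc
            set --                # reset the exclude list
            ;;
        exclude:)
            # s3cmd excludes are matched relative to the source dir
            set -- "$@" --exclude "${path#"$src"/}/*"
            ;;
    esac
done < inputargs.txt
flush "$@"                        # don't forget the last include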

SJ


From: dani [mailto:dani...@rogers.com] 
Sent: Monday, November 08, 2010 8:20 PM
To: s3tools-general@lists.sourceforge.net
Subject: Re: [S3tools-general] Argument list in text file

Yes, but each line starts a new shell process, which opens and closes the
connection to the server.
 
I'm hoping for something like wget's -i parameter, which reads URLs from a
file and reuses open connections.
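With wget that looks like:

wget -i urls.txt

where urls.txt lists one URL per line, and wget keeps the connection alive
between downloads from the same host.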
 
Thanks
 
Dani
On Mon, Nov 8, 2010 at 9:19 PM, Jobe Bittman <j...@opencandy.com> wrote:
I generally take my file list and feed it into xargs to generate the upload
commands. Then I cut the resulting file into chunks and copy them across a
few servers to run in parallel.

files:
/images/10/100.jpg
/images/10/101.jpg
/images/10/102.jpg

# "put" uploads local files ("cp" is S3-to-S3); since each line in
# "files" starts with /, s3://mybucketXXX expands to the full key
cat files | xargs -I XXX echo ./s3cmd put --acl-public XXX s3://mybucketXXX > outfile1
split -l 10000 outfile1

sh xaa > xaa.out &
sh xab > xab.out &
....
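On a single machine you can also skip the split step and let GNU xargs run
the uploads in parallel (assuming GNU xargs, whose -P flag sets the number
of concurrent processes):

cat files | xargs -P 4 -I XXX ./s3cmd put --acl-public XXX s3://mybucketXXX

Note that each invocation is still a separate s3cmd process with its own
connection.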

It depends on your file structure. You could just as easily use a
comma-separated file and awk to create outfile1.
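For instance, given a hypothetical files.csv of local_path,remote_key
pairs, something like:

awk -F, '{ printf "./s3cmd put --acl-public %s s3://mybucket%s\n", $1, $2 }' files.csv > outfile1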
On Mon, Nov 8, 2010 at 5:26 PM, dani <dani...@rogers.com> wrote:
I haven't found this in the documentation:
 
Is there a way to supply the list of files to upload in a text file? I've
been doing this with a while loop, but each line starts a new process, so
it's not very efficient.
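Something like this, for example (s3://mybucket is a placeholder):

while read -r f; do
    # every iteration spawns a fresh s3cmd process and a new connection
    s3cmd put "$f" "s3://mybucket$f"
done < files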
 
Thanks
 
Dani

