Hi,
I am running the following command:

    s3cmd sync s3://sub.domain.com/subfolder ~/s3backups --skip-existing
I have 30 files (around 6 GB in total) in the source S3 bucket, and the same
files in the destination (local) folder from the initial sync.
However, it seems to be re-downloading and overwriting all of the files in the
destination (local) folder: when the sync finishes, it reports "Downloaded
6,167,623,194 bytes in 385.4 seconds."
Can you please advise how to transfer only the changed files rather than the
full set of 30, so I can avoid inbound data costs from my provider?
Regards
Jay Skinner
Network Manager
Data Driven. Direct. Digital.
a: Level 2/627 Chapel Street, South Yarra, VIC 3141
p: (03) 9827 7790
f: (03) 9827 7858
e: jay.skin...@impactdata.com.au
w: www.impactdata.com.au
Please consider the environment before printing
_______________________________________________
S3tools-general mailing list
S3tools-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/s3tools-general