Hello all, my first post here, so please be gentle.
I am new to AWS and s3cmd. I have a bucket that I upload videos to, and now I am trying to copy those videos over to an EC2 instance, onto an EBS volume. Copying a single file is not an issue, but copying an entire directory is a pain. From what I can tell, it does not work the way I expected, unless I am doing something wrong; I assumed s3cmd worked a bit like rsync.

I am trying to copy a bunch of directories from a bucket to a local EBS drive, but I can never get recursive mode to work: it always just copies the files into the current directory and never creates the local directory. If I create the directory first and then sync or get, the files will go into it, but that is a pain since I have thousands of directories to copy, and creating each one by hand first is not practical, unless that is the only way it works?

The command I am using is this:

    [root@ip-10-68-xx-xx 001]# s3cmd get -r --skip-existing s3://mybucket/videosFolder/001/1087 .
    s3://reelmoviestest/videosFolder/001/1087 -> ./1087  [1 of 1]
     0 of 0     0% in    0s     0.00 B/s  done

All this does is create a file named 1087 and nothing else. If I do this instead:

    [root@ip-10-68-xx-xx 001]# s3cmd get -r --skip-existing s3://mybucket/videosFolder/001/1087/ .

it just dumps the files in the current directory.

How would I get this directory 1087 into the current directory without having to create the directory itself first?
Thanks, and have a great day!

Rob Morin
Senior Systems Administrator
iLabs Inc.
(514) 387-0638 Ext: 207