Hello.

Thanks for your help, Dmitri.

Perhaps this is not the correct place to ask this, but maybe someone has
had the same problem.

I built a test environment with two Riak CS nodes (not clustered), and
I'm trying to move objects from one to the other, into a bucket with the
same name (testbucket).
I'm using boto and its copy_key method, but it fails. My guess is that the
problem is in the HEAD call: copy_key checks whether the key exists in the
destination bucket before making the PUT that copies the key.

Here is the code I'm using:

from boto.s3.key import Key
from boto.s3.connection import S3Connection
from boto.s3.connection import OrdinaryCallingFormat

apikey04='GFR3O0HFPXQ-BWSXEMAG'
secretkey04='eIiigR4Rov2O2kxuSHNW7WPoJE2KmrtMpzzqlg=='

apikey02='J0TT_C9MJPWPGHW-KEWY'
secretkey02='xcLOt3ANqyNJ0kAjP8Mxx68qr7kgyXG3eqJuMA=='
cf=OrdinaryCallingFormat()

conn04=S3Connection(aws_access_key_id=apikey04,aws_secret_access_key=secretkey04,
                    is_secure=False,host='rk04.ejemplo.com',port=8080,calling_format=cf)

conn02=S3Connection(aws_access_key_id=apikey02,aws_secret_access_key=secretkey02,
                    is_secure=False,host='rk02.ejemplo.com',port=8080,calling_format=cf)

bucket04=conn04.get_bucket('testbucket')
bucket02=conn02.get_bucket('testbucket')

rs04 = bucket04.list()

for k in rs04:
    print k.name
    bucket02.copy_key(k.key, bucket04, k.key)


When this script is executed, it returns:

Traceback (most recent call last):
  File "s3_connect_2.py", line 38, in <module>
    bucket02.copy_key(k.key, bucket04, k.key)
  File "/home/alberto/.virtualenvs/boto/local/lib/python2.7/site-packages/boto/s3/bucket.py", line 888, in copy_key
    response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 404 Not Found
<?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message><Resource>/&lt;Bucket: testbucket&gt;$
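
Looking at the Resource in that error, it seems the repr of the Bucket
object ended up in the request path. If I read boto's docs right, copy_key
expects the source bucket *name* (a string) as its second argument, so I
guess the call should look something like:

    bucket02.copy_key(k.key, 'testbucket', k.key)

But even with that fixed, I suspect a server-side COPY can't work between
two independent installations: the PUT with the x-amz-copy-source header
is handled entirely by rk02, which has no access to the objects stored on
rk04.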



The idea is to copy all the keys in testbucket on rk04.ejemplo.com to
testbucket on rk02.ejemplo.com; maybe someone can help me.
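
In case a server-side copy between two independent clusters is simply not
possible, my fallback idea is to stream each object through the client,
roughly like this (an untested sketch; it reads each object fully into
memory, so it's only reasonable for small objects):

for k in bucket04.list():
    print k.name
    # Fetch the object body from rk04, then re-upload it to rk02,
    # instead of relying on a server-side COPY.
    data = k.get_contents_as_string()
    dest = bucket02.new_key(k.name)
    dest.set_contents_from_string(data)

Does that sound like the right approach, or is there a better way?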


Thanks a lot.


2015-11-13 17:06 GMT+01:00 Dmitri Zagidulin <dzagidu...@basho.com>:

> Hi Alberto,
>
> From what I understand, the state of the art for migrating objects from
> Amazon S3 to Riak CS is writing migration scripts, either as shell
> scripts (using s3cmd) or with language-specific libraries like boto (or
> even just the S3 SDKs).
> The scripts would consist of:
> 1) get a list of the buckets you want to migrate
> 2) List the keys in those buckets
> 3) Migrate each object from AWS to CS.
>
> You're right that mounting buckets as filesystems is a (distant)
> possibility, but we have not seen much successful use of that approach
> (though if anybody's made it work, let us know).
>
>
>
> On Thu, Nov 12, 2015 at 12:40 PM, Alberto Ayllon <aayl...@qdqmedia.com>
> wrote:
>
>> Hello.
>>
>> I'm new to Riak and Riak CS. I have installed a Riak CS cluster with 4
>> nodes and it works fine.
>>
>> Here is my question: the company where I work has some buckets in Amazon
>> S3, and I would like to migrate objects from those buckets to our Riak CS
>> installation. As far as I know I can do it using S3FUSE or S3BACKER,
>> mounting the buckets as a filesystem, but I would like to avoid mounting
>> them as a filesystem. I tried it with the boto Python library, using the
>> copy_key method, but it doesn't work.
>>
>> Has anybody had success synchronizing buckets from Amazon S3 to Riak CS?
>>
>> Thanks.
>>
>> P.S.: Excuse my English.
>>
_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
