Re: HDFS to S3 copy issues

2012-07-06 Thread Momina Khan
hi, HDFS is running on just one node and I get the same "connection refused" error no matter whether I try the node's private DNS or localhost. i do have 127.0.0.1 localhost in my /etc/hosts file. thanks in advance! momina On Fri, Jul 6, 2012 at 12:22 PM, feng lu wrote: > hi Momina > > m
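A quick way to narrow down a "connection refused" like this is to probe the NameNode RPC port directly from the same box; if the raw TCP connect is refused, Hadoop's RPC will be too. This is a hedged sketch: the host (localhost) and port (9000) are taken from the thread's single-node setup and may differ on other clusters.

```shell
# Probe the NameNode RPC port using bash's /dev/tcp redirection.
# If this fails, the NameNode is not listening on that address/port:
# check it is running at all (e.g. "jps | grep -i NameNode").
if (exec 3<>/dev/tcp/localhost/9000) 2>/dev/null; then
  echo "port 9000 is open"
  exec 3>&- 3<&-   # close the probe descriptor
else
  echo "port 9000 refused - is the NameNode up, and bound to this address?"
fi
```

The same probe against the node's private DNS name (instead of localhost) tells you whether the daemon is bound to 127.0.0.1 only or to all interfaces.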

Re: HDFS to S3 copy issues

2012-07-06 Thread feng lu
hi Momina, maybe the problem is your DNS resolution. You must have IP-to-hostname entries for all nodes in the /etc/hosts file, like this: 127.0.0.1 localhost On Fri, Jul 6, 2012 at 2:49 PM, Momina Khan wrote: hi Ivan, > > i have tried with both ports 9000 and 9001 i get the same error dump ... > > be
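For a single-node setup, the suggestion above amounts to an /etc/hosts along these lines (a sketch: the hostname is a placeholder, and the private IP is borrowed from the hdfs:// URI quoted later in this thread):

```
127.0.0.1        localhost
10.240.113.162   hadoop-node1
```

Every hostname a Hadoop daemon advertises should resolve to a routable IP here; a hostname that resolves only to 127.0.0.1 on a multi-homed box is a classic cause of "connection refused" from remote clients.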

Re: HDFS to S3 copy issues

2012-07-06 Thread Nitin Pawar
you may want to try the following command: instead of using hdfs try hftp hadoop distcp -i -ppgu -log /tmp/mylog -m 20 hftp://servername:port/path (hdfs://target.server:port/path | s3://id:secret@domain) On Fri, Jul 6, 2012 at 12:19 PM, Momina Khan wrote: > hi Ivan, > > i have tried with both por
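Spelled out, the suggestion looks like the sketch below. Note that DistCp takes its options after the `distcp` subcommand, i.e. `hadoop distcp [options] <src> <dst>`. The hostnames, the HFTP port (50070 was the default NameNode web port in this Hadoop generation), and the S3 bucket are placeholders, not values confirmed by the thread.

```shell
# -i        ignore failures and keep copying
# -ppgu     preserve permissions, group, and user on copied files
# -log      HDFS directory for the job's logs
# -m 20     cap the copy at 20 map tasks
# hftp:// reads from the source NameNode over HTTP, which sidesteps
# the RPC port that was refusing connections.
hadoop distcp -i -ppgu -log /tmp/mylog -m 20 \
  hftp://source.namenode:50070/data \
  s3://ID:SECRET@mybucket/data
```

Embedding the AWS secret in the URI works but leaks it into logs and shell history; configuring the credentials in the Hadoop S3 filesystem properties is the safer variant.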

Re: HDFS to S3 copy issues

2012-07-05 Thread Momina Khan
hi Ivan, i have tried with both ports 9000 and 9001 and i get the same error dump ... best momina On Fri, Jul 6, 2012 at 11:01 AM, Ivan Mitic wrote: > Hi Momina, > > Could it be that you misspelled the port in your source path? Would you mind > trying with: hdfs://10.240.113.162:9000/data/ > > Ivan > >

RE: HDFS to S3 copy issues

2012-07-05 Thread Ivan Mitic
Hi Momina, Could it be that you misspelled the port in your source path? Would you mind trying with: hdfs://10.240.113.162:9000/data/ Ivan -Original Message- From: Momina Khan [mailto:momina.a...@gmail.com] Sent: Thursday, July 05, 2012 10:30 PM To: common-dev@hadoop.apache.org Subject: HDF
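Rather than guessing between 9000 and 9001, the authoritative port is whatever fs.default.name says in core-site.xml; the hdfs:// URI passed to distcp must match it exactly. A hedged sketch (the file below is a stand-in for a real $HADOOP_HOME/conf/core-site.xml, and the IP/port are the ones quoted in this thread):

```shell
# Write a sample config, then extract the configured NameNode URI.
cat > /tmp/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://10.240.113.162:9000</value>
  </property>
</configuration>
EOF

# The line after the property name holds the URI the cluster is using.
grep -A1 'fs.default.name' /tmp/core-site.xml
```

On a real cluster you would run the grep against $HADOOP_HOME/conf/core-site.xml and copy the value verbatim into the distcp source path.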