use S3-Compatible Storage with spark

2015-07-17 Thread Schmirr Wurst
Hi, I wonder how to use S3-compatible storage in Spark? If I'm using the s3n:// URL scheme, it will point to Amazon; is there a way I can specify the host somewhere?

Fwd: use S3-Compatible Storage with spark

2015-07-19 Thread Schmirr Wurst
> On Fri, Jul 17, 2015 at 2:06 PM, Schmirr Wurst wrote:
>>
>> Hi,
>>
>> I wonder how to use S3 compatible Storage in Spark?
>> If I'm using s3n:// url schema, it will point to

Re: use S3-Compatible Storage with spark

2015-07-20 Thread Schmirr Wurst
…you are using? Most of them
>> > provide
>> > an S3-like REST API endpoint for you to hit.
>> >
>> > Thanks
>> > Best Regards
>> >
>> > On Fri, Jul 17, 2015 at 2:06 PM, Schmirr Wurst
>> > wrote:
>> >>
>> >> Hi,

Re: use S3-Compatible Storage with spark

2015-07-21 Thread Schmirr Wurst
…l Das:
> You can add the jar in the classpath, and you can set the property like:
>
> sc.hadoopConfiguration.set("fs.s3a.endpoint","storage.sigmoid.com")
>
> Thanks
> Best Regards
>
> On Mon, Jul 20, 2015 at 9:41 PM, Schmirr Wurst
> wrote:
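The reply above suggests setting the S3A endpoint on the SparkContext's Hadoop configuration. A minimal sketch of that approach from the Spark shell; the endpoint, bucket name, and credential values here are placeholders, not values from the thread, and the hadoop-aws jar is assumed to already be on the classpath:

```scala
// Point the S3A connector at a non-Amazon, S3-compatible endpoint.
// "storage.example.com" and both keys are placeholder values.
sc.hadoopConfiguration.set("fs.s3a.endpoint", "storage.example.com")
sc.hadoopConfiguration.set("fs.s3a.access.key", "MY_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "MY_SECRET_KEY")

// Read from the custom store using the s3a:// scheme.
val lines = sc.textFile("s3a://my-bucket/data.txt")
println(lines.count())
```

Because these are per-context Hadoop properties, they must be set before the first s3a:// path is accessed in that SparkContext.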

Re: use S3-Compatible Storage with spark

2015-07-21 Thread Schmirr Wurst
…like an issue with hadoop.
>
> Thanks
> Best Regards
>
> On Tue, Jul 21, 2015 at 2:31 PM, Schmirr Wurst
> wrote:
>>
>> It seems to work for the credentials, but the endpoint is ignored:
>> I've changed it to
>> sc.hadoopConfiguration.set("fs.s

Re: use S3-Compatible Storage with spark

2015-07-22 Thread Schmirr Wurst
…n when you run your spark job add --jars path/to/thejar
>
> From: Schmirr Wurst
> Sent: Wednesday, July 22, 2015 12:06 PM
> To: Thomas Demoor
> Subject: Re: use S3-Compatible Storage with spark
>
> Hi Thomas, thanks, could you just t
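The advice above is to ship the connector jar with the job via `--jars`. A hedged sketch of what that launch can look like; the jar paths and version numbers are assumptions and must match the Hadoop version your Spark build uses:

```shell
# Launch the Spark shell with the S3A connector and a matching AWS SDK
# on the classpath. Paths and versions below are illustrative only.
spark-shell \
  --jars /path/to/hadoop-aws-2.7.1.jar,/path/to/aws-java-sdk-1.7.4.jar
```

Mismatched hadoop-aws and aws-java-sdk versions are a common source of classpath errors, so it is worth pinning both to the versions your Hadoop distribution ships.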

Re: use S3-Compatible Storage with spark

2015-07-27 Thread Schmirr Wurst
…","")
sc.hadoopConfiguration.set("fs.s3a.awsSecretAccessKey","")
Any idea why it doesn't work?
2015-07-20 18:11 GMT+02:00 Schmirr Wurst:
> Thanks, that is what I was looking for...
>
> Any idea where I have to store and reference the corresponding
> hadoop-aws-

Re: use S3-Compatible Storage with spark

2015-07-27 Thread Schmirr Wurst
…are able to access your AWS S3 with s3a now? What is the error that
> you are getting when you try to access the custom storage with
> fs.s3a.endpoint?
>
> Thanks
> Best Regards
>
> On Mon, Jul 27, 2015 at 2:44 PM, Schmirr Wurst
> wrote:
>>
>> I was able to acc

Re: use S3-Compatible Storage with spark

2015-07-28 Thread Schmirr Wurst
Hi, recompiled and retried; now it's looking like this with s3a:
com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
S3n is working fine (only problem is still the endpoint)
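The "Unable to load AWS credentials from any provider in the chain" error usually means S3A did not find keys under the property names it actually reads. Earlier in the thread the keys were set as fs.s3a.awsSecretAccessKey, but Hadoop's S3A connector reads fs.s3a.access.key and fs.s3a.secret.key (the aws-prefixed names belong to the older s3n connector). A sketch of the likely fix, with placeholder key values:

```scala
// S3A reads credentials from fs.s3a.access.key / fs.s3a.secret.key.
// Property names like fs.s3n.awsAccessKeyId apply only to s3n, so
// setting an aws-prefixed key under fs.s3a.* is silently ignored.
// Both values below are placeholders.
sc.hadoopConfiguration.set("fs.s3a.access.key", "MY_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "MY_SECRET_KEY")
```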

Re: use S3-Compatible Storage with spark

2015-07-28 Thread Schmirr Wurst
…efault: s3.amazonaws.com
>
> Thanks
> Best Regards
>
> On Tue, Jul 28, 2015 at 1:54 PM, Schmirr Wurst
> wrote:
>
>> Hi, recompiled and retried; now it's looking like this with s3a:
>> com.amazonaws.AmazonClientException: Unable to load AWS credentials
>> from any provider in the chain
>>
>> S3n is working fine (only problem is still the endpoint)