How do I use it? I'm accessing s3a from Spark's textFile API.
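For reference, a minimal sketch of that read path from spark-shell (where sc is predefined); the bucket and prefix are placeholders:

    // Read a text dataset through the s3a connector; the s3a:// scheme
    // selects the S3AFileSystem from hadoop-aws.
    val lines = sc.textFile("s3a://my-bucket/path/to/input")
    println(s"line count: ${lines.count()}")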
On Tue, May 31, 2016 at 7:16 AM, Deepak Sharma wrote:
> Hi Mayuresh
> Instead of s3a, have you tried the https:// URI for the same S3 bucket?
>
> HTH
> Deepak
>
> On Tue, May 31, 2016, Gourav Sengupta wrote:
>
>> Hi,
>>
>> Is your Spark cluster running on EMR, on a self-created cluster on EC2,
>> or on a local cluster behind a firewall? Which Spark version are you
>> using?
>>
>> Regards,
>> Gourav Sengupta
On Tue, May 31, 2016 at 5:29 AM, Steve Loughran wrote:
> which s3 endpoint?
>
>
I have tried both s3.amazonaws.com and s3-external-1.amazonaws.com.
>
>
> On 29 May 2016, at 22:55, Mayuresh Kunjir wrote:
>
> I'm running into permission issues while accessing data in an S3 bucket
> stored using the s3a file system from a local Spark cluster.

Trying the fix suggested by Teng Qiu above, will update soon.
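For reference, both endpoints mentioned above are selected the same way; a minimal sketch from spark-shell, assuming the hadoop-aws 2.7.x property name fs.s3a.endpoint:

    // Point the s3a connector at a specific S3 endpoint
    // (here one of the endpoints tried above).
    sc.hadoopConfiguration.set("fs.s3a.endpoint", "s3-external-1.amazonaws.com")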
> On Sun, May 29, 2016 at 2:55 PM, Mayuresh Kunjir wrote:
>
>> I'm running into permission issues while accessing data in an S3 bucket
>> stored using the s3a file system from a local Spark cluster. Has anyone
>> found success with this?
I'm running into permission issues while accessing data in an S3 bucket
stored using the s3a file system from a local Spark cluster. Has anyone
found success with this?
My setup is:
- Spark 1.6.1 compiled against Hadoop 2.7.2
- aws-java-sdk-1.7.4.jar and hadoop-aws-2.7.2.jar in the classpath
- Spark's Ha
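For reference, a minimal sketch of how credentials are commonly wired for such a setup, assuming the two jars are passed to spark-submit via --jars; the key values are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.hadoop.* keys are copied by Spark into the Hadoop
    // configuration, where the s3a connector from hadoop-aws reads them.
    val conf = new SparkConf()
      .setAppName("S3ATest")
      .set("spark.hadoop.fs.s3a.access.key", "<ACCESS_KEY>") // placeholder
      .set("spark.hadoop.fs.s3a.secret.key", "<SECRET_KEY>") // placeholder
    val sc = new SparkContext(conf)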
spark.unsafe.offHeap         true
spark.memory.offHeap.size    1024M
I am not aware of how the config manager in Spark works. But I believe
there is an easy fix for this. Could you suggest a change?
~Mayuresh
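For context, a minimal sketch of the 1.6-style equivalents, assuming the intent is off-heap execution memory; in Spark 1.6 spark.unsafe.offHeap was superseded by spark.memory.offHeap.enabled, and the size must be positive when off-heap is enabled:

    import org.apache.spark.{SparkConf, SparkContext}

    // Spark 1.6.x property names; spark.unsafe.offHeap is the pre-1.6 name.
    val conf = new SparkConf()
      .setAppName("OffHeapTest")
      .set("spark.memory.offHeap.enabled", "true")
      .set("spark.memory.offHeap.size", "1024m")
    val sc = new SparkContext(conf)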
On Mon, Dec 21, 2015 at 1:46 PM, Mayuresh Kunjir wrote:
> Thanks Ted. That stack trace is from the 1.5.1 build.
> Thanks
>
> On Thu, Dec 17, 2015 at 5:04 PM, Mayuresh Kunjir wrote:
>
>> I am testing a simple Sort program written using DataFrame APIs. When I
>> enable spark.unsafe.offHeap, the output stage fails with an NPE. The
>> exception when run on spark-1.5.1 is copied below.
Any intuition on this?
~Mayuresh
On Thu, Dec 17, 2015 at 8:04 PM, Mayuresh Kunjir wrote:
> I am testing a simple Sort program written using DataFrame APIs. When I
> enable spark.unsafe.offHeap, the output stage fails with an NPE. The
> exception when run on spark-1.5.1 is copied below.
I am testing a simple Sort program written using DataFrame APIs. When I
enable spark.unsafe.offHeap, the output stage fails with an NPE. The
exception when run on spark-1.5.1 is copied below.
Job aborted due to stage failure: Task 23 in stage 3.0 failed 4 times, most
recent failure: Lost task 23.
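For anyone trying to reproduce this, a minimal sketch of such a sort job, assuming Spark 1.5.x and a hypothetical JSON input with a "key" column:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Off-heap flag as named in Spark 1.5.x; input path and column name
    // are hypothetical.
    val conf = new SparkConf()
      .setAppName("SortNPERepro")
      .set("spark.unsafe.offHeap", "true")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Sort by a column and write the result back out.
    val df = sqlContext.read.json("hdfs:///tmp/records.json")
    df.sort("key").write.json("hdfs:///tmp/records-sorted")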