...@cloudera.com]
Sent: Thursday, February 12, 2015 12:13 AM
To: Akhil Das
Cc: Michael Nazario; user@spark.apache.org
Subject: Re: PySpark 1.2 Hadoop version mismatch
No, "mr1" should not be the issue here, and I think that would break
other things. The OP is not using mr1.
client 4 / server 7 means a Hadoop IPC version mismatch: the client is using
older Hadoop libraries than the cluster it is talking to.
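
A quick way to see which Hadoop version the driver-side Spark build bundles is
to ask the JVM from PySpark. The sketch below is only a diagnostic aid, not a
supported API: sc._jvm is PySpark's internal py4j gateway, and the application
name is a placeholder.

    # Diagnostic sketch (assumes a local PySpark install on the driver).
    from pyspark import SparkContext

    sc = SparkContext(appName="hadoop-version-check")  # placeholder app name

    # org.apache.hadoop.util.VersionInfo is a standard Hadoop utility class;
    # getVersion() reports the Hadoop version on the driver's classpath.
    print(sc._jvm.org.apache.hadoop.util.VersionInfo.getVersion())

    sc.stop()

If the version printed here differs from what the cluster runs, that is
consistent with the client/server IPC mismatch described above.
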
>> ... a spark 1.2 master and worker. It's not a good workaround,
>> so I would like to have the driver also be spark 1.2
>>
>> Michael
________________________________________
From: Michael Nazario
Sent: Wednesday, February 11, 2015 10:13 PM
To: user@spark.apache.org
Subject: PySpark 1.2 Hadoop version mismatch
Hi Spark users,
I seem to be having this consistent error which I have been trying to reproduce
and narrow down the problem. I've been running a PySpark application on Spark
1.2 reading avro files from Hadoop. I was consistently seeing the following
error:
py4j.protocol.Py4JJavaError: An error
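
For readers hitting the same error, the Avro-reading path described above
typically looks roughly like the sketch below in PySpark 1.2, modeled on the
avro_inputformat.py example that ships with Spark. The HDFS path and app name
are placeholders, and the key converter class assumes the Spark examples jar
is on the classpath.

    # Rough sketch of reading Avro files with PySpark's newAPIHadoopFile.
    from pyspark import SparkContext

    sc = SparkContext(appName="avro-read-sketch")  # placeholder app name

    avro_rdd = sc.newAPIHadoopFile(
        "hdfs:///path/to/data.avro",  # placeholder path
        "org.apache.avro.mapreduce.AvroKeyInputFormat",
        "org.apache.avro.mapred.AvroKey",
        "org.apache.hadoop.io.NullWritable",
        keyConverter="org.apache.spark.examples.pythonconverters."
                     "AvroWrapperToJavaConverter")

    # Each record comes back as (datum, None); print a few for inspection.
    print(avro_rdd.map(lambda kv: kv[0]).take(3))

    sc.stop()

If this read path only fails when the driver's Hadoop client jars differ from
the cluster's version, that points back to the mismatch discussed in the reply
rather than to the Avro code itself.
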