Hi,
There was another related question:
https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201506.mbox/%3CCAJ2peNeruM2Y2Tbf8-Wiras-weE586LM_o25FsN=+z1-bfw...@mail.gmail.com%3E
Some months ago, if I remember correctly, we had the same problem using Spark 1.3 + YARN + Java 8.
https://issues.
Unfortunately, no. I switched back to OpenJDK 1.7.
Didn't get a chance to dig deeper.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-YARN-using-Java-1-8-fails-tp24925p25360.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Did you get any resolution for this?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-YARN-using-Java-1-8-fails-tp24925p25039.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
YARN 2.7.1 (running on the cluster) was built with Java 1.8, I assume.
Have you used the following command to retrieve / inspect the logs?
yarn logs -applicationId
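As a rough sketch (the application ID below is made up; substitute the one printed by spark-submit or shown in the ResourceManager UI), something like this should dump the aggregated container logs once the application has finished, assuming log aggregation is enabled on the cluster:

  # illustrative application ID only; replace with your own
  yarn logs -applicationId application_1444060815081_0001 > app_logs.txt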
Cheers
On Mon, Oct 5, 2015 at 8:41 AM, mvle wrote:
> Hi,
>
> I have successfully run pyspark on Spark 1.5.1 on YARN 2.7.1 with Java
>