Github user H4ml3t commented on the pull request:
https://github.com/apache/incubator-zeppelin/pull/605
Hi all,
I can confirm: using release 0.5.6 with an external Spark 1.5.1 (CDH 5.5), I had to
add the following to zeppelin-env.sh:
```
ZEPPELIN_INTP_JAVA_OPTS="-Dspark.yarn.isPython=true"
```
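In zeppelin-env.sh this setting is usually written as an export; a minimal sketch of the full line, assuming the standard conf/ location (the path and comments are assumptions, not from the original comment):

```shell
# conf/zeppelin-env.sh (assumed path)
# ZEPPELIN_INTP_JAVA_OPTS is passed to the interpreter process JVM;
# spark.yarn.isPython tells Spark-on-YARN to ship its Python support files.
export ZEPPELIN_INTP_JAVA_OPTS="-Dspark.yarn.isPython=true"
```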
Github user H4ml3t commented on the issue:
https://github.com/apache/zeppelin/pull/868
@bbuild11 do you have an update and can you share your "-conf" file?
Thank you in advance,
Luca
Github user H4ml3t commented on the issue:
https://github.com/apache/zeppelin/pull/1347
Actually I have this with SPARK_HOME set. Compiled with:
```bash
mvn clean package -Pspark-2.0 -Dhadoop.version=2.6.0-cdh5.7.0 -Phadoop-2.6 \
  -Pvendor-repo -DskipTests -Pyarn -Ppyspark -Pr
```
Github user H4ml3t commented on the issue:
https://github.com/apache/zeppelin/pull/1347
Hi @zjffdu, no, I'm still using the release source for ``Affects Version/s:
0.6.1``, compiling it with Scala 2.10. You wrote in the issue description that it
"``happens when SPARK_HOME is not defined``".
Github user H4ml3t commented on the issue:
https://github.com/apache/zeppelin/pull/1347
Sorry, I was running out of time too, so I decided to downgrade both Spark and
Zeppelin to versions I was sure would work together. I suggest doing the same.
Cheers,
Luca