Using

- zeppelin-0.6-SNAPSHOT, built with
[mvn clean package -Pspark-1.5 -Phadoop-2.6 -Pyarn -Ppyspark -DskipTests]

- Spark 1.5.2

1. Everything in Zeppelin worked except pyspark.
When I tried to run something as simple as the following:
%pyspark
from os import getcwd

I got this error:
pyspark interpreter not found

2. After some googling, I found that the two environment variables below need to be set, so I did that:
SPARK_HOME="/home/incubator/myspark"
PYTHONPATH="/home/incubator/myspark/python:/home/incubator/myspark/python/lib/py4j-0.8.2.1-src.zip"
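
Concretely, the exports look like this (a sketch of what I added; I assumed conf/zeppelin-env.sh is the right place for Zeppelin to pick them up, and the paths are just my local layout):

  # in conf/zeppelin-env.sh
  export SPARK_HOME="/home/incubator/myspark"
  export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip"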

After this, even the command below wouldn't work:
sc.version

giving this error:
java.lang.ClassCastException: scala.None$ cannot be cast to java.util.Map

The pyspark problem is also still there.

Could anyone share thoughts on what's the proper way to make pyspark work
after a fresh build of Zeppelin and Spark?

Regards,
Hmad
