Hi, resolved. Root cause:
I had recompiled Zeppelin for Scala 2.11 and used Spark 2.0 compiled for Scala
2.11, but the external artifacts were compiled for Scala 2.10.
Once I provided external artifacts compiled for the correct Scala version, Zeppelin started to work.
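For context, a minimal sketch of the mismatch (the coordinates below are hypothetical, for illustration only): the Scala binary version is part of a Scala artifact's name, so it has to match the Scala version Spark and Zeppelin were built with.

```shell
# Hypothetical artifact coordinates: a Spark 2.0 / Scala 2.11 build needs
# _2.11 artifacts; _2.10 artifacts trigger the binary-incompatibility problem.
scala_version="2.11"
artifact="com.example:some-lib_${scala_version}:1.0.0"  # _2.11, not _2.10
# One way to pass it to the Spark interpreter (see approaches below):
export SPARK_SUBMIT_OPTIONS="--packages ${artifact}"
echo "$SPARK_SUBMIT_OPTIONS"
```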
2017-06-26 22:49 GMT+02:00 Serega Sheypak:
I tried all approaches mentioned here:
https://zeppelin.apache.org/docs/latest/interpreter/spark.html#2-loading-spark-properties
1. conf
2. SPARK_SUBMIT_OPTIONS
3. add as artifacts using the interpreter config
4. add using spark.dep
All lead to an NPE.
What can I try next?
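For reference, a minimal sketch of what approaches 1 and 2 above amount to, assuming a local jar at /tmp/some-local-jar.jar (hypothetical path):

```shell
# (1) As a Spark property in the interpreter settings (or conf/spark-defaults):
#       spark.jars  /tmp/some-local-jar.jar
# (2) Via SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh:
export SPARK_SUBMIT_OPTIONS="--jars /tmp/some-local-jar.jar"
echo "$SPARK_SUBMIT_OPTIONS"
```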
2017-06-26 22:37 GMT+02:00 S
OK, it seems like something goes wrong when you try to use deps. I was able to
run a simple Spark job without third-party dependencies.
Zeppelin always throws an NPE when you try to use local files via %spark.dep
or the Spark interpreter conf (there is an option to set a local file).
Did anyone make it work?
2017-06-26 21:
Hi, I'm getting a strange NPE without any obvious reason.
My notebook contains two paragraphs:
%spark.dep z.load("some-local-jar.jar")
res0: org.apache.zeppelin.dep.Dependency = org.apache.zeppelin.dep.Dependency@6ce5acd
and
import com.SuperClass
// bla-bla
val features = sc.sequenceFile[NullWritab