I have tested the third method successfully before; note that you need to run %dep before running %spark (a quick sketch of the ordering is below).
Also, my test of the second method was not successful.
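Roughly, the ordering I mean looks like the two paragraphs below; the jar path and class name are just placeholders, not from an actual project, and if the Spark interpreter is already running you have to restart it first so that %dep really runs before it starts.

%dep
z.reset()
// load the dependency before the Spark interpreter has started
z.load("file:///path/to/your.jar")

%spark
// run this only after the %dep paragraph above has finished
import com.example.YourClass   // placeholder class name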




Hello
My Spark-based map tasks need to access third-party jar files. I found the options below for submitting third-party jar files to the Spark interpreter (rough example lines for the first two are sketched after the list):
1. export SPARK_SUBMIT_OPTIONS=<all the jar files, comma separated> in conf/zeppelin-env.sh
2. include the statement spark.jars <all the jar files, comma separated> in <spark>/conf/spark-defaults.conf
3. use z.load("<the location of the jar file in the local filesystem>") in the Zeppelin notebook
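For reference, the first two options end up looking roughly like the lines below; the jar path is my local mapreduce.jar, so adjust it for your environment, and SPARK_SUBMIT_OPTIONS takes spark-submit options, hence the --jars form.

# in conf/zeppelin-env.sh
export SPARK_SUBMIT_OPTIONS="--jars /home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar"

# in <spark>/conf/spark-defaults.conf
spark.jars /home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar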
I tested the first two and they both work fine. The third one does not work. Here is the snippet I use:

%dep
z.reset()
z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")

Further, importing a class that belongs to the above jar file works when I use the statement import com..... in the Zeppelin notebook. However, I get a ClassNotFoundException in the executor for the same class.
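One thing I may still try, in case it helps narrow this down, is shipping the jar to the executors explicitly from a %spark paragraph via sc.addJar (same local path as above):

%spark
// make the jar available to tasks running on the executors
sc.addJar("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")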
Any clue here would help greatly

regards
Bala

