Have you found any solution to this? I'm having a similar issue where my
db2jcc.jar license file is not being found. I was hoping the addJar()
method would work, but it does not seem to help.
I can't even seem to get the addJar syntax right.
Can I just call it inline, like this?:
sqlContext.
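For what it's worth, addJar is a method on SparkContext, not on SQLContext, so an inline call would look roughly like the sketch below (the jar path is hypothetical). Note that, at least in Spark 1.x, sc.addJar ships the jar to the executors but does not put it on the driver's classpath, which is exactly why JDBC driver classes can still fail to load on the driver:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("AddJarExample"))
val sqlContext = new SQLContext(sc)

// Ships the jar to executors for task execution; the driver's own
// classpath is NOT updated, so a JDBC driver loaded via Class.forName
// on the driver may still be missing. Path below is illustrative.
sc.addJar("/path/to/db2jcc.jar")
```

For driver-side classes such as JDBC drivers, passing the jar at launch time (e.g. spark-submit's --jars together with --driver-class-path) is usually more reliable than calling addJar after the context exists.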
You can either add the jar to spark-submit, as pointed out, like:

${SPARK_HOME}/bin/spark-submit \
  --packages com.databricks:spark-csv_2.11:1.3.0 \
  --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar \
OR
create/package a fat jar file that will bundle all the required dependencies.
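The fat-jar option can be sketched with the sbt-assembly plugin; a minimal build.sbt might look like this (all names, versions, and the driver coordinate below are illustrative, and spark-core/spark-sql are marked "provided" so the cluster's own Spark jars are used at runtime):

```scala
// build.sbt (illustrative versions; requires the sbt-assembly plugin
// declared in project/plugins.sbt)
name := "my-spark-app"
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.1" % "provided",
  // Bundle the JDBC driver into the assembly so the same classes are
  // visible on both the driver and the executors. Hypothetical coordinate:
  "com.example" % "some-jdbc-driver" % "1.0"
)
```

Running `sbt assembly` then produces a single jar you can hand directly to spark-submit, avoiding the --jars bookkeeping entirely.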
Hi all,
I'm a bit stuck with a problem that I thought was solved in SPARK-6913, but
I can't seem to get it to work.
I'm programmatically adding a jar (sc.addJar(pathToJar)) after the
SparkContext is created, then using the driver from the jar to load a table
through sqlContext.read.jdbc(). When I tr