You can either add the jar to spark-submit, as pointed out, like:

${SPARK_HOME}/bin/spark-submit \
  --packages com.databricks:spark-csv_2.11:1.3.0 \
  --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar \
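With the driver jar on the classpath that way, the JDBC read itself would look something like the sketch below. The server, database and table names are placeholders; note the "driver" property only helps if the class is actually visible to the driver JVM:

  import java.util.Properties

  // Hypothetical connection details - replace with your own server/database/table
  val url = "jdbc:sqlserver://dbserver:1433;databaseName=mydb"
  val props = new Properties()
  props.put("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

  // sc.addJar() only ships the jar to the executors; the driver class must
  // also be on the driver's own classpath (--jars or a fat jar), because
  // read.jdbc() resolves the schema on the driver side first
  val df = sqlContext.read.jdbc(url, "dbo.mytable", props)
  df.show()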
OR create/package a fat jar that includes the JDBC driver for MSSQL in the jar file itself (a build sketch is at the end of this message):

${SPARK_HOME}/bin/spark-submit \
  --class "${FILE_NAME}" \
  --master spark://50.140.197.217:7077 \
  --executor-memory=12G \
  --executor-cores=2 \
  --num-executors=2 \
  --files ${SPARK_HOME}/conf/log4j.properties \
  ${JAR_FILE}

HTH

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

On 18 April 2016 at 17:29, chaz2505 <chaz2...@hotmail.com> wrote:

> Hi all,
>
> I'm a bit stuck with a problem that I thought was solved in SPARK-6913, but
> I can't seem to get it to work.
>
> I'm programmatically adding a jar (sc.addJar(pathToJar)) after the
> SparkContext is created, then using the driver from the jar to load a table
> through sqlContext.read.jdbc(). When I try this I get
> java.lang.ClassNotFoundException:
> com.microsoft.sqlserver.jdbc.SQLServerDriver (in my case I'm connecting to
> a MS SQL Server).
>
> I've tried this in both the shell and in an application submitted through
> spark-submit. When I add the jar with --jars it works fine, but my
> application is meant to be a long-running app that should not require the
> jar to be added on application start.
>
> Running Class.forName(driver).newInstance() does not work on the driver, but
> in a map function of an RDD it does work, so the jar is being added only to
> the executors - shouldn't this be enough for sqlContext.read.jdbc() to work?
>
> I'm using Spark 1.6.1.
>
> Thanks
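For the fat jar route, here is a minimal sbt-assembly sketch. The plugin version and the MSSQL driver coordinates below are illustrative assumptions - older sqljdbc4 jars were not published to Maven Central and may need to be installed into a local repository by hand:

  // project/plugins.sbt
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

  // build.sbt
  name := "my-spark-app"
  scalaVersion := "2.10.6"

  libraryDependencies ++= Seq(
    // Spark itself is provided by the cluster, so keep it out of the fat jar
    "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-sql"  % "1.6.1" % "provided",
    // the JDBC driver goes into the assembly, so spark-submit needs no --jars
    "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre7"
  )

Running sbt assembly then produces the ${JAR_FILE} to pass to spark-submit as shown above.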