Hi all, I'm a bit stuck on a problem that I thought was solved in SPARK-6913, but I can't seem to get it to work.
I'm programmatically adding a jar (sc.addJar(pathToJar)) after the SparkContext is created, then using the JDBC driver from that jar to load a table through sqlContext.read.jdbc(). When I try this I get java.lang.ClassNotFoundException: com.microsoft.sqlserver.jdbc.SQLServerDriver (in my case I'm connecting to MS SQL Server). I've tried this both in the shell and in an application submitted through spark-submit.

When I add the jar with --jars it works fine, but my application is meant to be a long-running app that should not require the jar to be present at application start. Running Class.forName(driver).newInstance does not work on the driver, but it does work inside a map function of an RDD, so the jar appears to be added only to the executors. Shouldn't that be enough for sqlContext.read.jdbc() to work?

I'm using Spark 1.6.1.

Thanks
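P.S. For reference, a minimal sketch of what I'm doing in the shell. The jar path, connection URL, credentials, and table name below are placeholders, not my real values:

    // Spark 1.6.1 spark-shell, so sc and sqlContext already exist
    val pathToJar = "/path/to/sqljdbc4.jar"   // placeholder path to the driver jar
    sc.addJar(pathToJar)                      // added after the SparkContext is up

    val url = "jdbc:sqlserver://somehost:1433;databaseName=somedb"  // placeholder
    val props = new java.util.Properties()
    props.setProperty("user", "someuser")          // placeholder
    props.setProperty("password", "somepassword")  // placeholder
    props.setProperty("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

    // This is where the ClassNotFoundException for the driver class is thrown:
    val df = sqlContext.read.jdbc(url, "someTable", props)

    // Loading the driver class directly on the driver also fails:
    // Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver").newInstance()

    // ...but inside a task it works, so the executors do see the jar:
    sc.parallelize(1 to 1).map { _ =>
      Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver").newInstance()
      "loaded on executor"
    }.collect()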