Hello, I've started using Spark 1.6.1; before that I used Spark 1.5. In 1.5 I set export SPARK_CLASSPATH="/SQLDrivers/sqljdbc_4.2/enu/sqljdbc41.jar" when launching pysparkling and it worked well. In 1.6.1, however, I get a message that SPARK_CLASSPATH is deprecated and that I should use spark.driver.extraClassPath instead. So spark-defaults.conf now contains the line

    spark.driver.extraClassPath /SQLDrivers/sqljdbc_4.2/enu/sqljdbc41.jar

but Spark still reports that there is no suitable driver for SQL Server.
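For context, here is a minimal sketch of the kind of JDBC read involved (the host, database, table, and credentials are placeholders; the driver class is the one shipped in sqljdbc41.jar, and `sc` is the SparkContext created by the pyspark shell):

    # Spark 1.6: build a SQLContext on the shell's existing SparkContext
    from pyspark.sql import SQLContext
    sqlContext = SQLContext(sc)

    # Placeholder connection details; the driver class is explicitly named
    # so DriverManager does not have to discover it from the jar
    df = sqlContext.read.format("jdbc").options(
        url="jdbc:sqlserver://myhost:1433;databaseName=mydb",
        dbtable="dbo.mytable",
        user="myuser",
        password="mypassword",
        driver="com.microsoft.sqlserver.jdbc.SQLServerDriver",
    ).load()
    df.show(5)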