Got it working finally; pasting it here so that it will be useful for others:

  val conf = new SparkConf()
    .setJars(jarList)
  conf.setExecutorEnv("ORACLE_HOME", myOraHome)
  conf.setExecutorEnv("SPARK_JAVA_OPTS", "-Djava.library.path=/my/custom/path")
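For anyone who wants a fuller picture, below is a minimal self-contained sketch of the same approach. The jar path, the ORACLE_HOME value, and the app name are placeholders I made up for illustration; only the setExecutorEnv calls come from the snippet above.

  import org.apache.spark.{SparkConf, SparkContext}

  object EnvVarJob {
    def main(args: Array[String]): Unit = {
      // Placeholder values -- substitute your own jar list and Oracle home.
      val jarList = Seq("/path/to/my-app.jar")
      val myOraHome = "/opt/oracle/product/12.1.0"

      val conf = new SparkConf()
        .setAppName("EnvVarExample")          // assumed app name
        .setJars(jarList)
        // setExecutorEnv propagates the variable to each executor process.
        .setExecutorEnv("ORACLE_HOME", myOraHome)
        .setExecutorEnv("SPARK_JAVA_OPTS", "-Djava.library.path=/my/custom/path")

      val sc = new SparkContext(conf)
      // ... run the job ...
      sc.stop()
    }
  }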