Or you can add the jar at runtime on the SparkContext:

sc.addJar("/path/to/your/datastax.jar")
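Putting the two suggestions together, here is a minimal sketch; the jar path, app name, and Cassandra host are placeholders to replace with your own values:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Point the Cassandra connector at the cluster and ship the connector jar
// to the executors via setJars (placeholder path).
val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "127.0.0.1")
  .setAppName("jobserver test demo")
  .setMaster("local[4]")
  .setJars(Seq("/path/to/your/datastax.jar"))

val sc = new SparkContext(conf)

// Alternatively, add the jar after the context has been created:
sc.addJar("/path/to/your/datastax.jar")
```

Either approach distributes the jar to the executors; setJars does it at context creation, while addJar can be called later on an existing context.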


Thanks
Best Regards

On Tue, Jan 6, 2015 at 5:53 PM, bchazalet <bchaza...@companywatch.net>
wrote:

> I don't know much about spark-jobserver, but you can set jars
> programmatically using the method setJars on SparkConf. Looking at your
> code, it seems that you're importing classes from
> com.datastax.spark.connector._ to load data from Cassandra, so you may
> need to add that datastax jar to your SparkConf:
>
> val conf = new SparkConf(true)
>   .set("spark.cassandra.connection.host", "127.0.0.1")
>   .setAppName("jobserver test demo")
>   .setMaster("local[4]")
>   .setJars(Seq("path/to/your/datastax/jar"))
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Set-EXTRA-JAR-environment-variable-for-spark-jobserver-tp20989p20990.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
