We were able to resolve *SparkException: Job aborted due to stage failure: All
masters are unresponsive! Giving up* as well. Spark-jobserver is working fine
now, and we need to experiment more.
Thank you guys.
Boris,
Yes, as you mentioned, we are creating a new SparkContext for our job. The
reason is to define the Apache Cassandra connection using SparkConf. We hope
this should also work.
To upload the JAR, we followed these steps (sketched more concretely below):
(1) Package the JAR using the *sbt package* command
(2) Use *curl --data-binary
@target/s
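Concretely, the upload and submission commands look roughly like this (the jar
path, app name, and job class name below are placeholders, not our actual ones):

sbt package
# Upload the packaged jar to spark-jobserver (default port 8090)
curl --data-binary @target/scala-2.10/my-spark-job_2.10-0.1.0.jar \
  localhost:8090/jars/myapp
# Run a job class from the uploaded jar
curl -d "" 'localhost:8090/jobs?appName=myapp&classPath=io.radtech.spark.jobserver.CassandraReadJob'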
Thank you Pankaj. We are able to create the Uber JAR (very good for binding
all dependency JARs together) and run it on spark-jobserver. One step further
than where we were.
However, we are now facing *SparkException: Job aborted due to stage failure:
All masters are unresponsive! Giving up*. We may need to rai
*@Sasi*
You should be able to create a job something like this:
package io.radtech.spark.jobserver
import java.util.UUID
import org.apache.spark.{ SparkConf, SparkContext }
import org.apache.spark.rdd.RDD
import org.joda.time.DateTime
import com.datastax.spark.connector.types.TypeConverter
import com.datastax.spark.connector._
import com.typesafe.config.Config
import spark.jobserver.{ SparkJob, SparkJobValid, SparkJobValidation }
// Hypothetical completion (keyspace/table names assumed): count rows of a Cassandra table.
object CassandraReadJob extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
  override def runJob(sc: SparkContext, config: Config): Any =
    sc.cassandraTable("test_keyspace", "test_table").count()
}
It does not look like you're supposed to fiddle with the SparkConf or even the
SparkContext in a 'job' (again, I don't know much about jobserver), as you're
given a SparkContext as a parameter in the build method.
I guess jobserver initialises the SparkConf and SparkContext itself when it
first starts.
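So a job would presumably do all of its work against that provided context,
something like this (a sketch; the object name and the RDD work are made up):

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{ SparkJob, SparkJobValid, SparkJobValidation }

// Use the SparkContext handed in by the server; no `new SparkContext` here.
object UseProvidedContextJob extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
  override def runJob(sc: SparkContext, config: Config): Any =
    sc.parallelize(1 to 100).count()
}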
Boris,
Thank you for your suggestion. I used the following code and am still facing
the same issue:
val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "127.0.0.1")
  .setAppName("jobserver test demo")
  .set
Or you can use:
sc.addJar("/path/to/your/datastax.jar")
Thanks
Best Regards
On Tue, Jan 6, 2015 at 5:53 PM, bchazalet wrote:
> I don't know much about spark-jobserver, but you can set jars
> programmatically using the method setJars on SparkConf. Looking at your code
> it seems that you're i
I suggest creating an uber jar instead.
Check my thread about the same problem:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration-with-akka-http-akka-stream-td20926.html
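The key part, if you use sbt, is the sbt-assembly plugin plus marking Spark
itself as "provided" so it stays out of the fat jar. Roughly (the versions and
artifact names below are placeholders, not necessarily what you need):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt
libraryDependencies ++= Seq(
  // Spark is already on the cluster, so keep it out of the uber jar
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
  // The Cassandra connector must be bundled so the workers can see it
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0"
)

Then *sbt assembly* produces a single jar with all dependencies under target/.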
Regards
-Pankaj
LinkedIn
https://www.linkedin.com/profile/view?id=171566646
I don't know much about spark-jobserver, but you can set jars programmatically
using the method setJars on SparkConf. Looking at your code it seems that
you're importing classes from com.datastax.spark.connector._ to load data
from cassandra, so you may need to add that datastax jar to your SparkConf.
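For example, something along these lines (the jar path is a placeholder):

import org.apache.spark.{ SparkConf, SparkContext }

val conf = new SparkConf(true)
  .setAppName("jobserver test demo")
  .set("spark.cassandra.connection.host", "127.0.0.1")
  // setJars ships these jars to the executors; the path is a placeholder
  .setJars(Seq("/path/to/spark-cassandra-connector_2.10-1.1.0.jar"))
val sc = new SparkContext(conf)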
*NoClassDefFoundError",
"message": "Failed to create named RDD 'production'"* error.
As far as we know, the above problem must be related to classpath JARs at
runtime. In *https://github.com/spark-jobserver/spark-jobserver*, it has been
mentioned