Re: running Spark-JobServer in eclipse

2018-03-04 Thread Jörn Franke
I recommend running it with your unit tests, executed by your build tool. There is no need to have it running in the background in the IDE. > On 3. Mar 2018, at 17:57, sujeet jog wrote: > > Is there a way to run Spark-JobServer in eclipse ?.. any pointers in this

running Spark-JobServer in eclipse

2018-03-03 Thread sujeet jog
Is there a way to run Spark-JobServer in Eclipse? Any pointers in this regard?

Re: spark jobserver

2017-03-05 Thread Noorul Islam K M
A better forum would be https://groups.google.com/forum/#!forum/spark-jobserver or https://gitter.im/spark-jobserver/spark-jobserver Regards, Noorul Madabhattula Rajesh Kumar writes: > Hi, > > I am getting below an exception when I start the job-server > > ./server_start.sh

spark jobserver

2017-03-05 Thread Madabhattula Rajesh Kumar
Hi, I am getting the below error when I start the job-server: ./server_start.sh: line 41: kill: (11482) - No such process. Please let me know how to resolve this error. Regards, Rajesh

Spark 2.0.2 with Spark JobServer

2016-12-07 Thread Jose Carlos Guevara Turruelles
Hi, I'm working with the latest version of Spark JobServer together with Spark 2.0.2. I'm able to do almost everything I need, but there is one noisy thing. I have placed a hive-site.xml to specify a connection to my MySQL db so I can have the metastore_db on MySQL, that's w

problem deploying spark-jobserver on CentOS

2016-11-16 Thread Reza zade
Hi, I'm going to deploy jobserver on my CentOS (Spark is installed with CDH 5.7). I'm using Oracle JDK 1.8, sbt-0.13.13, spark-1.6.0 and jobserver-0.6.2. When I run the sbt command (after running sbt publish-local) I encountered the below message: [cloudera@quickstart spark-jobserver]$

Re: installing spark-jobserver on cdh 5.7 and yarn

2016-11-09 Thread Noorul Islam K M
Reza zade writes: > Hi > > I have set up a cloudera cluster and work with spark. I want to install > spark-jobserver on it. What should I do? Maybe you should send this to spark-jobserver mailing list. https://github.com/spark-jobserver/spark-jobserver#contact Thanks and Re

installing spark-jobserver on cdh 5.7 and yarn

2016-11-09 Thread Reza zade
Hi I have set up a cloudera cluster and work with spark. I want to install spark-jobserver on it. What should I do?

Re: Sharing HiveContext in Spark JobServer / getOrCreate

2016-01-25 Thread Deenar Toraskar
On 25 January 2016 at 21:09, Deenar Toraskar < deenar.toras...@thinkreactive.co.uk> wrote: > No I hadn't. This is useful, but in some cases we do want to share the > same temporary tables between jobs so really wanted a getOrCreate > equivalent on HiveContext. > > Deenar > > > > On 25 January 2016

Re: Sharing HiveContext in Spark JobServer / getOrCreate

2016-01-25 Thread Ted Yu
Have you noticed the following method of HiveContext? * Returns a new HiveContext as new session, which will have separated SQLConf, UDF/UDAF, * temporary tables and SessionState, but sharing the same CacheManager, IsolatedClientLoader * and Hive client (both of execution and metadata) w

Sharing HiveContext in Spark JobServer / getOrCreate

2016-01-25 Thread Deenar Toraskar
Hi I am using a shared sparkContext for all of my Spark jobs. Some of the jobs use HiveContext, but there isn't a getOrCreate method on HiveContext which will allow reuse of an existing HiveContext. Such a method exists on SQLContext only (def getOrCreate(sparkContext: SparkContext): SQLContext).

How can I use dynamic resource allocation option in spark-jobserver?

2015-10-13 Thread JUNG YOUSUN
Hi all, I have some questions about spark-jobserver. I deployed spark-jobserver in yarn-client mode using Docker. I'd like to use the dynamic resource allocation option for YARN in spark-jobserver. How can I add this option? And when will the 1.5.x version be supported? (https://hub.docker.com/r
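Dynamic allocation itself is switched on through standard Spark properties; whether a given spark-jobserver release forwards them to its contexts depends on the version, so treat this as a sketch of the Spark side only (the values are example placeholders):

```
# Spark properties for the context config -- values are examples
spark.dynamicAllocation.enabled=true
spark.shuffle.service.enabled=true      # required on YARN for dynamic allocation
spark.dynamicAllocation.minExecutors=1
spark.dynamicAllocation.maxExecutors=10
```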

Re: Cassandra Connection Issue with Spark-jobserver

2015-04-27 Thread Anand
I was able to fix the issues by providing the right versions of the cassandra-all and thrift libraries -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Cassandra-Connection-Issue-with-Spark-jobserver-tp22587p22664.html Sent from the Apache Spark User List mailing

Re: Cassandra Connection Issue with Spark-jobserver

2015-04-27 Thread Noorul Islam K M
productId, saleCount) => { > val outColFamKey = Map("prod_id" -> ByteBufferUtil.bytes(productId)) > val outKey: java.util.Map[String, ByteBuffer] = outColFamKey > var outColFamVal = new ListBuffer[ByteBuffer] > outColFamVal += ByteBufferUt

Cassandra Connection Issue with Spark-jobserver

2015-04-21 Thread Anand
s(saleCount) val outVal: java.util.List[ByteBuffer] = outColFamVal (outKey, outVal) } } casoutputCF.saveAsNewAPIHadoopFile( KeySpace, classOf[java.util.Map[String, ByteBuffer]], classOf[java.util.List[ByteBuffer]], classOf[CqlOutputForm

Re: Setup Spark jobserver for Spark SQL

2015-04-02 Thread Daniel Siegmann
You shouldn't need to do anything special. Are you using a named context? I'm not sure those work with SparkSqlJob. By the way, there is a forum on Google groups for the Spark Job Server: https://groups.google.com/forum/#!forum/spark-jobserver On Thu, Apr 2, 2015 at 5:10 AM, Harika wr

Setup Spark jobserver for Spark SQL

2015-04-02 Thread Harika
Hi, I am trying to use Spark Jobserver (https://github.com/spark-jobserver/spark-jobserver) for running Spark SQL jobs. I was able to start the server but when I run my application (my Scala class which extends SparkSqlJob), I am getti

Re: How to pass parameters to a spark-jobserver Scala class?

2015-03-16 Thread Sasi
Sorry for the long silence. We are able to: 1. Pass parameters from Vaadin (Java framework) to spark-jobserver using the HttpURLConnection POST method. 2. Receive filtered (based on the passed parameters) RDD results from spark-jobserver using the HttpURLConnection GET method. 3. Finally, show the results

Re: How to pass parameters to a spark-jobserver Scala class?

2015-02-19 Thread Sasi
Thanks Vasu. Let me get back to you once I am done with trials. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-pass-parameters-to-a-spark-jobserver-Scala-class-tp21671p21732.html

Re: How to pass parameters to a spark-jobserver Scala class?

2015-02-19 Thread Vasu C
Twitter. Regards, Vasu C -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-pass-parameters-to-a-spark-jobserver-Scala-class-tp21671p21727.html

Re: How to pass parameters to a spark-jobserver Scala class?

2015-02-18 Thread Sasi
Thank you very much Vasu. Let me add some more points to my question. We are developing a Java program for connecting spark-jobserver to Vaadin (a Java framework). Following is the sample code I wrote for connecting both (the code works fine): URL url = null; HttpURLConnection connection = null

Re: How to pass parameters to a spark-jobserver Scala class?

2015-02-18 Thread Vasu C

Re: How to pass parameters to a spark-jobserver Scala class?

2015-02-17 Thread Vasu C
Hi Sasi, To pass parameters to spark-jobserver use " curl -d "input.string = a b c a b see" " and in Job server class use config.getString("input.string"). You can pass multiple parameters like starttime,endtime etc and use config.getString("") to get
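The recipe above can be sketched end to end from the command line. The app name, job class, and jar path below are hypothetical placeholders, and the commands are built into variables and echoed rather than executed, so the sketch is safe to inspect without a live jobserver:

```shell
# Assumptions: jobserver at localhost:8090; "sparking" and
# "sparking.jobserver.WordCount" are hypothetical names.
JOBSERVER="localhost:8090"
APP="sparking"
CLASS="sparking.jobserver.WordCount"

# 1) upload the job jar for the app
UPLOAD="curl --data-binary @target/scala-2.10/${APP}.jar ${JOBSERVER}/jars/${APP}"

# 2) submit a job, passing parameters in the request body; inside the job
#    they are read back with config.getString(\"input.string\")
SUBMIT="curl -d 'input.string = a b c a b see' '${JOBSERVER}/jobs?appName=${APP}&classPath=${CLASS}'"

echo "$UPLOAD"
echo "$SUBMIT"
```

Drop the variable indirection (run the curl commands directly) once a jobserver is actually listening.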

Re: How to define SparkContext with Cassandra connection for spark-jobserver?

2015-01-16 Thread Sasi
Thank you Abhishek. The code works. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-define-SparkContext-with-Cassandra-connection-for-spark-jobserver-tp21119p21184.html

Re: How to define SparkContext with Cassandra connection for spark-jobserver?

2015-01-15 Thread abhishek

[spark-jobserver] Submit Job in yarn-cluster mode (?)

2015-01-14 Thread Pietro Gentile
Hi all, I'm able to submit spark jobs through spark-jobserver, but this allows using spark only in yarn-client mode. I want to use spark in yarn-cluster mode as well, but jobserver does not allow it, as stated in the README file https://github.com/spark-jobserver/spark-jobserver. Could you

How to define SparkContext with Cassandra connection for spark-jobserver?

2015-01-13 Thread Sasi
Dear All, For our requirement, we need to define a SparkContext with a SparkConf which has Cassandra connection details. And this SparkContext needs to be shared for subsequent runJobs and throughout the application. So, how to define a SparkContext with a Cassandra connection for spark-jobserver
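One way to approach this (a sketch, not a confirmed recipe: it assumes a jobserver version that forwards per-context config overrides, and the context name and host are placeholders; `spark.cassandra.connection.host` is the spark-cassandra-connector property) is to pass the Cassandra host when creating the shared context. The command is built into a variable and echoed so it can be inspected offline:

```shell
JOBSERVER="localhost:8090"
# Whether jobserver forwards arbitrary spark.* parameters set at context
# creation depends on the version -- verify against your release.
CREATE_CTX="curl -X POST '${JOBSERVER}/contexts/cass-context?spark.cassandra.connection.host=127.0.0.1'"
echo "$CREATE_CTX"
```

Jobs submitted with `context=cass-context` would then share that one SparkContext.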

Re: Removing JARs from spark-jobserver

2015-01-12 Thread Fernando O.
> Thank you Abhishek. That works.

Re: Removing JARs from spark-jobserver

2015-01-11 Thread abhishek

Re: Removing JARs from spark-jobserver

2015-01-11 Thread Sasi
Thank you Abhishek. That works. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Removing-JARs-from-spark-jobserver-tp21081p21084.html

Re: Removing JARs from spark-jobserver

2015-01-10 Thread abhishek
There is a path, /tmp/spark-jobserver/file, where all the jars are kept by default. Probably deleting from there should work. On 11 Jan 2015 12:51, "Sasi [via Apache Spark User List]" < ml-node+s1001560n21081...@n3.nabble.com> wrote: > How to remove submitted JARs f
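Based on the path mentioned above, a cautious cleanup could look like the following (a sketch: the directory is jobserver's default and may differ if its storage root was reconfigured, which is why the delete is left commented out):

```shell
# Default jar location per the reply above; may differ per configuration.
JAR_DIR="/tmp/spark-jobserver"

# Inspect what jobserver has stored before deleting anything.
ls -l "$JAR_DIR" 2>/dev/null || echo "no jar directory at $JAR_DIR"

# rm "$JAR_DIR"/<app-name>*.jar   # uncomment and fill in to delete,
#                                 # then restart jobserver afterwards
```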

Removing JARs from spark-jobserver

2015-01-10 Thread Sasi
How to remove submitted JARs from spark-jobserver? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Removing-JARs-from-spark-jobserver-tp21081.html

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-09 Thread Sasi
We are able to resolve *SparkException: Job aborted due to stage failure: All masters are unresponsive! Giving up* as well. Spark-jobserver is working fine now and we need to experiment more. Thank you guys. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Set

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-09 Thread Sasi
/scala-2.10/spark-jobserver-examples_2.10-1.0.0.jar localhost:8090/jars/sparking* command to upload, as mentioned in the https://github.com/fedragon/spark-jobserver-examples link. We did some samples earlier connecting Apache Cassandra to Spark using the Scala language. Initially, we faced the same

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-08 Thread Sasi
Thank you Pankaj. We are able to create the uber JAR (very good for binding all dependency JARs together) and run it on spark-jobserver. One step further than where we were. However, we are now facing *SparkException: Job aborted due to stage failure: All masters are unresponsive! Giving up*. We may need to
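For what it's worth, this particular failure usually means the context is pointed at a master URL that does not match the running cluster. In spark-jobserver the master is set in its HOCON config; a sketch (the host and port are placeholders, and surrounding keys can differ between jobserver versions):

```
# local.conf fragment -- host/port are placeholders
spark {
  # must match the spark:// URL shown on the Spark master's web UI exactly
  master = "spark://your-master-host:7077"
}
```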

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread Todd Nist
joda-time-2.3.jar" ] } Now post the context to the job server: radtech:spark-jobserver-example$ curl -d src/main/resources/spark.context-settings.config -X POST 'localhost:8090/contexts/cassJob-context' Then execute your job: curl --data-binary @target/scala-2.10/spark-jobse

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread bchazalet
its own as a regular spark app, without using jobserver. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Set-EXTRA-JAR-environment-variable-for-spark-jobserver-tp20989p20998.html

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread Sasi
"jobserver test demo") .setMaster("local[4]") .setJars(Seq("C:/spark-jobserver/lib/spark-cassandra-connector_2.10-1.1.0-alpha3.jar")) Am I missing something? Meanwhile, I will try for Pankaj's reply of using uber jar. --

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread Akhil Das
Or you can use: sc.addJar("/path/to/your/datastax.jar") Thanks Best Regards On Tue, Jan 6, 2015 at 5:53 PM, bchazalet wrote: > I don't know much about spark-jobserver, but you can set jars > programatically > using the method setJars on SparkConf. Looking at your cod

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread Pankaj Narang
Skype pankaj.narang -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Set-EXTRA-JAR-environment-variable-for-spark-jobserver-tp20989p20992.html

Re: Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread bchazalet
I don't know much about spark-jobserver, but you can set jars programmatically using the setJars method on SparkConf. Looking at your code, it seems that you're importing classes from com.datastax.spark.connector._ to load data from cassandra, so you may need to add that datastax j

Set EXTRA_JAR environment variable for spark-jobserver

2015-01-06 Thread Sasi
We are trying to use spark-jobserver for one of our requirements. We referred to *https://github.com/fedragon/spark-jobserver-examples* and modified it a little to match our requirement, as below - /** ProductionRDDBuilder.scala ***/ package sparking package jobserver // Import required libraries

Re: Trying to make spark-jobserver work with yarn

2015-01-01 Thread Fernando O.
Thanks Akhil, that will help a lot! It turned out that spark-jobserver does not work in "development mode", but if you deploy a server it works (it looks like the dependencies are not right when running jobserver from sbt). On Thu, Jan 1, 2015 at 5:22 AM, Akhil Das wrote: >

Re: Trying to make spark-jobserver work with yarn

2015-01-01 Thread Akhil Das
ndo O. wrote: > >> Hi all, >> I'm investigating spark for a new project and I'm trying to use >> spark-jobserver because... I need to reuse and share RDDs and from what I >> read in the forum that's the "standard" :D >> >> Turns out tha

Re: Trying to make spark-jobserver work with yarn

2014-12-31 Thread Fernando O.
g to use > spark-jobserver because... I need to reuse and share RDDs and from what I > read in the forum that's the "standard" :D > > Turns out that spark-jobserver doesn't seem to work on yarn, or at least > it does not on 1.1.1 > > My config is spark 1.1.1 (mo

Trying to make spark-jobserver work with yarn

2014-12-30 Thread Fernando O.
Hi all, I'm investigating spark for a new project and I'm trying to use spark-jobserver because... I need to reuse and share RDDs and from what I read in the forum that's the "standard" :D Turns out that spark-jobserver doesn't seem to work on yarn, or at least i

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread Sasi
Thanks Abhishek. We are good now, with an answer to try. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Need-help-for-Spark-JobServer-setup-on-Maven-for-Java-programming-tp20849p20906.html

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread abhishek

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread Sasi

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread abhishek
> The reason being, we had Vaadin (Java Framework) application which displays data from Spark RDD, which in turn gets data from Cassandra. As we know, we need to use Maven for building Spark API in Java. > > We tested t

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread Sasi
The reason being, we have a Vaadin (Java framework) application which displays data from a Spark RDD, which in turn gets data from Cassandra. As we know, we need to use Maven for building the Spark API in Java. We tested the spark-jobserver using SBT and were able to run it. However, for our requirement, we

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread abhishek

Re: Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-30 Thread Sasi
Does my question make sense, or does it require some elaboration? Sasi -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Need-help-for-Spark-JobServer-setup-on-Maven-for-Java-programming-tp20849p20896.html

Need help for Spark-JobServer setup on Maven (for Java programming)

2014-12-24 Thread Sasi
Dear All, We are trying to share RDDs across different sessions of the same web application (Java). We need to share a single RDD between those sessions. As we understand from some posts, this is possible through Spark-JobServer. Are there any guidelines you can provide to set up Spark-JobServer for Maven

Spark-jobserver for java apps

2014-10-20 Thread Tomer Benyamini
Hi, I'm working on the problem of remotely submitting apps to the spark master. I'm trying to use the spark-jobserver project (https://github.com/ooyala/spark-jobserver) for that purpose. For scala apps looks like things are working smoothly, but for java apps, I have an issue with im

Spark-JobServer moving to a new location

2014-08-21 Thread Evan Chan
managing your Spark jobs and job history and status. In order to make sure the project can continue to move forward independently, new features developed and contributions merged, we are moving the project to a new github organization. The new location is: https://github.com/spark-jobserver/spark

python project like spark-jobserver?

2014-07-29 Thread Chris Grier
I'm looking for something like the ooyala spark-jobserver ( https://github.com/ooyala/spark-jobserver) that basically manages a SparkContext for use from a REST or web application environment, but for python jobs instead of scala. Has anyone written something like this? Looking for a proje

Re: Integrate Spark Editor with Hue for source compiled installation of spark/spark-jobServer

2014-07-02 Thread Sunita Arvind
That's good to know. I will try it out. Thanks Romain On Friday, June 27, 2014, Romain Rigaux wrote: > So far Spark Job Server does not work with Spark 1.0: > https://github.com/ooyala/spark-jobserver > > So this works only with Spark 0.9 currently: > > http://gethue.com

Re: Integrate Spark Editor with Hue for source compiled installation of spark/spark-jobServer

2014-06-27 Thread Romain Rigaux
So far Spark Job Server does not work with Spark 1.0: https://github.com/ooyala/spark-jobserver So this works only with Spark 0.9 currently: http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/ Romain Romain On Tue, Jun 24, 2014 at 9:04 AM

Integrate Spark Editor with Hue for source compiled installation of spark/spark-jobServer

2014-06-24 Thread Sunita Arvind
app/ Now I am trying to add the Spark editor to Hue. AFAIK, this requires: git clone https://github.com/ooyala/spark-jobserver.git cd spark-jobserver sbt re-start This was successful after a lot of struggle with the proxy settings. However, is this the job server itself? Will that mean the job server