> Which made no sense, because I also gave the worker 1gb of heap and it was
> trying to process a 4k README.md file. I'm guessing it must have tried to
> deserialize a bogus object because I was not submitting the job correctly
> (via spark-submit or this spark-jobserver)?
I don't want to use YARN or Mesos; I'm just trying the standalone Spark cluster.
We need a way to do seamless submission with the API, which I don't see.
To my surprise I was hit by this issue when I tried running the submit from
another machine; it is crazy that I have to submit the job from the worker
machine itself.
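For what it's worth, one way to skip the submit script entirely on a standalone
cluster is to build the SparkContext yourself, point it at the master URL, and
ship the application jar through the conf. A rough sketch (the master host, port,
and jar path are placeholders, not from this thread); keep in mind the executors
also need network access back to the driver machine, which is a common reason
remote submission appears to fail:

import org.apache.spark.{SparkConf, SparkContext}

object ProgrammaticDriver {
  def main(args: Array[String]): Unit = {
    // Point the driver at the standalone master directly; no spark-submit involved.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077")   // placeholder master URL
      .setAppName("programmatic-submit-test")
      .setJars(Seq("/path/to/my-app.jar"))     // placeholder; ships the app jar to the executors

    val sc = new SparkContext(conf)
    try {
      // Trivial job, just to confirm the cluster accepts work from this machine.
      val lines = sc.textFile("README.md").count()
      println("line count: " + lines)
    } finally {
      sc.stop()
    }
  }
}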
I am able to run Spark jobs and Spark Streaming jobs successfully via YARN on a
CDH cluster.
When you say YARN isn't quite there yet, do you mean for submitting the jobs
programmatically, or just in general?
On Sep 4, 2014, at 1:45 AM, Matt Chu wrote:
> https://github.com/spark-jobserver/spark-jobserver
Hello,
Can this be used as a library from within another application?
Thanks!
Best, Oliver
From: Matt Chu [mailto:m...@kabam.com]
Sent: Thursday, September 04, 2014 2:46 AM
To: Vicky Kak
Cc: user
Subject: Re: Programatically running of the Spark Jobs.
https://github.com/spark-jobserver/spark-jobserver
Ooyala's Spark jobserver is the current de facto standard, IIUC. I just
added it to our prototype stack, and will begin trying it out soon. Note
that you can only do standalone or Mesos; YARN isn't quite there yet.
(The repo just moved from https
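For the library question above: as I understand it, the jobserver runs as its own
REST service, and jobs are written against its SparkJob trait, packaged into a
jar, and then uploaded and triggered over HTTP, rather than the jobserver being
embedded in your application process. A rough sketch of what such a job looks
like (the trait and package names are as I recall from the jobserver README, and
the config key is illustrative, so double-check against the repo):

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

object WordCountJob extends SparkJob {
  // Called before runJob; reject the request here if required config is missing.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

  // The actual job; the return value is sent back to the REST caller.
  override def runJob(sc: SparkContext, config: Config): Any = {
    val path = config.getString("input.path")   // illustrative config key
    sc.textFile(path)
      .flatMap(_.split("\\s+"))
      .countByValue()
  }
}

The compiled jar is then uploaded to the jobserver and jobs are started via its
HTTP endpoints; the repo's README documents the exact routes.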
I have been able to submit Spark jobs using the submit script, but I
would like to do it via code.
I have been unable to find anything matching my need.
I am thinking of using org.apache.spark.deploy.SparkSubmit to do so; I may
have to write some utility that passes the parameters required for the
submission.
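One way to try that last idea is to call SparkSubmit's main method with the same
arguments the shell script would pass. A rough sketch (the class name, master URL,
and jar path are placeholders); note that this is an internal entry point rather
than a public API, so it may call System.exit on bad arguments:

import org.apache.spark.deploy.SparkSubmit

object SubmitFromCode {
  def main(args: Array[String]): Unit = {
    // Same flags the spark-submit script would forward; all values are placeholders.
    SparkSubmit.main(Array(
      "--class", "com.example.MyJob",
      "--master", "spark://master-host:7077",
      "--executor-memory", "1g",
      "/path/to/my-app.jar",
      "some-app-arg"
    ))
  }
}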