I get this error when I run it from the IDE:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Master removed our application: FAILED
        at org.apache.spark.scheduler.DAG
I don't want to use YARN or Mesos; I'm just trying the standalone Spark cluster.
We need a way to do seamless programmatic submission through the API, which I don't see.
To my surprise, I was hit by this issue when I tried running the submit from
another machine; it is crazy that I have to submit the job from the worker.
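For reference, programmatic submission to a standalone master generally means building the SparkConf yourself. A minimal sketch, assuming a standalone master (the master URL, app name, and jar path below are placeholders, not values from this thread); `setJars` is the part that ships the application jar to the workers, which matters when the driver runs on a different machine:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ProgrammaticSubmitSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder master URL and jar path -- substitute your own.
    // setJars distributes the application jar to the standalone workers;
    // without it, submitting from a machine outside the cluster typically
    // fails with class-loading errors or a removed application.
    val conf = new SparkConf()
      .setAppName("programmatic-submit-sketch")
      .setMaster("spark://master-host:7077")
      .setJars(Seq("/path/to/your-app.jar"))

    val sc = new SparkContext(conf)
    try {
      val sum = sc.parallelize(1 to 100).reduce(_ + _)
      println(s"sum = $sum")
    } finally {
      sc.stop()
    }
  }
}
```

This is a sketch, not a drop-in fix: it needs a reachable standalone master and a built application jar to run.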
I am able to run Spark jobs and Spark Streaming jobs successfully via YARN on a
CDH cluster.
When you say YARN isn't quite there yet, do you mean for submitting jobs
programmatically, or just in general?
On Sep 4, 2014, at 1:45 AM, Matt Chu wrote:
> https://github.com/spark-jobserver/spark-jobserver
Ahh - that probably explains an issue I am seeing. I am a brand-new user and
I tried running the SimpleApp class from the Quick Start page
(http://spark.apache.org/docs/latest/quick-start.html).
When I use conf.setMaster("local") I can run the class directly from my
IDE. But when I tr
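A common stumbling block with the Quick Start example: setMaster("local") runs everything inside the IDE's JVM, while pointing the same code at a standalone master additionally requires shipping the compiled application jar to the workers. A hedged sketch of the two configurations (host name and jar path are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SimpleAppSketch {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs driver and executors in this one JVM -- fine from an IDE.
    // To target a standalone cluster instead, swap in the master URL and
    // ship the jar the IDE built (placeholder values shown):
    //   .setMaster("spark://master-host:7077")
    //   .setJars(Seq("target/scala-2.10/simple-app_2.10-1.0.jar"))
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("local[*]")

    val sc = new SparkContext(conf)
    try {
      val count = sc.parallelize(Seq("spark", "flume", "spark")).
        filter(_.contains("spark")).count()
      println(s"count = $count")
    } finally {
      sc.stop()
    }
  }
}
```

If the jar is not shipped, the executors cannot load the application's classes, which is one way to end up with the "Master removed our application: FAILED" error seen earlier in this thread.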
Hello,
Can this be used as a library from within another application?
Thanks!
Best, Oliver
From: Matt Chu [mailto:m...@kabam.com]
Sent: Thursday, September 04, 2014 2:46 AM
To: Vicky Kak
Cc: user
Subject: Re: Programatically running of the Spark Jobs.
https://github.com/spark-jobserver/spark-jobserver
Ooyala's Spark jobserver is the current de facto standard, IIUC. I just
added it to our prototype stack, and will begin trying it out soon. Note
that you can only do standalone or Mesos; YARN isn't quite there yet.
(The repo just moved from https
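For context, spark-jobserver exposes job submission over REST rather than through SparkContext directly. The general shape is below; the host, port, jar, app, and class names are illustrative only, so check the project README for the current routes:

```shell
# Upload the application jar under an app name
# (8090 is the jobserver's default port; names here are placeholders)
curl --data-binary @target/my-spark-jobs.jar localhost:8090/jars/myapp

# Start a job by app name and class; the response includes a job id
curl -d "" 'localhost:8090/jobs?appName=myapp&classPath=com.example.MyJob'

# Poll a job's status and result by id
curl localhost:8090/jobs/<job-id>
```

Because everything goes over HTTP, any application can drive it as a library-style client, which addresses the "use it from within another application" question above.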