On 19 Dec 2015, at 13:34, Steve Loughran <ste...@hortonworks.com> wrote:

> On 18 Dec 2015, at 21:39, Andrew Or <and...@databricks.com> wrote:
>
> Hi Roy,
> I believe Spark just gets its application ID from YARN, so you can just do
> `sc.applicationId`.

If you listen for a spark start event you get the app ID, but not the real
spark attempt ID; SPARK-11314 adds an ex…
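
For illustration, a minimal sketch of the listener approach Steve mentions,
assuming Spark 1.5+, where `SparkListenerApplicationStart` carries an
`appId: Option[String]` field (the listener class name here is made up):

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationStart}

// Hypothetical listener that captures the application ID from the start event.
class AppIdListener extends SparkListener {
  override def onApplicationStart(event: SparkListenerApplicationStart): Unit = {
    // appId is an Option[String]; it is empty if no cluster manager assigned one.
    event.appId.foreach(id => println(s"Application started with ID $id"))
  }
}

// Register it on the SparkContext before the job starts doing work:
// sc.addSparkListener(new AppIdListener)
```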

On 18 Dec 2015, at 21:39, Andrew Or <and...@databricks.com> wrote:

Hi Roy,
I believe Spark just gets its application ID from YARN, so you can just do
`sc.applicationId`.
-Andrew

2015-12-18 0:14 GMT-08:00 Deepak Sharma:

> I have never tried this but there are YARN client APIs that you can use in
> your spark program to get the application id.
> Here is the link …
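
As a concrete sketch of Andrew's `sc.applicationId` suggestion (the app name
is arbitrary; on YARN the returned string is the YARN application ID):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("app-id-demo"))

// On YARN this is the YARN application ID, e.g. "application_1450000000000_0001";
// on other cluster managers it is that manager's notion of an application ID.
val appId: String = sc.applicationId
println(s"Running as $appId")
```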

On 18 Dec 2015, at 00:14, Deepak Sharma wrote:

I have never tried this but there are YARN client APIs that you can use in
your spark program to get the application id.
Here is the link to the YarnClient javadoc:
http://hadoop.apache.org/docs/r2.4.1/api/org/apache/hadoop/yarn/client/api/YarnClient.html
getApplications() is the method for your use case.
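
A minimal sketch of the YarnClient route Deepak points to, assuming the Hadoop
YARN client jars and a valid yarn-site.xml are on the classpath (filtering by
the application name is only one example of narrowing the list to your own job):

```scala
import org.apache.hadoop.yarn.client.api.YarnClient
import org.apache.hadoop.yarn.conf.YarnConfiguration
import scala.collection.JavaConverters._

val yarnClient = YarnClient.createYarnClient()
yarnClient.init(new YarnConfiguration())
yarnClient.start()

// getApplications() lists every application the ResourceManager knows about,
// so filter it down to your own job, e.g. by application name.
val mine = yarnClient.getApplications.asScala.filter(_.getName == "app-id-demo")
mine.foreach(r => println(s"${r.getApplicationId} -> ${r.getYarnApplicationState}"))

yarnClient.stop()
```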

Hello there,
I have the same requirement.
I submit a streaming job in yarn-cluster mode.
If I want to shut down this endless YARN application, I have to find the
application ID myself and run `yarn application -kill <applicationId>` to kill
the application.
Therefore, if I can get the application ID returned inside my program …
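
For completeness, a hedged sketch of doing the same kill programmatically
rather than through the CLI, using YarnClient.killApplication (the helper name
is made up; ConverterUtils.toApplicationId is the Hadoop 2.x way to parse the
ID string, and the string itself could come from sc.applicationId as suggested
above):

```scala
import org.apache.hadoop.yarn.client.api.YarnClient
import org.apache.hadoop.yarn.conf.YarnConfiguration
import org.apache.hadoop.yarn.util.ConverterUtils

// Programmatic equivalent of `yarn application -kill <applicationId>`.
def killYarnApp(appIdString: String): Unit = {
  val yarnClient = YarnClient.createYarnClient()
  yarnClient.init(new YarnConfiguration())
  yarnClient.start()
  try yarnClient.killApplication(ConverterUtils.toApplicationId(appIdString))
  finally yarnClient.stop()
}
```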

I don't think there is any YARN-related stuff to access in Spark; Spark
doesn't depend on YARN.
BTW, why do you want the YARN application ID?
On Mon, Jun 22, 2015 at 11:45 PM, roy wrote:
> Hi,
>
> Is there a way to get the YARN application ID inside a Spark application when
> running a Spark job on YARN?