Hi Roy,

I believe Spark just picks up its application ID from YARN, so you can simply
call `sc.applicationId`.
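
A minimal sketch (assuming a SparkContext `sc` already created with a YARN
master; the variable name is just illustrative):

    // On YARN this returns the YARN application ID,
    // e.g. "application_1450419037700_0001"; on other cluster managers
    // it returns that manager's own ID format.
    val appId: String = sc.applicationId
    println(s"Running as YARN application $appId")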

-Andrew

2015-12-18 0:14 GMT-08:00 Deepak Sharma <deepakmc...@gmail.com>:

> I have never tried this, but there are YARN client APIs that you can use in
> your Spark program to get the application ID.
> Here is the link to the YarnClient Javadoc:
>
> http://hadoop.apache.org/docs/r2.4.1/api/org/apache/hadoop/yarn/client/api/YarnClient.html
> getApplications() is the method for your purpose here.
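>
> A rough sketch of that approach in Scala (assuming the YARN configuration is
> on the classpath; the printed fields are just illustrative):
>
>     import org.apache.hadoop.yarn.client.api.YarnClient
>     import org.apache.hadoop.yarn.conf.YarnConfiguration
>     import scala.collection.JavaConverters._
>
>     val yarnClient = YarnClient.createYarnClient()
>     yarnClient.init(new YarnConfiguration())
>     yarnClient.start()
>     // getApplications() reports every application known to the ResourceManager,
>     // so you would filter by name/user/state to find your own job.
>     yarnClient.getApplications.asScala.foreach { r =>
>       println(s"${r.getApplicationId} ${r.getName} ${r.getYarnApplicationState}")
>     }
>     yarnClient.stop()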
>
> Thanks
> Deepak
>
>
> On Fri, Dec 18, 2015 at 1:31 PM, Kyle Lin <kylelin2...@gmail.com> wrote:
>
>> Hello there
>>
>> I have the same requirement.
>>
>> I submit a streaming job in yarn-cluster mode.
>>
>> To shut down this endless YARN application, I currently have to find the
>> application ID myself and run "yarn application -kill <app_id>" to kill it.
>>
>> So if my client program could get the application ID back when submitting,
>> it would be easy to kill the YARN application from the client program.
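>>
>> A possible sketch of doing that kill programmatically with YarnClient,
>> assuming the client already knows the application ID string (e.g. from
>> sc.applicationId on the submitting side, or from YarnClient.getApplications):
>>
>>     import org.apache.hadoop.yarn.client.api.YarnClient
>>     import org.apache.hadoop.yarn.conf.YarnConfiguration
>>     import org.apache.hadoop.yarn.util.ConverterUtils
>>
>>     def killApp(appIdString: String): Unit = {
>>       val yarnClient = YarnClient.createYarnClient()
>>       yarnClient.init(new YarnConfiguration())
>>       yarnClient.start()
>>       try {
>>         // Equivalent of "yarn application -kill <app_id>"
>>         yarnClient.killApplication(ConverterUtils.toApplicationId(appIdString))
>>       } finally {
>>         yarnClient.stop()
>>       }
>>     }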
>>
>> Kyle
>>
>>
>>
>> 2015-06-24 13:02 GMT+08:00 canan chen <ccn...@gmail.com>:
>>
>>> I don't think there is anything YARN-specific to access in Spark; Spark
>>> doesn't depend on YARN.
>>>
>>> BTW, why do you want the YARN application ID?
>>>
>>> On Mon, Jun 22, 2015 at 11:45 PM, roy <rp...@njit.edu> wrote:
>>>
>>>> Hi,
>>>>
>>>>   Is there a way to get the YARN application ID inside a Spark
>>>> application when running a Spark job on YARN?
>>>>
>>>> Thanks
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context:
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Yarn-application-ID-for-Spark-job-on-Yarn-tp23429.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>
>>>>
>>>
>>
>
>
> --
> Thanks
> Deepak
> www.bigdatabig.com
> www.keosha.net
>
