Thank you, that looks promising as well.
- Marshall
From: Yinan Li
Sent: Sunday, April 5, 2020 3:49 PM
To: Marshall Markham
Cc: user
Subject: Re: spark-submit exit status on k8s
Not sure if you are aware of this new feature in Airflow:
https://issues.apache.org/jira/browse/AIRFLOW-6542
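
(For context: if I'm reading that ticket right, it adds a SparkKubernetesOperator plus a matching SparkKubernetesSensor for the spark-on-k8s operator's SparkApplication resources. Below is only a minimal sketch of how the pair might be wired together, assuming the spark-on-k8s operator is installed in the cluster and the cncf.kubernetes provider/backport package is available; the DAG id, namespace, and spark_pi.yaml file name are placeholders, not anything from the ticket itself.)

from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
    SparkKubernetesOperator,
)
from airflow.providers.cncf.kubernetes.sensors.spark_kubernetes import (
    SparkKubernetesSensor,
)

with DAG(
    dag_id="spark_pi_on_k8s",
    start_date=datetime(2020, 4, 1),
    schedule_interval=None,
) as dag:
    # Creates the SparkApplication custom resource and returns right away.
    submit = SparkKubernetesOperator(
        task_id="submit_spark_pi",
        namespace="spark",
        application_file="spark_pi.yaml",
        do_xcom_push=True,
    )

    # Polls the SparkApplication status and fails the task if the job fails,
    # which is exactly the exit-status signal missing from plain spark-submit
    # on k8s.
    monitor = SparkKubernetesSensor(
        task_id="monitor_spark_pi",
        namespace="spark",
        application_name="{{ task_instance.xcom_pull(task_ids='submit_spark_pi')['metadata']['name'] }}",
    )

    submit >> monitor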
Sent: 11:25 AM
To: Marshall Markham; user
Subject: Re: spark-submit exit status on k8s

Another, simpler solution that I just thought of: just add an operation at the
end of your Spark program to write an empty file somewhere, with filename
SUCCESS for example. Add a stage to your AirFlow graph that checks for the
existence of that file and fails the run if it is not there.
[...]ty low. Is there any discussion of picking up this work in the near
future?

Thanks,
Marshall
From: Masood Krohy
Sent: Friday, April 3, 2020 9:34 PM
To: Marshall Markham; user
Subject: Re: spark-submit exit status on k8s
While you wait for a fix on that JIRA ticket, you may be able to add an
intermediary step in your AirFlow graph, calling Spark's REST API after
submitting the job, digging into the actual status of the application,
and making a success/fail decision accordingly. You can make repeated
calls in a loop [...]
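
(A sketch of that polling loop, not Masood's exact code: it assumes the driver's monitoring REST API, the same one behind the Spark UI on port 4040, is reachable from the AirFlow worker while the driver pod is alive, for example through a k8s Service or port-forward; the DRIVER_UI URL is a placeholder, and for already-finished applications you would point at a history server instead.)

import time
import requests

# Illustrative URL; expose the driver UI via a k8s Service or port-forward.
DRIVER_UI = "http://spark-driver-svc.spark.svc.cluster.local:4040"

def spark_app_succeeded(poll_seconds=30, timeout_seconds=3600):
    """Poll the Spark monitoring REST API and return True only if every job
    of the application reports SUCCEEDED; raise on timeout so an AirFlow
    task wrapping this function is marked failed."""
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        try:
            apps = requests.get(f"{DRIVER_UI}/api/v1/applications", timeout=10).json()
            if apps:
                app_id = apps[0]["id"]
                jobs = requests.get(
                    f"{DRIVER_UI}/api/v1/applications/{app_id}/jobs", timeout=10
                ).json()
                statuses = {job["status"] for job in jobs}
                # Terminal once nothing is RUNNING; any FAILED job means failure.
                if jobs and "RUNNING" not in statuses:
                    return statuses == {"SUCCEEDED"}
        except requests.RequestException:
            pass  # driver not up yet (or already gone); keep polling
        time.sleep(poll_seconds)
    raise TimeoutError("Spark application did not report a final status in time")

Wrapped in a PythonOperator, something along these lines gives the AirFlow graph the success/fail signal that spark-submit itself does not currently propagate on k8s.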