Hi Team,

I am trying to add a shutdown hook to my PySpark script using `atexit`.
However, whenever I send a SIGTERM to the spark-submit process, the JVM
shutdown hook seems to fire first, which terminates the SparkContext
before my Python-side hook can do its cleanup.
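
For reference, this is roughly what I'm doing (a minimal sketch; the app
name and the cleanup body are just placeholders):

import atexit

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("shutdown-hook-demo").getOrCreate()

def on_exit():
    # Intended cleanup before the Python driver exits. After a SIGTERM
    # to spark-submit, the SparkContext appears to be stopped already
    # by the time this runs, so any Spark calls made here fail.
    print("python atexit hook running")

atexit.register(on_exit)

# ... rest of the job ...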

I don't understand the order in which the Python and JVM processes
terminate. Could you help me register a custom shutdown hook in PySpark
that runs before Spark's native shutdown hook
<https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L695>
 ?
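
For example, I was hoping something along these lines could run before the
JVM-side hook, though I'm not sure a Python signal handler is the right
mechanism, or whether the Python driver even sees the signal in time:

import signal
import sys

def handle_sigterm(signum, frame):
    # Run my cleanup while the SparkContext is (hopefully) still alive,
    # then exit normally so the usual shutdown path still runs.
    print("custom SIGTERM handler running")
    sys.exit(0)

signal.signal(signal.SIGTERM, handle_sigterm)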
Thanks,
Aarushi
