Hi

I am trying to run spark-submit on Kubernetes. The driver and executor
pods are launched according to the given configuration, and my job runs
successfully.
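For context, the job is submitted with a command along the lines of the
following (the image name, namespace, class, and jar path here are
placeholders, not my actual values):

```shell
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:6443 \
  --deploy-mode cluster \
  --name my-job \
  --class com.example.MyJob \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<my-spark-image> \
  --conf spark.kubernetes.namespace=default \
  local:///opt/spark/jars/my-job.jar
```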

*But even after the job completes, the Spark driver pod stays in the
Running state and none of the executor pods are killed, whereas when I run
a simple SparkPi application as a test with the same image, the executors
are killed and the driver shows the status Completed.*
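One difference I am aware of: SparkPi's main method calls spark.stop()
before returning, which shuts down the executors and lets the driver pod
reach Completed. A minimal sketch of that pattern (object and app names
are illustrative, not from my actual job):

```scala
import org.apache.spark.sql.SparkSession

object MyJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("MyJob").getOrCreate()
    try {
      // ... job logic ...
    } finally {
      // Without an explicit stop, the driver JVM may keep running
      // and the executor pods are never released.
      spark.stop()
    }
  }
}
```

I am not sure whether this alone explains the behavior I am seeing, which
is why I am asking here.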

Can someone please guide me on this issue?

Regards
Manish Gupta
