Could you elaborate on what you mean by `not working`?

> but it's not working.

Regarding the question below, Spark expects a normal Pod YAML file.
You may want to take a look at the Apache Spark GitHub repository.

> I do not have a sample template file

For example, the following files are used during K8s integration tests.

https://github.com/apache/spark/tree/master/resource-managers/kubernetes/integration-tests/src/test/resources

1. driver-schedule-template.yml
2. driver-template.yml
3. executor-template.yml
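
In addition, here is a rough sketch of what a driver template covering your
probe and security-context requirements might look like. It is only
illustrative, not an official sample: the container name, the probe port (the
default Spark UI port, 4040), and the securityContext values are assumptions
you would adapt to your cluster's admission policies. Spark merges the
template with the pod spec it generates, so the file only needs the fields you
want to add or override, and you point spark.kubernetes.driver.podTemplateFile
(or the executor equivalent) at it as in the confs you already quoted.

apiVersion: v1
kind: Pod
spec:
  containers:
    - name: spark-kubernetes-driver     # default driver container name; adjust if yours differs
      securityContext:                  # example restrictions only; match your cluster policy
        runAsNonRoot: true
        allowPrivilegeEscalation: false
        capabilities:
          drop:
            - ALL
      livenessProbe:                    # placeholder probe against the Spark UI port
        tcpSocket:
          port: 4040
        initialDelaySeconds: 30
        periodSeconds: 30
      readinessProbe:
        tcpSocket:
          port: 4040
        initialDelaySeconds: 10
        periodSeconds: 10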

Dongjoon.

On Thu, Jan 2, 2025 at 12:07 PM jilani shaik <jilani2...@gmail.com> wrote:

> Hi,
>
> I am trying to run Spark on a Kubernetes cluster, but that cluster enforces
> certain validations on any pod being deployed, which prevent me from running
> my spark-submit.
>
> For example, I need to add liveness and readiness probes and certain
> security capability restrictions, which we usually do for all our other pods
> via a YAML file.
>
> I am not sure how to get that into spark-submit on K8s. I tried the driver
> and executor template files, but it's not working. At the same time, I do
> not have a sample template file from the documentation, except the lines below:
>
> --conf spark.kubernetes.driver.podTemplateFile=s3a://bucket/driver.yml
> --conf spark.kubernetes.executor.podTemplateFile=s3a://bucket/executor.yml
>
>
> Can someone provide directions on how to proceed further?
>
> Thanks,
> Jilani
>
