Hi,
I want to deploy a Spark client in a Kubernetes container. I then want to
run Spark jobs on a Hadoop cluster (meaning the Hadoop cluster's resources
would be leveraged) but submit them from the K8s container. Is this mode of
implementation possible? Please let me know.
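For concreteness, here is a minimal sketch of what I have in mind. It assumes the container ships a full Spark distribution and that HADOOP_CONF_DIR (or YARN_CONF_DIR) inside the container points at the cluster's core-site.xml and yarn-site.xml; the object name and app name below are just placeholders:

    import org.apache.spark.sql.SparkSession

    // Intended launch, from inside the K8s pod (client mode keeps the
    // driver in the container while the executors run on YARN NodeManagers):
    //   spark-submit --master yarn --deploy-mode client --class YarnFromK8s app.jar
    object YarnFromK8s {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("yarn-from-k8s-sketch") // placeholder name
          .master("yarn")                  // resources come from the Hadoop/YARN cluster
          .getOrCreate()

        // Trivial job just to confirm the executors run on the YARN cluster.
        val total = spark.sparkContext.parallelize(1 to 1000).sum()
        println(s"sum = $total")

        spark.stop()
      }
    }

My understanding is that in client mode the YARN executors must be able to connect back to the driver running in the pod, so the pod would also need an address routable from the Hadoop nodes. Is that correct?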
Thanks,
Debu
