Hi team,
I am working on Spark on Kubernetes and have a scenario where I need to use
Spark on Kubernetes in client mode from a Jupyter notebook, spanning two
different Kubernetes clusters. Is it possible in client mode to spin up the
driver in one k8s cluster and the executors in another k8s cluster?
Oh, thanks for mentioning that, it looks like dynamic allocation on Kubernetes
works in client mode in Spark 3.0.0. I just had to set the following
configurations:
spark.dynamicAllocation.enabled=true
spark.dynamicAllocation.shuffleTracking.enabled=true
to enable dynamic allocation and remove the need for an external shuffle
service.
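For anyone following along, here is a minimal sketch of setting those two
properties from a PySpark session builder, e.g. in a notebook (Spark 3.0.0+);
the app name is a placeholder and everything else would depend on your own
cluster setup:

from pyspark.sql import SparkSession

# Minimal sketch (Spark 3.0.0+): dynamic allocation with shuffle file
# tracking on the executors, so no external shuffle service is needed.
spark = (
    SparkSession.builder
    .appName("dynamic-allocation-sketch")  # placeholder name
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
    .getOrCreate()
)

The same two keys can also be passed as --conf flags to spark-shell or
spark-submit.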
Hey guys, I was able to run dynamic scaling in both cluster and client mode.
I will document it and send it over this weekend.
On Tue 12 May, 2020, 1:26 PM Roland Johann wrote:
Hi all,
I don't want to interrupt the conversation, but I am keen to know where I can
find information regarding dynamic allocation on Kubernetes. As far as I know,
the docs just point to future work.
Thanks a lot,
Roland
> On 12.05.2020 at 09:25, Steven Stetzler wrote:
Hi all,
I am interested in this as well. My use-case could benefit from dynamic
executor scaling but we are restricted to using client mode since we are
only using Spark shells.
Could anyone help me understand the barriers to getting dynamic executor
scaling to work in client mode on Kubernetes?
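For context, client mode on Kubernetes generally requires the executors to be
able to connect back to the driver, which the Spark on Kubernetes docs suggest
handling with a headless service when the driver itself runs in a pod. A rough
sketch of what such a session might look like from a shell or notebook; every
angle-bracketed value is a placeholder, not something taken from this thread:

from pyspark.sql import SparkSession

# Rough sketch of a client-mode session on Kubernetes with dynamic
# allocation. All <...> values are placeholders for your own cluster.
spark = (
    SparkSession.builder
    .master("k8s://https://<api-server>:6443")
    .config("spark.kubernetes.namespace", "<namespace>")
    .config("spark.kubernetes.container.image", "<spark-image>")
    # In client mode the driver runs in this process; executors must be
    # able to reach it, e.g. through a headless service.
    .config("spark.driver.host", "<driver-headless-service>")
    .config("spark.driver.port", "7078")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
    .config("spark.dynamicAllocation.maxExecutors", "10")
    .getOrCreate()
)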
Hi,
Dynamic executor scaling is working fine for Spark on Kubernetes (latest from
the Spark master repository) in cluster mode. Is dynamic executor scaling
available for client mode? If yes, where can I find the usage doc for it?
If no, is there any PR open for this?
Thanks,
Pradee