RE: Conda Python Env in K8S

2021-12-03 Thread Bode, Meikel, NMA-CFD
Hi Mich, sure, that's possible. But distributing the complete env would be more practical. A workaround at the moment is that we build different environments, store them in a PV, mount it into the pods, and refer to the desired env from the SparkApplication resource. But actually t
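The PV-based workaround described above might look roughly like the following SparkApplication fragment. This is a sketch, not the actual manifest from the thread: the volume name, PVC name, mount path, and env path are all illustrative assumptions.

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: conda-env-from-pv   # hypothetical name
spec:
  volumes:
    - name: conda-envs
      persistentVolumeClaim:
        claimName: conda-envs-pvc   # assumed PVC holding the prebuilt envs
  driver:
    volumeMounts:
      - name: conda-envs
        mountPath: /opt/envs        # same mount on driver and executors
  executor:
    volumeMounts:
      - name: conda-envs
        mountPath: /opt/envs
  sparkConf:
    # point Spark at the interpreter inside the mounted env
    "spark.pyspark.python": "/opt/envs/myenv/bin/python"
```

Because the PVC is mounted into both the driver and executor pods, every container sees the same prebuilt environment without shipping it per job.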

Re: Scala 3 support approach

2021-12-03 Thread Sean Owen
I don't think anyone's tested it or tried it, but if it's pretty compatible with 2.13, it may already work, or mostly work. See my answer below, which still stands: if it's not pretty compatible with 2.13 and needs a new build, this effectively means dropping 2.12 support, as supporting 3 Scala version

Re: Conda Python Env in K8S

2021-12-03 Thread Mich Talebzadeh
Build Python packages into the Docker image itself first with pip install: RUN pip install pandas . . --no-cache HTH On Fri, 3 Dec 2021 at 11:58, Bode, Meikel, NMA-CFD < meikel.b...@bertelsmann.de> wrote: > Hello, > > I am trying to run spark jobs using Spark Kubernetes Operator. > But when
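The suggestion above, baking the Python dependencies into the image itself, might be sketched as a minimal Dockerfile like the one below. The base image tag and the package list are assumptions for illustration, not what the thread actually used.

```dockerfile
# Official Spark image with Python support (tag is illustrative)
FROM apache/spark-py:v3.2.0

USER root

# Install the needed Python packages directly into the image;
# --no-cache-dir avoids keeping pip's download cache in the layer
RUN pip install --no-cache-dir pandas pyarrow

# Drop back to the default non-root Spark user
USER 185
```

With the dependencies baked in, the driver and executor pods all start from the same image, so there is nothing extra to distribute at job submission time.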

Conda Python Env in K8S

2021-12-03 Thread Bode, Meikel, NMA-CFD
Hello, I am trying to run Spark jobs using the Spark Kubernetes Operator. But when I try to bundle a conda Python environment using the following resource description, the Python interpreter is only unpacked on the driver and not on the executors. apiVersion: "sparkoperator.k8s.io/v1beta2" kind: Spark
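For reference, the approach documented in PySpark's package-management guide for exactly this symptom is to pack the conda env with conda-pack and ship it via spark.archives, whose "#" fragment names the directory the archive is unpacked into on the driver and the executors alike. In a SparkApplication manifest that might look roughly like the sketch below; the archive path, name, and env-var wiring are illustrative assumptions.

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: conda-archive-example   # hypothetical name
spec:
  sparkConf:
    # ship the packed env; "#environment" is the unpack directory name
    "spark.archives": "local:///opt/envs/pyspark_conda_env.tar.gz#environment"
  driver:
    envVars:
      PYSPARK_PYTHON: "./environment/bin/python"
  executor:
    envVars:
      PYSPARK_PYTHON: "./environment/bin/python"
```

The relative "./environment/bin/python" path works because Spark unpacks archives into each container's working directory, so both driver and executors resolve the same interpreter.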