Well, I have tried almost everything over the last two days now.
There is no user "spark", and whatever I do with the executor image it only runs
for two minutes in k8s and then restarts.
The problem seems to be the "nogroup" group that the executors write files as:
drwxr-xr-x 2 185 nogroup 4096 Sep
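One common workaround (my assumption, not something confirmed in this thread) is to set an fsGroup in an executor pod template, so files written by the Spark uid (185 is the default in the stock Spark images) land in a group that can write to the mounted volume. A minimal sketch, with a hypothetical gid:

```yaml
# Hypothetical executor pod template, passed to Spark via
# spark.kubernetes.executor.podTemplateFile.
apiVersion: v1
kind: Pod
spec:
  securityContext:
    runAsUser: 185   # default uid in the stock Spark images
    fsGroup: 1000    # assumption: a gid that owns / can write the PVC mount
```

With fsGroup set, Kubernetes chowns the mounted volume to that gid, so executors are no longer stuck writing as "nogroup".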
Hi and thanks for all the good help.
I will build Jupyter on top of Spark to be able to run Jupyter in local mode
with the new Koalas library. The new Koalas library can be imported as "from
pyspark import pandas as ps".
Then you can run Spark on K8s the same way that you use pandas in a notebook.
OK, so when I use Spark on k8s I can only save files to S3 buckets or to a
database?
Note my setup: it's Spark with JupyterLab on top, on k8s.
What are those options for if I can't write files from Spark in k8s to disk?
"spark.kubernetes.driver.volumes.persistentVolumeClaim.nfs100.mount.readOnly",
"Fa
Hi, I have built and am running Spark on k8s. A link to my repo:
https://github.com/bjornjorgensen/jlpyk8s
Everything seems to be running fine, but I can't save to a PVC.
If I convert the dataframe to pandas, then I can save it.
from pyspark.sql import SparkSession
# builder chain was truncated here; .getOrCreate() at least completes it
spark = SparkSession.builder \
    .getOrCreate()