Hi All,
Is it possible to run a Spark Connect server in Kubernetes while
configuring it to use Kubernetes as its cluster manager? If so, is there
an example anywhere? Roughly what I have in mind is sketched below.
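This is only a sketch of the setup I'm asking about, not something I have working; the service hostname, image name, and API-server address are all made up, and I'm assuming the client would use the standard PySpark Connect API:

```python
from pyspark.sql import SparkSession

# Hypothetical server-side launch, shown as a comment
# (whether --master k8s://... works here is exactly my question):
#   ./sbin/start-connect-server.sh \
#     --master k8s://https://<api-server>:443 \
#     --conf spark.kubernetes.container.image=<spark-image>

# Client side: connect to the Spark Connect endpoint, assumed to be
# exposed as a Kubernetes Service (hostname and port are placeholders;
# 15002 is the default Spark Connect port).
spark = (
    SparkSession.builder
    .remote("sc://spark-connect.spark.svc.cluster.local:15002")
    .getOrCreate()
)

# Sanity check that the remote session works.
spark.range(5).show()
```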
Thanks
I have the Kubeflow spark-operator installed on Kubernetes (GKE), and I'm
running a structured streaming job that reads data from Kafka; the job
runs every 10 minutes.
It fails with the error shown below:
```
Traceback (most recent call last):
  File "/opt/spark/custom-dir/main.py", line 356, in
    sys.exi
```
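For context, the job is essentially of this shape. This is a minimal sketch, not my actual code: the broker address, topic, and checkpoint path are placeholders, it assumes the spark-sql-kafka connector is on the classpath, and `availableNow` is one way to express the "run every 10 minutes" pattern:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-batch-stream").getOrCreate()

# Read from Kafka; broker and topic names are placeholders.
df = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka.default.svc:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "earliest")
    .load()
)

# availableNow=True processes everything that has arrived and then stops,
# which suits a job that is re-launched every 10 minutes.
query = (
    df.selectExpr("CAST(value AS STRING) AS value")
    .writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .trigger(availableNow=True)
    .start()
)

query.awaitTermination()
```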
Hello!
Thank you for looking into these issues! I'm glad you identified the
root cause for OuterJoinTest and SqlSyntaxTest and are working on a fix.
Regarding IntervalJoinTest, I think I understand your point. Thank you for
explaining that. However, this can be confusing to a user. Let's mayb