Hi Cley,

Thank you for taking the time to respond to my query. Your insights on
Spark cluster deployment are much appreciated.

However, I'd like to clarify that my specific challenge is related to
running the Spark Connect Server on Kubernetes in Cluster Mode. While I
understand the general deployment strategies for Spark on Kubernetes, I am
seeking guidance particularly on the Spark Connect Server aspect.

cf. Spark Connect Overview - Spark 3.4.1 Documentation
    https://spark.apache.org/docs/latest/spark-connect-overview.html

To reiterate: when I connect from an external Python client and execute
scripts, the server operates in Local Mode instead of the expected
Kubernetes Cluster Mode (i.e., with --master k8s://... and --deploy-mode
cluster).
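
For concreteness, here is a sketch of the kind of invocation I have been
attempting; the API server host, namespace, and container image below
are placeholders rather than my actual values:

  /opt/spark/sbin/start-connect-server.sh \
    --master k8s://https://<api-server-host>:6443 \
    --deploy-mode cluster \
    --conf spark.kubernetes.namespace=<namespace> \
    --conf spark.kubernetes.container.image=<spark-image> \
    --packages org.apache.spark:spark-connect_2.13:3.4.1

On the client side, I connect along these lines (the host is again a
placeholder; 15002 is the default Spark Connect port):

  pyspark --remote "sc://<server-host>:15002"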

If I've misunderstood your initial response and it was indeed related to
Spark Connect, I sincerely apologize for the oversight. In that case, could
you please expand a bit on the Spark Connect-specific aspects?

Do you, or anyone else in the community, have experience with this specific
setup or encountered a similar issue with Spark Connect Server on
Kubernetes? Any targeted advice or guidance would be invaluable.

Thank you again for your time and help.

Best regards,
Yasukazu

On Mon, Sep 4, 2023 at 0:23, Cleyson Barros <euroc...@gmail.com> wrote:

> Hi Nagatomi,
> Use the Apache images, then run your master node, then start your many
> workers. You can add a command line in the Dockerfiles to point to the
> master using the Docker container names in your service composition. If
> you wish to run 2 masters (active and standby), follow the instructions
> in the Apache docs for this configuration; the recipe is the same except
> for how you start the masters and how you expect your cluster to behave.
> I hope it helps.
> Have a nice day :)
> Cley
>
> Nagatomi Yasukazu <yassan0...@gmail.com> wrote on Saturday,
> 2/09/2023 at 15:37:
>
>> Hello Apache Spark community,
>>
>> I'm currently trying to run Spark Connect Server on Kubernetes in Cluster
>> Mode and facing some challenges. Any guidance or hints would be greatly
>> appreciated.
>>
>> ## Environment:
>> Apache Spark version: 3.4.1
>> Kubernetes version: 1.23
>> Command executed:
>>  /opt/spark/sbin/start-connect-server.sh \
>>    --packages org.apache.spark:spark-connect_2.13:3.4.1,org.apache.iceberg:iceberg-spark-runtime-3.4_2.13:1.3.1...
>> Note that I'm running it with the environment variable
>> SPARK_NO_DAEMONIZE=1.
>>
>> ## Issue:
>> When I connect from an external Python client and run scripts, it
>> operates in Local Mode instead of the expected Cluster Mode.
>>
>> ## Expected Behavior:
>> When connecting from a Python client to the Spark Connect Server, I
>> expect it to run in Cluster Mode.
>>
>> If anyone has any insights, advice, or has faced a similar issue, I'd be
>> grateful for your feedback.
>> Thank you in advance.
>>
