Re: spark connect server cannot set driver memory

2024-10-17 Thread Hill Liu
Maybe try --conf spark.driver.memory=4g

On Tue, Oct 15, 2024 at 1:46 AM Yunhui Han wrote:
> Hi, all
>
> I am using spark connect server on version 3.4.0 & 3.5.3.
>
> I can't set driver memory as follows,
>
> ${SPARK_HOME}/sbin/start-connect-server.sh \
> --packages org.apache.spark:spark-connect
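A minimal sketch of the suggested fix, with the memory setting passed at launch. The full package coordinate and version below are assumptions (the original command is truncated); match them to your Scala build and the Spark version you run:

    # Driver memory must be supplied at launch; the driver JVM's heap cannot
    # be resized once start-connect-server.sh has started the process.
    # The spark-connect coordinate (Scala 2.12, Spark 3.5.3) is assumed.
    ${SPARK_HOME}/sbin/start-connect-server.sh \
      --conf spark.driver.memory=4g \
      --packages org.apache.spark:spark-connect_2.12:3.5.3

The spark-submit style flag --driver-memory 4g should behave the same way, since start-connect-server.sh forwards its arguments to spark-submit.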

Re: Inquiry on JDK 21 Stable Support Timeline for Apache Spark

2024-10-17 Thread Hill Liu
I found this JIRA for the Spark 4.0.0 release plan: https://issues.apache.org/jira/browse/SPARK-44111

On Fri, Oct 18, 2024 at 2:04 AM tison c sunny chiriyankandath <tisoncsu...@gmail.com> wrote:
> Dear Apache Spark Community,
>
> I hope this message finds you well.
>
> I’m currently working on a proj

Re: Spark Docker image with added packages

2024-10-17 Thread Ángel
Creating a custom classloader to load classes from those jars?

On Thu, 17 Oct 2024, 19:47, Nimrod Ofek wrote:
>
> Hi,
>
> Thanks all for the replies.
>
> I am adding the Spark dev list as well - as I think this might be an issue
> that needs to be addressed.
>
> The options presented here will

Re: Spark Docker image with added packages

2024-10-17 Thread Damien Hawes
Hi,

That's on you as the maintainer of the derived image to ensure that your added dependencies do not conflict with Spark's dependencies. Speaking from experience, there are several ways to achieve this:

1. Ensure you're using packages that ship shaded and relocated dependencies, if possible.
2.
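As a concrete illustration of point 1 (a sketch, with an assumed version): the GCS connector publishes a shaded artifact that bundles and relocates its transitive dependencies, and a derived image can bake that jar straight into $SPARK_HOME/jars:

    # Sketch for a Dockerfile RUN step or plain shell; the connector version
    # below is an assumption - check Maven Central for the current release.
    GCS_VER=hadoop3-2.2.20
    curl -fsSL -o "${SPARK_HOME}/jars/gcs-connector-${GCS_VER}-shaded.jar" \
      "https://repo1.maven.org/maven2/com/google/cloud/bigdataoss/gcs-connector/${GCS_VER}/gcs-connector-${GCS_VER}-shaded.jar"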

Re: Spark Docker image with added packages

2024-10-17 Thread Nimrod Ofek
Hi,

Thanks all for the replies.

I am adding the Spark dev list as well - as I think this might be an issue that needs to be addressed.

The options presented here will get the jars - but they don't help us with dependency conflicts... For example - com.google.cloud.bigdataoss:gcs-connector:hado
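For conflicts like this, a hedged sketch of one knob Spark itself offers: the experimental userClassPathFirst properties, which make user-supplied jars win over Spark's bundled copies on both the driver and the executors. The connector version and the application jar path below are placeholders, not values from this thread:

    # Sketch: prefer user jars over Spark's bundled versions. Both
    # properties exist but are documented as experimental; the version
    # and the application jar path are hypothetical placeholders.
    spark-submit \
      --packages com.google.cloud.bigdataoss:gcs-connector:hadoop3-2.2.20 \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      /path/to/your-app.jar

This reverses the default parent-first delegation for user classes, so it can surface new conflicts of its own; shaded artifacts remain the safer option when available.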

Inquiry on JDK 21 Stable Support Timeline for Apache Spark

2024-10-17 Thread tison c sunny chiriyankandath
Dear Apache Spark Community,

I hope this message finds you well.

I’m currently working on a project that requires compatibility with JDK 21, and I have noticed the ongoing efforts to support JDK 21 in Apache Spark, particularly with the upcoming Spark 4.0.0 release.

Could anyone provide an update