I'm supportive of this initiative. However, if the purpose is just to avoid the 
additional `--packages` option, it seems that making some adjustments to the 
`assembly/pom.xml` could potentially meet our goal. Is it really necessary to 
restructure the code directory?
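
For illustration, I imagine something along these lines in `assembly/pom.xml` could pull the server jar into the default distribution (just a sketch, assuming the module keeps its current Maven coordinates):

<!-- Hypothetical: include the Spark Connect server module in the assembly -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-connect_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
</dependency>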

Jie Yang

From: Hyukjin Kwon <gurwls...@apache.org>
Date: Tuesday, July 2, 2024, 08:19
To: dev <dev@spark.apache.org>
Subject: [External Email] [DISCUSS] Move Spark Connect server to builtin package (Client API layer stays external)


Hi all,

I would like to discuss moving the Spark Connect server into the builtin package. Right now, users have to specify `--packages` (or `--jars`) when they run the Spark Connect server script, for example:

./sbin/start-connect-server.sh --jars `ls connector/connect/server/target/**/spark-connect*SNAPSHOT.jar`

or

./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.1

It is a little odd that an sbin script requires users to supply the jars just to start the server.
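
With the server built in, the idea is that the script would need no extra jar or package options at all:

./sbin/start-connect-server.sh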

Moving it into the builtin package is pretty straightforward because most of the jars are shaded, so the impact would be minimal. I have a prototype at apache/spark#47157 (https://github.com/apache/spark/pull/47157). This also simplifies the Python local running logic a lot.
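
As an example of the Python simplification: starting a local Spark Connect session today depends on locating the connect jars, whereas with the server built in, the existing local remote mode should work out of the box:

./bin/pyspark --remote "local[*]"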

The user-facing API layer, the Spark Connect client, stays external, but I would like the internal server layer, the Spark Connect server implementation, to be built into Spark.

Please let me know if you have thoughts on this!
