Re: [External email] [DISCUSS] Move Spark Connect server to builtin package (Client API layer stays external)

2024-07-01 Thread yangjie01
I manually tried modifying only `assembly/pom.xml` and examined the result of running `dev/make-distribution.sh --tgz`. The `spark-connect_2.13-4.0.0-SNAPSHOT.jar` is indeed included in the jars directory. However, if rearranging the directories would result in a clearer project
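For reference, a minimal sketch of the verification flow described above, assuming a standard checkout where `dev/make-distribution.sh` writes the distribution under `dist/`:

```
# Build a binary distribution tarball after editing assembly/pom.xml
./dev/make-distribution.sh --tgz

# Confirm that the Connect server jar was picked up by the assembly
# (spark-connect_2.13-4.0.0-SNAPSHOT.jar should appear in the jars directory)
ls dist/jars/ | grep spark-connect
```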

Re: [External email] [DISCUSS] Move Spark Connect server to builtin package (Client API layer stays external)

2024-07-01 Thread Hyukjin Kwon
My concern is that the `connector` directory is really for external/optional packages (and they aren't included in the assembly, IIRC), so I am hesitant to just change the assembly. The actual changes are not that large, but they do move files around. On Tue, 2 Jul 2024 at 12:23, yangjie01 wrote: >

Re: [External email] [DISCUSS] Move Spark Connect server to builtin package (Client API layer stays external)

2024-07-01 Thread yangjie01
I'm supportive of this initiative. However, if the purpose is just to avoid the additional `--packages` option, it seems that making some adjustments to `assembly/pom.xml` could potentially meet our goal. Is it really necessary to restructure the code directory? Jie Yang From: Hyukjin Kwon

[DISCUSS] Move Spark Connect server to builtin package (Client API layer stays external)

2024-07-01 Thread Hyukjin Kwon
Hi all, I would like to discuss moving the Spark Connect server to a builtin package. Right now, users have to specify `--packages` when they run the Spark Connect server script, for example: ./sbin/start-connect-server.sh --jars `ls connector/connect/server/target/**/spark-connect*SNAPSHOT.jar` or ./sbin/s
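For context, the two launch styles being discussed look roughly like the following; the `--packages` coordinates are an illustrative example only (the preview above is truncated), not taken from the original message:

```
# Start the Connect server from a locally built jar (as quoted above)
./sbin/start-connect-server.sh --jars `ls connector/connect/server/target/**/spark-connect*SNAPSHOT.jar`

# Or resolve the server as an external package; these coordinates illustrate
# the --packages form for a released version, not the exact original command
./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.1
```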