Hi Martijn,
Some platform users may not package all of their dependencies into a fat jar.
Spark also has --jars for specifying dependency jars:
https://stackoverflow.com/questions/29099115/spark-submit-add-multiple-jars-in-classpath
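For reference, a spark-submit invocation with --jars looks roughly like this
(only a sketch; the class name and jar paths below are placeholders):

  # --jars takes a comma-separated list of dependency jars that are added to
  # the driver and executor classpaths (names here are placeholders)
  $ spark-submit \
      --class com.example.MyJob \
      --jars /opt/deps/guava.jar,/opt/deps/kafka-connector.jar \
      /opt/jobs/my-job.jar
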
On 2022/10/27 06:48:52 Martijn Visser wrote:
> Hi Jacky Lau,
>
> Since you've sent the email

Thanks Jacky Lau for starting this discussion.
I understand that you are trying to find a convenient way to specify
dependency jars along with the user jar. However, let's try to narrow this
down by differentiating between deployment modes.
# Standalone mode
No matter whether you are using standalone mode on virtual
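For context, two ways dependency jars can already be provided to a standalone
cluster today (a rough sketch; all paths and jar names are placeholders):

  # 1) Drop the jars into the distribution's lib/ directory so every
  #    component picks them up on startup
  $ cp /opt/deps/my-udf.jar $FLINK_HOME/lib/
  # 2) Or add them to the user code classloader at submission time with
  #    -C/--classpath (the URL needs a protocol and must be reachable
  #    from all nodes)
  $ ./bin/flink run \
      -C file:///opt/deps/my-udf.jar \
      /opt/jobs/my-job.jar
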
Hi Jacky Lau,
Since you've sent the email to multiple mailing lists, I've decided to
reply to the one that you've sent to both the Dev and User ML.
> but it is not possible for platform users to create fat jars to package
all their dependencies into the final jar package
Can you elaborate on why

Hi guys,
I'd like to initiate a discussion about adding command-line arguments to
support user dependency jar packages.
Currently, Flink supports the user's main jar through -jarfile; without
setting this, the Flink client will treat the first argument after the
options as the user's main jar package when
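For illustration, the current main-jar handling looks roughly like this (a
sketch only; class, jar, and program-argument names are placeholders):

  # main jar passed explicitly with -j/--jarfile, program args follow
  $ ./bin/flink run -j /opt/jobs/my-job.jar --input /data/in
  # or the main jar given as the first non-option argument
  $ ./bin/flink run -c com.example.MyJob /opt/jobs/my-job.jar --input /data/in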