Hi,

But how can I change/configure it per submitted job rather than for the whole
cluster?

Best,
Georg

On Thu, Jun 25, 2020 at 10:07 AM Arvid Heise <ar...@ververica.com> wrote:

> Hi Georg,
>
> thank you for your detailed explanation. You want to use env.java.opts [1].
> There are flavors if you only want to set it on the job manager or the task
> managers, but I guess the basic form is good enough for you.
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-stable/ops/config.html#jvm-and-logging-options
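>
> A minimal sketch of how that could look in flink-conf.yaml (the config.file
> value is taken from your command quoted below; the jobmanager/taskmanager
> variants are only needed if you want to restrict where it applies):
>
> env.java.opts: "-Dconfig.file=config/jobs/twitter-analysis.conf"
> # or, restricted to one side:
> env.java.opts.jobmanager: "-Dconfig.file=config/jobs/twitter-analysis.conf"
> env.java.opts.taskmanager: "-Dconfig.file=config/jobs/twitter-analysis.conf"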
>
> On Wed, Jun 24, 2020 at 10:52 PM Georg Heiler <georg.kf.hei...@gmail.com>
> wrote:
>
>> Hi Arvid,
>>
>> thanks for the quick reply. I have a strong Apache Spark background. There,
>> when executing on YARN or locally, the cluster is usually created on demand
>> for the duration of the batch/streaming job, and there are only the concepts
>> of A) master/driver (application master), B) slave/executor, and C) driver:
>> the node where the main class is invoked. In Spark's notion, I want the -D
>> parameter to be available on the (C) driver node. When translating this to
>> Flink, I want it to be available to the main class which is invoked when the
>> job is submitted/started by the job manager (which should be equivalent to
>> the driver).
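>>
>> To make the Spark analogy concrete, this is roughly the kind of invocation I
>> have in mind there (illustrative only; the config path, class, and jar are
>> the ones from my Flink command further below):
>>
>> spark-submit \
>>   --conf "spark.driver.extraJavaOptions=-Dconfig.file=config/jobs/twitter-analysis.conf" \
>>   --class com.github.geoheil.streamingreference.tweets.TweetsAnalysis \
>>   "usecases/tweets/build/libs/tweets_${SCALA_VERSION}-${VERSION}-all.jar"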
>>
>> But maybe my understanding of Flink is not 100% correct yet.
>>
>> Unfortunately, using -D directly is not working.
>>
>> Best,
>> Georg
>>
>> On Wed, Jun 24, 2020 at 10:13 PM Arvid Heise <ar...@ververica.com> wrote:
>>
>>> Hi Georg,
>>>
>>> could you check whether simply using -D works, as described here [1]?
>>>
>>> If not, could you please be more precise: do you want the parameter to
>>> be passed to the driver, the job manager, or the task managers?
>>>
>>> [1]
>>> https://ci.apache.org/projects/flink/flink-docs-master/ops/cli.html#deployment-targets
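>>>
>>> For illustration only (the exact flags depend on the Flink version and
>>> deployment target; class and jar are copied from your command below, and
>>> note that options generally have to be given before the jar, since
>>> everything after it is passed to the main class as program arguments):
>>>
>>> flink run \
>>>   -Denv.java.opts="-Dconfig.file=config/jobs/twitter-analysis.conf" \
>>>   --class com.github.geoheil.streamingreference.tweets.TweetsAnalysis \
>>>   "usecases/tweets/build/libs/tweets_${SCALA_VERSION}-${VERSION}-all.jar"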
>>>
>>> On Wed, Jun 24, 2020 at 8:55 PM Georg Heiler <georg.kf.hei...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> how can I pass additional configuration parameters, like Spark's
>>>> extraJavaOptions, to a Flink job?
>>>>
>>>>
>>>> https://stackoverflow.com/questions/62562153/apache-flink-and-pureconfig-passing-java-properties-on-job-startup
>>>>
>>>> contains the details. But the gist is:
>>>> flink run \
>>>>   --class com.github.geoheil.streamingreference.tweets.TweetsAnalysis \
>>>>   "usecases/tweets/build/libs/tweets_${SCALA_VERSION}-${VERSION}-all.jar" \
>>>>   -yD env.java.opts="-Dconfig.file='config/jobs/twitter-analysis.conf'"
>>>>
>>>> is not passing the -Dconfig.file to the flink job!
>>>>
>>>> Best,
>>>> Georg
>>>>
>>>
>>>
>>>
>>
>
> --
>
> Arvid Heise | Senior Java Developer
>
> <https://www.ververica.com/>
>
> Follow us @VervericaData
>
> --
>
> Join Flink Forward <https://flink-forward.org/> - The Apache Flink
> Conference
>
> Stream Processing | Event Driven | Real Time
>
> --
>
> Ververica GmbH | Invalidenstrasse 115, 10115 Berlin, Germany
>
> --
> Ververica GmbH
> Registered at Amtsgericht Charlottenburg: HRB 158244 B
> Managing Directors: Timothy Alexander Steinert, Yip Park Tung Jason, Ji
> (Toni) Cheng
>
