Hi John,

ParameterTool is just a utility to help users handle arguments.
I guess you are using ParameterTool in the main method. If so, anything logged
there ends up in the client log file; like Yang said, it's under "{FLINK_HOME}/log".

> Do I check someConfig for whatever requirement and just throw an
> exception before starting the job or should I do System.exit();

I"m not sure what you exactly want.
Throwing an exception or System.exit would both fail the job (it depends on
where you codes are). However invoking System.exit is not always a good
practice.
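
For example, a minimal sketch of what that could look like in main() (just an
illustration; "some.config" is the key from your example and MyJob is a
placeholder class name):

import org.apache.flink.api.java.utils.ParameterTool;

public class MyJob {
    public static void main(String[] args) throws Exception {
        // main() runs on the client, so this exception (and anything logged
        // here) shows up on the console and in the client log under
        // {FLINK_HOME}/log.
        ParameterTool parameters = ParameterTool.fromArgs(args);
        String someConfig = parameters.get("some.config");
        if (someConfig == null || someConfig.isEmpty()) {
            throw new IllegalArgumentException(
                    "Missing mandatory parameter 'some.config'");
        }
        // ... build and execute the job with the validated parameters ...
    }
}

ParameterTool#getRequired("some.config") would also throw for you if the key
is missing, which saves the explicit check.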

Thanks,
Biao /'bɪ.aʊ/



On Fri, 17 Jan 2020 at 04:59, John Smith <java.dev....@gmail.com> wrote:

> Sorry I should have specified how to handle job specific config parameters
> using ParameterTool
>
> ParameterTool parameters = ...
>
> String someConfig = parameters.get("some.config"); <--- This is mandatory
>
>
> Do I check someConfig for whatever requirement and just throw an
> exception before starting the job or should I do System.exit(); Log it...
> Where does the log go if I log it?
>
> On Wed, 15 Jan 2020 at 22:21, Yang Wang <danrtsey...@gmail.com> wrote:
>
>> Hi John,
>>
>> Most of the config options have default values. However, you still
>> need to specify some required fields, for example the taskmanager
>> resource related options. If you do not specify any of them, an
>> exception will be thrown on the client side like the following.
>>
>> Exception in thread "main"
>> org.apache.flink.configuration.IllegalConfigurationException: Either Task
>> Heap Memory size (taskmanager.memory.task.heap.size) and Managed Memory
>> size (taskmanager.memory.managed.size), or Total Flink Memory size
>> (taskmanager.memory.flink.size), or Total Process Memory size
>> (taskmanager.memory.process.size) need to be configured explicitly.
>> at
>> org.apache.flink.runtime.clusterframework.TaskExecutorResourceUtils.resourceSpecFromConfig(TaskExecutorResourceUtils.java:149)
>> at
>> org.apache.flink.runtime.util.BashJavaUtils.getTmResourceJvmParams(BashJavaUtils.java:62)
>> at org.apache.flink.runtime.util.BashJavaUtils.main(BashJavaUtils.java:46)
>>
>>
>> Also, when you deploy Flink on a Yarn cluster, it will check the queue
>> configuration, resources, etc.
>> If a config exception is thrown during startup, the Flink client will fail
>> and print the exception on the console and in the client logs (usually in
>> the {FLINK_HOME}/log directory).
>>
>> However, not all config options can be checked on the client side.
>> For example, if you set a wrong checkpoint path, then you need to look
>> for the exceptions or errors in the jobmanager logs.
>>
>>
>>
>> Best,
>> Yang
>>
>> John Smith <java.dev....@gmail.com> wrote on Thu, 16 Jan 2020 at 00:38:
>>
>>> Hi, so I have no problem reading config from resource files or anything
>>> like that...
>>>
>>> But my question is around how do we handle mandatory fields?
>>>
>>> 1- If a mandatory field is missing during startup... Do we just "log" it
>>> and do System.exit()?
>>> 2- If we do log it, where does the log end up, the task or the job node?
>>>
>>
