I think you could use Flink's distributed cache to make the files available on all TaskManagers. For example,


env.registerCachedFile(cacheFilePath, "cacheFile", false);

and then use the following code to retrieve the registered file inside the operator:

getRuntimeContext().getDistributedCache().getFile("cacheFile")

This works for all deployment modes (standalone, YARN session/per-job).
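
For completeness, here is a minimal sketch of how the two calls fit together. The file path, the sample elements, and the RichMapFunction are only placeholders I made up to keep the example self-contained; adapt them to your job.

import java.io.File;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DistributedCacheExample {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Register the file on the client side; Flink ships it to every TaskManager.
        // "hdfs:///path/to/udtf.jar" is a placeholder; any file system Flink can read works.
        env.registerCachedFile("hdfs:///path/to/udtf.jar", "cacheFile", false);

        env.fromElements("a", "b", "c")
                .map(new RichMapFunction<String, String>() {
                    private transient File cachedFile;

                    @Override
                    public void open(Configuration parameters) {
                        // By the time open() runs, the file is already on the local TaskManager.
                        cachedFile = getRuntimeContext().getDistributedCache().getFile("cacheFile");
                    }

                    @Override
                    public String map(String value) {
                        return value + " -> " + cachedFile.getAbsolutePath();
                    }
                })
                .print();

        env.execute("distributed-cache-example");
    }
}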

Best,
Yang

魏旭斌 <a947199...@vip.qq.com> wrote on Wed, Apr 22, 2020, at 7:23 PM:

> I want to package my UDTF into a jar, then load the jar dynamically in the
> main method of the job and obtain the UDTF class.
> But this way Flink does not automatically distribute the jar to the
> TaskManagers, so it causes an error.
>
> I found that the Flink client provides the -C option.
>
> Passing -C with the function jar URL when running the job achieves the
> effect I want, but I don't want to operate the Flink client directly.
>
>
> On the other hand, the Apache Flink Dashboard does not provide a
> corresponding input parameter, and I could not find a REST API that meets
> the requirement either.
>
> Is there any good suggestion for distributing the function jar I submitted
> to all TaskManagers? Thanks.
>
>
