In Flink 1.11, there were some changes to how the flink-clients dependency is
bundled [1]. The error you're seeing is likely due to the flink-clients
module not being on the classpath anymore. Can you check your dependencies
and update the pom.xml as suggested in [1]?
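
If you build with Maven, the usual fix is to declare flink-clients
explicitly. A minimal sketch, assuming Scala 2.11 and Flink 1.11.3 to match
the distribution you downloaded (adjust the suffix and version to your
build):

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.11</artifactId>
        <version>1.11.3</version>
    </dependency>

flink-clients provides the ExecutorFactory implementations that
DefaultExecutorServiceLoader discovers via the ServiceLoader mechanism,
which is why the lookup in your stack trace comes back empty without it.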

Matthias

[1] https://flink.apache.org/news/2020/12/18/release-1.11.3.html

On Tue, May 4, 2021 at 1:00 PM Ragini Manjaiah <ragini.manja...@gmail.com>
wrote:

>
> As you suggested, I downloaded Flink 1.11.3 to submit a Flink job. The
> actual application is developed with Flink 1.8.1.
> Since the Hadoop cluster is Apache 3.2.0, I downloaded Flink 1.11.3
> (flink-1.11.3-bin-scala_2.11.tgz) and tried to submit the job.
> While submitting, I am facing the below-mentioned exception. I have set the
> HADOOP parameters:
>
>
> export HADOOP_CONF_DIR=/etc/hadoop/conf
>
> export HADOOP_CLASSPATH=`hadoop classpath`
>
>
> Are there any changes I need to make to the pom file to overcome this?
>
>
> org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: No ExecutorFactory found to execute the application.
>   at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302)
>   at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198)
>   at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149)
>   at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:699)
>   at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:232)
>   at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:916)
>   at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:992)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:422)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>   at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>   at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:992)
> Caused by: java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
>   at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:84)
>   at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1809)
>   at org.apache.flink.client.program.StreamContextEnvironment.executeAsync(StreamContextEnvironment.java:128)
>   at org.apache.flink.client.program.StreamContextEnvironment.execute(StreamContextEnvironment.java:76)
>   at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1700)
>   at org.sapphire.appspayload.StreamingJob.main(StreamingJob.java:214)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288)
>
> On Tue, May 4, 2021 at 11:47 AM Ragini Manjaiah <ragini.manja...@gmail.com>
> wrote:
>
>> Thank you for the clarification.
>>
>> On Mon, May 3, 2021 at 6:57 PM Matthias Pohl <matth...@ververica.com>
>> wrote:
>>
>>> Hi Ragini,
>>> this is a dependency version issue. Flink 1.8.x does not support Hadoop 3
>>> yet. Support for Apache Hadoop 3.x was added in Flink 1.11 [1] through
>>> FLINK-11086 [2]. You would need to upgrade to a more recent Flink version;
>>> see the sketch after the links below.
>>>
>>> Best,
>>> Matthias
>>>
>>> [1]
>>> https://flink.apache.org/news/2020/07/06/release-1.11.0.html#important-changes
>>> [2] https://issues.apache.org/jira/browse/FLINK-11086
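>>>
>>> If your application is built with Maven, the upgrade is mostly a matter
>>> of bumping the Flink version in the pom. A minimal sketch, assuming a
>>> Maven build with Scala 2.11 and a flink.version property (both
>>> assumptions; adjust to your setup):
>>>
>>>     <properties>
>>>         <flink.version>1.11.3</flink.version>
>>>     </properties>
>>>
>>>     <dependency>
>>>         <groupId>org.apache.flink</groupId>
>>>         <artifactId>flink-streaming-java_2.11</artifactId>
>>>         <version>${flink.version}</version>
>>>         <scope>provided</scope>
>>>     </dependency>
>>>
>>> Some APIs changed between 1.8 and 1.11, so the job code may need small
>>> adjustments on top of the version bump.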
>>>
>>> On Mon, May 3, 2021 at 3:05 PM Ragini Manjaiah <
>>> ragini.manja...@gmail.com> wrote:
>>>
>>>> Hi Team,
>>>> I have Flink 1.8.1 and open-source Hadoop 3.2.0. My Flink jobs run
>>>> without issues on HDP 2.5.3. When run on open-source Hadoop 3.2.0, I am
>>>> encountering the below-mentioned exception.
>>>> I have set the Hadoop environment variables:
>>>> export HADOOP_CONF_DIR=/etc/hadoop/conf
>>>> export HADOOP_CLASSPATH=`hadoop classpath`
>>>>
>>>>
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in [jar:file:/home_dir/svsap61/flink-1.8.1/lib/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in [jar:file:/usr/share/hadoop-tgt-3.2.0.1.0.0.11/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>>
>>>> java.lang.IllegalAccessError: tried to access method org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider.getProxyInternal()Ljava/lang/Object; from class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider
>>>>   at org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider.init(RequestHedgingRMFailoverProxyProvider.java:75)
>>>>   at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:188)
>>>>   at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:118)
>>>>   at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:93)
>>>>   at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72)
>>>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:195)
>>>>   at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:1013)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:274)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:454)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:97)
>>>>   at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:224)
>>>>   at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
>>>>   at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
>>>>   at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
>>>>   at java.security.AccessController.doPrivileged(Native Method)
>>>>   at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>>   at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>   at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
>>>>
>>>