Hello Yang,

I have attached my pom file, and I don't see any Hadoop dependency in it.
Could you please help me?
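
Even when the pom declares no direct Hadoop dependency, Hadoop classes can end up in the fat jar transitively. As a quick check, a small sketch like this can list any bundled Hadoop classes; the jar name is the one mentioned in the thread, and `find_hadoop_classes` is a hypothetical helper, not part of any tool discussed here:

```python
import zipfile

def find_hadoop_classes(jar_path):
    """List entries under org/apache/hadoop/ inside a jar (a jar is a zip file)."""
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist()
                if name.startswith("org/apache/hadoop/")]

# Example (jar name from the thread):
#   hits = find_hadoop_classes("events-processor-1.0-SNAPSHOT.jar")
# A non-empty result means the build bundles Hadoop and the dependency
# pulling it in should be excluded.
```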

On Wed, May 6, 2020 at 1:22 PM Yang Wang <danrtsey...@gmail.com> wrote:

> Hi aj,
>
> From the logs you have provided, the hadoop version is still 2.4.1.
> Could you check whether the user jar (i.e. events-processor-1.0-SNAPSHOT.jar)
> contains any Hadoop classes? If it does, you need to exclude the Hadoop
> dependency.
>
>
> Best,
> Yang
>
> aj <ajainje...@gmail.com> wrote on Wed, May 6, 2020 at 3:38 PM:
>
>> Hello,
>>
>> Please help me upgrade to 1.10 in AWS EMR.
>>
>> On Fri, May 1, 2020 at 4:05 PM aj <ajainje...@gmail.com> wrote:
>>
>>> Hi Yang,
>>>
>>> I am attaching the logs for your reference. Please help me figure out
>>> what I am doing wrong.
>>>
>>> Thanks,
>>> Anuj
>>>
>>> On Wed, Apr 29, 2020 at 9:06 AM Yang Wang <danrtsey...@gmail.com> wrote:
>>>
>>>> Hi Anuj,
>>>>
>>>> I think the exception you came across is still caused by the Hadoop
>>>> version being 2.4.1. I have checked the Hadoop code, and the lines are
>>>> exactly the same. For 2.8.1, I have also checked the ruleParser; it works.
>>>>
>>>> /**
>>>>  * A pattern for parsing a auth_to_local rule.
>>>>  */
>>>> private static final Pattern ruleParser =
>>>>     Pattern.compile("\\s*((DEFAULT)|(RULE:\\[(\\d*):([^\\]]*)](\\(([^)]*)\\))?" +
>>>>                     "(s/([^/]*)/([^/]*)/(g)?)?))/?(L)?");
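
The quoted pattern can be checked outside Hadoop. The sketch below re-creates it with Python's `re` (a hypothetical translation of the Java pattern; the regex syntax agrees for this expression) and shows that the trailing `(L)?` group is what accepts the `/L` suffix that the 2.4.1 parser rejects:

```python
import re

# Hadoop 2.8's ruleParser pattern, translated to Python. The final "(L)?"
# group accepts the lowercase flag that Hadoop 2.4.1's parser rejects with
# "Invalid rule: /L".
RULE_PARSER = re.compile(
    r"\s*((DEFAULT)|(RULE:\[(\d*):([^\]]*)](\(([^)]*)\))?"
    r"(s/([^/]*)/([^/]*)/(g)?)?))/?(L)?"
)

# The exact rule from the stack trace further down in this thread:
rule = "RULE:[2:$1@$0](.*@)s/@.*///L"
match = RULE_PARSER.match(rule)
# Group 12 is the trailing "(L)?" flag; with the 2.8 pattern it matches.
```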
>>>>
>>>>
>>>> Could you share the jobmanager logs so that I can check the classpath
>>>> and Hadoop version?
>>>>
>>>> Best,
>>>> Yang
>>>>
>>>> aj <ajainje...@gmail.com> wrote on Tue, Apr 28, 2020 at 1:01 AM:
>>>>
>>>>> Hello Yang,
>>>>> My Hadoop version is Hadoop 3.2.1-amzn-0, and I have put
>>>>> flink-shaded-hadoop-2-uber-2.8.3-10.0.jar in flink/lib.
>>>>>
>>>>> I then get the error below:
>>>>>
>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/mnt/yarn/usercache/hadoop/appcache/application_1587983834922_0002/filecache/10/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>>> Exception in thread "main" java.lang.IllegalArgumentException: Invalid
>>>>> rule: /L
>>>>>       RULE:[2:$1@$0](.*@)s/@.*///L
>>>>>       DEFAULT
>>>>>         at
>>>>> org.apache.hadoop.security.authentication.util.KerberosName.parseRules(KerberosName.java:321)
>>>>>         at
>>>>> org.apache.hadoop.security.authentication.util.KerberosName.setRules(KerberosName.java:386)
>>>>>         at
>>>>> org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:75)
>>>>>         at
>>>>> org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:247)
>>>>>         at
>>>>> org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
>>>>>         at
>>>>> org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:718)
>>>>>         at
>>>>> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:703)
>>>>>         at
>>>>> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:605)
>>>>>         at
>>>>> org.apache.flink.yarn.entrypoint.YarnEntrypointUtils.logYarnEnvironmentInformation(YarnEntrypointUtils.java:136)
>>>>>         at
>>>>> org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint.main(YarnJobClusterEntrypoint.java:109)
>>>>>
>>>>>
>>>>> If I remove flink-shaded-hadoop-2-uber-2.8.3-10.0.jar from lib,
>>>>> I get the error below:
>>>>>
>>>>> 2020-04-27 16:59:37,293 INFO  org.apache.flink.client.cli.CliFrontend
>>>>>                       -  Classpath:
>>>>> /usr/lib/flink/lib/flink-table-blink_2.11-1.10.0.jar:/usr/lib/flink/lib/flink-table_2.11-1.10.0.jar:/usr/lib/flink/lib/log4j-1.2.17.jar:/usr/lib/flink/lib/slf4j-log4j12-1.7.15.jar:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar::/etc/hadoop/conf:/etc/hadoop/conf
>>>>> 2020-04-27 16:59:37,293 INFO  org.apache.flink.client.cli.CliFrontend
>>>>>                       -
>>>>> --------------------------------------------------------------------------------
>>>>> 2020-04-27 16:59:37,300 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: jobmanager.heap.size, 1024m
>>>>> 2020-04-27 16:59:37,300 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: taskmanager.memory.process.size, 1568m
>>>>> 2020-04-27 16:59:37,300 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: taskmanager.numberOfTaskSlots, 1
>>>>> 2020-04-27 16:59:37,300 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: parallelism.default, 1
>>>>> 2020-04-27 16:59:37,300 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: env.yarn.conf.dir, /etc/hadoop/conf
>>>>> 2020-04-27 16:59:37,300 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: env.hadoop.conf.dir, /etc/hadoop/conf
>>>>> 2020-04-27 16:59:37,301 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: jobmanager.execution.failover-strategy, region
>>>>> 2020-04-27 16:59:37,301 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: classloader.resolve-order, parent-first
>>>>> 2020-04-27 16:59:37,301 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: s3.access-key, AKIA52DD5QA5FC7HPKXG
>>>>> 2020-04-27 16:59:37,301 INFO
>>>>>  org.apache.flink.configuration.GlobalConfiguration            - Loading
>>>>> configuration property: s3.secret-key, ******
>>>>> 2020-04-27 16:59:37,305 WARN  org.apache.flink.client.cli.CliFrontend
>>>>>                       - Could not load CLI class
>>>>> org.apache.flink.yarn.cli.FlinkYarnSessionCli.
>>>>> java.lang.NoClassDefFoundError:
>>>>> org/apache/hadoop/yarn/exceptions/YarnException
>>>>>         at java.lang.Class.forName0(Native Method)
>>>>>         at java.lang.Class.forName(Class.java:264)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.loadCustomCommandLine(CliFrontend.java:1076)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.loadCustomCommandLines(CliFrontend.java:1030)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:957)
>>>>> Caused by: java.lang.ClassNotFoundException:
>>>>> org.apache.hadoop.yarn.exceptions.YarnException
>>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
>>>>>         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
>>>>>         ... 5 more
>>>>> 2020-04-27 16:59:37,406 INFO  org.apache.flink.core.fs.FileSystem
>>>>>                       - Hadoop is not in the classpath/dependencies. The
>>>>> extended set of supported File Systems via Hadoop is not available.
>>>>> 2020-04-27 16:59:37,458 INFO
>>>>>  org.apache.flink.runtime.security.modules.HadoopModuleFactory  - Cannot
>>>>> create Hadoop Security Module because Hadoop cannot be found in the
>>>>> Classpath.
>>>>> 2020-04-27 16:59:37,476 INFO
>>>>>  org.apache.flink.runtime.security.modules.JaasModule          - Jaas file
>>>>> will be created as /tmp/jaas-7054453135321774613.conf.
>>>>> 2020-04-27 16:59:37,480 INFO
>>>>>  org.apache.flink.runtime.security.SecurityUtils               - Cannot
>>>>> install HadoopSecurityContext because Hadoop cannot be found in the
>>>>> Classpath.
>>>>> 2020-04-27 16:59:37,481 INFO  org.apache.flink.client.cli.CliFrontend
>>>>>                       - Running 'run' command.
>>>>> 2020-04-27 16:59:37,488 INFO  org.apache.flink.client.cli.CliFrontend
>>>>>                       - Building program from JAR file
>>>>> 2020-04-27 16:59:37,488 ERROR org.apache.flink.client.cli.CliFrontend
>>>>>                       - Invalid command line arguments.
>>>>> org.apache.flink.client.cli.CliArgsException: Could not build the
>>>>> program from JAR file.
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:203)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:895)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:968)
>>>>>         at
>>>>> org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:968)
>>>>> Caused by: java.io.FileNotFoundException: JAR file does not exist: -ynm
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.getJarFile(CliFrontend.java:719)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.buildProgram(CliFrontend.java:695)
>>>>>         at
>>>>> org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:200)
>>>>>         ... 4 more
>>>>>
>>>>>
>>>>> Could you please help?
>>>>>
>>>>> Thanks,
>>>>> Anuj
>>>>>
>>>>>
>>>>> On Mon, Apr 13, 2020 at 7:43 AM Yang Wang <danrtsey...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi Anuj,
>>>>>>
>>>>>> It seems that you are using Hadoop version 2.4.1. I think the "L" flag
>>>>>> is not supported in this version. Could you upgrade your Hadoop version
>>>>>> to 2.8 and try again? If your YARN cluster version is 2.8+, you can
>>>>>> simply remove the flink-shaded-hadoop jar from your lib directory.
>>>>>> Otherwise, you need to download flink-shaded-hadoop version 2.8
>>>>>> here [1].
>>>>>>
>>>>>>
>>>>>> [1]. https://flink.apache.org/downloads.html#additional-components
>>>>>>
>>>>>> Best,
>>>>>> Yang
>>>>>>
>>>>>> aj <ajainje...@gmail.com> wrote on Sat, Apr 11, 2020 at 4:21 AM:
>>>>>>
>>>>>>> Hi Robert,
>>>>>>> attached the full application log file.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Anuj
>>>>>>> <http://www.cse.iitm.ac.in/%7Eanujjain/>
>>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>> Thanks & Regards,
>>>>> Anuj Jain
>>>>> Mob. : +91- 8588817877
>>>>> Skype : anuj.jain07
>>>>> <http://www.oracle.com/>
>>>>>
>>>>>
>>>>> <http://www.cse.iitm.ac.in/%7Eanujjain/>
>>>>>
>>>>
>>>
>>
>>
>

-- 
Thanks & Regards,
Anuj Jain
Mob. : +91- 8588817877
Skype : anuj.jain07
<http://www.oracle.com/>


<http://www.cse.iitm.ac.in/%7Eanujjain/>
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>event-stream-processor</artifactId>
    <packaging>pom</packaging>
    <version>1.0-SNAPSHOT</version>
    <modules>
        <module>events-processor</module>
        <module>aggregation-lib</module>
    </modules>
    <properties>
        <kafka.version>2.3.0</kafka.version>
        <confluent.version>5.3.2</confluent.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.build.outputEncoding>UTF-8</project.build.outputEncoding>
        <maven.compiler>2.0.2</maven.compiler>
        <avro.version>1.9.1</avro.version>
        <maven-shade-plugin.version>2.3</maven-shade-plugin.version>
        <flink.version>1.10.0</flink.version>
        <flink.format.parquet.version>1.11.0</flink.format.parquet.version>
        <hadoop.version>2.4.1</hadoop.version>
        <scala.binary.version>2.12</scala.binary.version>
    </properties>

    <repositories>
        <repository>
            <id>confluent</id>
            <url>http://packages.confluent.io/maven/</url>
        </repository>
    </repositories>

    <dependencies>
        <!-- Flink dependencies -->

        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-avro-serializer</artifactId>
            <version>5.3.0</version>
            <!--<exclusions>
                <exclusion>
                    <groupId>org.codehaus.jackson</groupId>
                    <artifactId>jackson-core-asl</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.codehaus.jackson</groupId>
                    <artifactId>jackson-mapper-asl</artifactId>
                </exclusion>
            </exclusions>-->
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>${flink.format.parquet.version}</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.7</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-parquet_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>2.10.5</version>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.dataformat</groupId>
            <artifactId>jackson-dataformat-yaml</artifactId>
            <version>2.10.2</version>
            <exclusions>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-core</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.esotericsoftware.yamlbeans/yamlbeans -->
        <dependency>
            <groupId>com.esotericsoftware.yamlbeans</groupId>
            <artifactId>yamlbeans</artifactId>
            <version>1.13</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.uber/h3 -->
        <dependency>
            <groupId>com.uber</groupId>
            <artifactId>h3</artifactId>
            <version>3.6.3</version>
        </dependency>

        <dependency>
            <groupId>com.github.davidmoten</groupId>
            <artifactId>geo</artifactId>
            <version>0.7.7</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-elasticsearch7_2.11 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>tech.allegro.schema.json2avro</groupId>
            <artifactId>converter</artifactId>
            <version>0.2.9</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-statebackend-rocksdb_2.12</artifactId>
            <version>1.10.0</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.13</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro-confluent-registry</artifactId>
            <version>${flink.version}</version>
        </dependency>


        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bundle</artifactId>
            <version>1.11.762</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.bahir/flink-connector-redis -->
        <dependency>
            <groupId>org.apache.bahir</groupId>
            <artifactId>flink-connector-redis_2.11</artifactId>
            <version>1.0</version>
        </dependency>

        <dependency>
            <groupId>com.redislabs</groupId>
            <artifactId>jredisearch</artifactId>
            <version>1.6.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/redis.clients/jedis -->
        <dependency>
            <groupId>redis.clients</groupId>
            <artifactId>jedis</artifactId>
            <version>3.2.0</version>
        </dependency>


    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.6.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.1.1</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <artifactSet>
                                <excludes>
                                    <exclude>com.google.code.findbugs:jsr305</exclude>
                                    <exclude>org.slf4j:*</exclude>
                                    <exclude>log4j:*</exclude>
                                </excludes>
                            </artifactSet>
                            <filters>
                                <filter>
                                    <!-- Do not copy the signatures in the META-INF folder.
                                    Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>my.programs.main.clazz</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

        </plugins>
    </build>
</project>
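
One thing worth noting in the pom above: `<hadoop.version>` is pinned to 2.4.1, and even though no dependency references it directly, Hadoop classes can still arrive transitively (for example via connector or format artifacts). One way to spot the offenders is to scan the output of `mvn dependency:tree`; the helper below is a hypothetical sketch that filters that output, not an existing Maven feature:

```python
def hadoop_artifacts(dependency_tree_text):
    """Return the lines of `mvn dependency:tree` output that mention
    org.apache.hadoop, i.e. candidate dependencies to exclude."""
    return [line.strip() for line in dependency_tree_text.splitlines()
            if "org.apache.hadoop" in line]

# Usage sketch: write the tree to a file, then filter it.
#   mvn dependency:tree -DoutputFile=tree.txt
#   hadoop_artifacts(open("tree.txt").read())
```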
