Can you share the full stack trace, not just a part of it?

On Thu, Sep 9, 2021 at 1:43 PM Harshvardhan Shinde <
harshvardhan.shi...@oyorooms.com> wrote:

> Hi,
>
> I added the dependencies while trying to resolve the same issue; I thought I
> was missing them.
>
> Thanks
>
> On Thu, Sep 9, 2021 at 4:26 PM Robert Metzger <rmetz...@apache.org> wrote:
>
>> Hey,
>>
>> Why do you have these dependencies in your pom?
>>
>>         <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
>>         <dependency>
>>             <groupId>org.apache.kafka</groupId>
>>             <artifactId>kafka-clients</artifactId>
>>             <version>2.8.0</version>
>>         </dependency>
>>
>>         <dependency>
>>             <groupId>org.apache.kafka</groupId>
>>             <artifactId>kafka_2.12</artifactId>
>>             <version>2.8.0</version>
>>         </dependency>
>>
>>
>> They are not needed for using Flink's Kafka connector: the flink-connector-kafka
>> dependency already pulls in the required Kafka client dependencies.
>>
>>
>> On Thu, Sep 9, 2021 at 12:02 PM Harshvardhan Shinde <
>> harshvardhan.shi...@oyorooms.com> wrote:
>>
>>> Hi,
>>>
>>> I'm trying a simple Flink job that reads data from a Kafka topic and
>>> creates a Hive table.
>>>
>>> I'm following the steps from here
>>> <https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/table/hive/overview/#connecting-to-hive>
>>> .
>>>
>>> Here's my code:
>>>
>>> import org.apache.flink.table.api.EnvironmentSettings;
>>> import org.apache.flink.table.api.Table;
>>> import org.apache.flink.table.api.TableEnvironment;
>>> import org.apache.flink.table.catalog.hive.HiveCatalog;
>>>
>>> EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
>>> TableEnvironment tableEnv = TableEnvironment.create(settings);
>>>
>>> String name            = "myhive";
>>> String defaultDatabase = "harsh_test";
>>> String hiveConfDir     = "/etc/hive/conf";
>>>
>>> HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
>>> tableEnv.registerCatalog(name, hive);
>>>
>>> // set the HiveCatalog as the current catalog of the session
>>> tableEnv.useCatalog(name);
>>>
>>> tableEnv.executeSql("CREATE TABLE IF NOT EXISTS transactions (\n" +
>>>       "  `created_at` TIMESTAMP(3) METADATA FROM 'timestamp',\n" +
>>>       "      `partition` BIGINT METADATA VIRTUAL,\n" +
>>>       "      `offset` BIGINT METADATA VIRTUAL,\n" +
>>>       "    account_id  BIGINT,\n" +
>>>       "    amount      BIGINT,\n" +
>>>       "    transaction_time TIMESTAMP(3),\n" +
>>>       "    WATERMARK FOR transaction_time AS transaction_time - INTERVAL '5' SECOND\n" +
>>>       ") WITH (\n" +
>>>       "    'connector' = 'kafka',\n" +
>>>       "    'topic'     = 'flink-stream-table',\n" +
>>>       "    'properties.bootstrap.servers' = '<BROKER_ADDRESS>:9092',\n" +
>>>       "   'scan.startup.mode' = 'earliest-offset',\n" +
>>>       "    'format'    = 'csv'\n" +
>>>       ")");
>>>
>>> Table table = tableEnv.sqlQuery("Select * from transactions");
>>> table.execute().print();
>>>
>>> The code builds successfully, but I'm getting the following runtime
>>> error:
>>>
>>> Caused by: java.util.concurrent.CompletionException:
>>> java.lang.NoClassDefFoundError:
>>> org/apache/kafka/common/serialization/ByteArrayDeserializer at
>>> java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
>>> at
>>> java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
>>> at
>>> java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1606)
>>> ..
>>>
>>> Here are my pom.xml file contents:
>>>
>>> <!--
>>> Licensed to the Apache Software Foundation (ASF) under one
>>> or more contributor license agreements.  See the NOTICE file
>>> distributed with this work for additional information
>>> regarding copyright ownership.  The ASF licenses this file
>>> to you under the Apache License, Version 2.0 (the
>>> "License"); you may not use this file except in compliance
>>> with the License.  You may obtain a copy of the License at
>>>
>>>   http://www.apache.org/licenses/LICENSE-2.0
>>>
>>> Unless required by applicable law or agreed to in writing,
>>> software distributed under the License is distributed on an
>>> "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
>>> KIND, either express or implied.  See the License for the
>>> specific language governing permissions and limitations
>>> under the License.
>>> -->
>>> <project xmlns="http://maven.apache.org/POM/4.0.0"
>>>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>>>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>>     <modelVersion>4.0.0</modelVersion>
>>>
>>>     <groupId>com.harsh.test</groupId>
>>>     <artifactId>harsh-flink-test</artifactId>
>>>     <version>1.0-SNAPSHOT</version>
>>>     <packaging>jar</packaging>
>>>
>>>     <name>Flink Quickstart Job</name>
>>>     <url>http://www.myorganization.org</url>
>>>
>>>     <properties>
>>>         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>>         <flink.version>1.13.2</flink.version>
>>>         <java.version>1.8</java.version>
>>>         <hive.version>2.3.6</hive.version>
>>>         <scala.binary.version>2.12</scala.binary.version>
>>>         <maven.compiler.source>${java.version}</maven.compiler.source>
>>>         <maven.compiler.target>${java.version}</maven.compiler.target>
>>>     </properties>
>>>
>>>     <repositories>
>>>         <repository>
>>>             <id>apache.snapshots</id>
>>>             <name>Apache Development Snapshot Repository</name>
>>>             <url>https://repository.apache.org/content/repositories/snapshots/</url>
>>>             <releases>
>>>                 <enabled>false</enabled>
>>>             </releases>
>>>             <snapshots>
>>>                 <enabled>true</enabled>
>>>             </snapshots>
>>>         </repository>
>>>     </repositories>
>>>
>>>     <dependencies>
>>>         <!-- Apache Flink dependencies -->
>>>         <!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-java</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>
>>>         <!-- Add connector dependencies here. They must be in the default scope (compile). -->
>>>
>>>         <!-- Example:
>>>
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>         -->
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-table-planner -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-table-planner_2.12</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>
>>>         <!-- Flink Dependency -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
>>>             <version>${flink.version}</version>
>>>         </dependency>
>>>
>>>         <!-- Hive Dependency -->
>>>         <dependency>
>>>             <groupId>org.apache.hive</groupId>
>>>             <artifactId>hive-exec</artifactId>
>>>             <version>${hive.version}</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/javax.servlet/javax.servlet-api -->
>>>         <dependency>
>>>             <groupId>javax.servlet</groupId>
>>>             <artifactId>javax.servlet-api</artifactId>
>>>             <version>3.1.0</version>
>>>             <scope>provided</scope>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.htrace/htrace-core4 -->
>>>         <dependency>
>>>             <groupId>org.apache.htrace</groupId>
>>>             <artifactId>htrace-core4</artifactId>
>>>             <version>4.0.1-incubating</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/commons-configuration/commons-configuration -->
>>>         <dependency>
>>>             <groupId>commons-configuration</groupId>
>>>             <artifactId>commons-configuration</artifactId>
>>>             <version>1.10</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/commons-logging/commons-logging -->
>>>         <dependency>
>>>             <groupId>commons-logging</groupId>
>>>             <artifactId>commons-logging</artifactId>
>>>             <version>1.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop-2 -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-shaded-hadoop-2</artifactId>
>>>             <version>2.8.3-10.0</version>
>>>         </dependency>
>>>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-hadoop-compatibility -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-hadoop-compatibility_2.12</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-hadoop-fs -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-hadoop-fs</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-csv -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-csv</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-json -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-json</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-hive-1.2.2 -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-sql-connector-hive-2.3.6_2.12</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
>>>         <dependency>
>>>             <groupId>org.apache.flink</groupId>
>>>             <artifactId>flink-core</artifactId>
>>>             <version>1.13.2</version>
>>>         </dependency>
>>>
>>>         <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
>>>         <dependency>
>>>             <groupId>org.apache.kafka</groupId>
>>>             <artifactId>kafka-clients</artifactId>
>>>             <version>2.8.0</version>
>>>         </dependency>
>>>
>>>         <dependency>
>>>             <groupId>org.apache.kafka</groupId>
>>>             <artifactId>kafka_2.12</artifactId>
>>>             <version>2.8.0</version>
>>>         </dependency>
>>>
>>>
>>>         <!-- Add logging framework, to produce console output when running in the IDE. -->
>>>         <!-- These dependencies are excluded from the application JAR by default. -->
>>>         <dependency>
>>>             <groupId>org.slf4j</groupId>
>>>             <artifactId>slf4j-log4j12</artifactId>
>>>             <version>1.7.7</version>
>>>             <scope>runtime</scope>
>>>         </dependency>
>>>         <dependency>
>>>             <groupId>log4j</groupId>
>>>             <artifactId>log4j</artifactId>
>>>             <version>1.2.17</version>
>>>             <scope>runtime</scope>
>>>         </dependency>
>>>     </dependencies>
>>>
>>>     <build>
>>>         <plugins>
>>>
>>>             <!-- Java Compiler -->
>>>             <plugin>
>>>                 <groupId>org.apache.maven.plugins</groupId>
>>>                 <artifactId>maven-compiler-plugin</artifactId>
>>>                 <version>3.1</version>
>>>                 <configuration>
>>>                     <source>${java.version}</source>
>>>                     <target>${java.version}</target>
>>>                 </configuration>
>>>             </plugin>
>>>
>>>             <!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
>>>             <!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
>>>             <plugin>
>>>                 <groupId>org.apache.maven.plugins</groupId>
>>>                 <artifactId>maven-shade-plugin</artifactId>
>>>                 <version>3.0.0</version>
>>>                 <executions>
>>>                     <!-- Run shade goal on package phase -->
>>>                     <execution>
>>>                         <phase>package</phase>
>>>                         <goals>
>>>                             <goal>shade</goal>
>>>                         </goals>
>>>                         <configuration>
>>>                             <artifactSet>
>>>                                 <excludes>
>>>                                     <exclude>org.apache.flink:force-shading</exclude>
>>>                                     <exclude>com.google.code.findbugs:jsr305</exclude>
>>>                                     <exclude>org.slf4j:*</exclude>
>>>                                     <exclude>log4j:*</exclude>
>>>                                 </excludes>
>>>                             </artifactSet>
>>>                             <filters>
>>>                                 <filter>
>>>                                     <!-- Do not copy the signatures in the META-INF folder.
>>>                                     Otherwise, this might cause SecurityExceptions when using the JAR. -->
>>>                                     <artifact>*:*</artifact>
>>>                                     <excludes>
>>>                                         <exclude>META-INF/*.SF</exclude>
>>>                                         <exclude>META-INF/*.DSA</exclude>
>>>                                         <exclude>META-INF/*.RSA</exclude>
>>>                                     </excludes>
>>>                                 </filter>
>>>                             </filters>
>>>                             <transformers>
>>>                                 <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>>>                                     <mainClass>com.harsh.test.StreamingJob</mainClass>
>>>                                 </transformer>
>>>                             </transformers>
>>>                         </configuration>
>>>                     </execution>
>>>                 </executions>
>>>             </plugin>
>>>         </plugins>
>>>
>>>         <pluginManagement>
>>>             <plugins>
>>>
>>>                 <!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
>>>                 <plugin>
>>>                     <groupId>org.eclipse.m2e</groupId>
>>>                     <artifactId>lifecycle-mapping</artifactId>
>>>                     <version>1.0.0</version>
>>>                     <configuration>
>>>                         <lifecycleMappingMetadata>
>>>                             <pluginExecutions>
>>>                                 <pluginExecution>
>>>                                     <pluginExecutionFilter>
>>>                                         <groupId>org.apache.maven.plugins</groupId>
>>>                                         <artifactId>maven-shade-plugin</artifactId>
>>>                                         <versionRange>[3.0.0,)</versionRange>
>>>                                         <goals>
>>>                                             <goal>shade</goal>
>>>                                         </goals>
>>>                                     </pluginExecutionFilter>
>>>                                     <action>
>>>                                         <ignore/>
>>>                                     </action>
>>>                                 </pluginExecution>
>>>                                 <pluginExecution>
>>>                                     <pluginExecutionFilter>
>>>                                         <groupId>org.apache.maven.plugins</groupId>
>>>                                         <artifactId>maven-compiler-plugin</artifactId>
>>>                                         <versionRange>[3.1,)</versionRange>
>>>                                         <goals>
>>>                                             <goal>testCompile</goal>
>>>                                             <goal>compile</goal>
>>>                                         </goals>
>>>                                     </pluginExecutionFilter>
>>>                                     <action>
>>>                                         <ignore/>
>>>                                     </action>
>>>                                 </pluginExecution>
>>>                             </pluginExecutions>
>>>                         </lifecycleMappingMetadata>
>>>                     </configuration>
>>>                 </plugin>
>>>             </plugins>
>>>         </pluginManagement>
>>>     </build>
>>>
>>>     <!-- This profile helps to make things run out of the box in IntelliJ. -->
>>>     <!-- It adds Flink's core classes to the runtime class path. -->
>>>     <!-- Otherwise they are missing in IntelliJ, because the dependency is 'provided'. -->
>>>     <profiles>
>>>         <profile>
>>>             <id>add-dependencies-for-IDEA</id>
>>>
>>>             <activation>
>>>                 <property>
>>>                     <name>idea.version</name>
>>>                 </property>
>>>             </activation>
>>>
>>>             <dependencies>
>>>                 <dependency>
>>>                     <groupId>org.apache.flink</groupId>
>>>                     <artifactId>flink-java</artifactId>
>>>                     <version>${flink.version}</version>
>>>                     <scope>compile</scope>
>>>                 </dependency>
>>>                 <dependency>
>>>                     <groupId>org.apache.flink</groupId>
>>>                     <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
>>>                     <version>${flink.version}</version>
>>>                     <scope>compile</scope>
>>>                 </dependency>
>>>             </dependencies>
>>>         </profile>
>>>     </profiles>
>>>
>>> </project>
>>>
>>>
>>> Please help me resolve the issue.
>>>
>>> Thanks
>>>
>
> --
> Thanks and Regards,
> Harshvardhan
> Data Platform
>
