I added kafka to my dependencies, although I am not sure why this would be
required... it seems to work:

                <dependency>
                        <groupId>org.apache.kafka</groupId>
                        <artifactId>kafka_${kafka.scala.version}</artifactId>
                        <version>${kafka.version}</version>
                </dependency>

This is my full dependency list...

<dependencies>
                
                
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-java</artifactId>
                        <version>${flink.version}</version>
                        <scope>provided</scope>
                </dependency>
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
                        <version>${flink.version}</version>
                        <scope>provided</scope>
                </dependency>

                
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-connector-kafka-0.11_${scala.binary.version}</artifactId>
                        <version>${flink.version}</version>
                </dependency>

                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-statebackend-rocksdb_${scala.binary.version}</artifactId>
                        <version>${flink.version}</version>
                </dependency>

                
                
                <dependency>
                        <groupId>org.slf4j</groupId>
                        <artifactId>slf4j-log4j12</artifactId>
                        <version>1.7.7</version>
                        <scope>runtime</scope>
                </dependency>
                <dependency>
                        <groupId>log4j</groupId>
                        <artifactId>log4j</artifactId>
                        <version>1.2.17</version>
                        <scope>runtime</scope>
                </dependency>
                <dependency>
                        <groupId>eu.neurocom</groupId>
                        <artifactId>mip-model-poc</artifactId>
                        <version>1.0</version>
                </dependency>
                <dependency>
                        <groupId>org.apache.flink</groupId>
                        <artifactId>flink-avro</artifactId>
                        <version>${flink.version}</version>
                </dependency>
                <dependency>
                        <groupId>io.confluent</groupId>
                        <artifactId>kafka-avro-serializer</artifactId>
                        <version>4.0.0</version>
                </dependency>
                <dependency>
                        <groupId>org.apache.kafka</groupId>
                        <artifactId>kafka_${kafka.scala.version}</artifactId>
                        <version>${kafka.version}</version>
                </dependency>
        </dependencies>
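In case it matters, the io.confluent artifact resolves from the Confluent Maven repository, which I have declared roughly like this (I am assuming the standard Confluent repository URL here):

```xml
<repositories>
        <repository>
                <id>confluent</id>
                <url>https://packages.confluent.io/maven/</url>
        </repository>
</repositories>
```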

This does solve the issue, but now I am getting the following error...


java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to eu.neurocom.avro.CelloAvro
        at eu.neurocom.schema.ConfluentAvroDeserializationSchema.deserialize(ConfluentAvroDeserializationSchema.java:37)
        at eu.neurocom.schema.ConfluentAvroDeserializationSchema.deserialize(ConfluentAvroDeserializationSchema.java:16)
        at org.apache.flink.streaming.util.serialization.KeyedDeserializationSchemaWrapper.deserialize(KeyedDeserializationSchemaWrapper.java:42)
        at org.apache.flink.streaming.connectors.kafka.internal.Kafka09Fetcher.runFetchLoop(Kafka09Fetcher.java:139)
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:652)
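For reference, my deserialization schema is roughly the following (a simplified sketch, not the exact code; the registry URL and topic name are placeholders):

```java
// Simplified sketch of ConfluentAvroDeserializationSchema (details differ
// from the real class; CelloAvro is my Avro-generated specific record).
public class ConfluentAvroDeserializationSchema
                implements DeserializationSchema<CelloAvro> {

        private final String registryUrl = "http://localhost:8081"; // placeholder
        private transient KafkaAvroDeserializer deserializer;

        @Override
        public CelloAvro deserialize(byte[] message) {
                if (deserializer == null) {
                        Map<String, Object> props = new HashMap<>();
                        props.put("schema.registry.url", registryUrl);
                        // Note: without "specific.avro.reader" set to true, the
                        // Confluent deserializer returns a GenericData.Record,
                        // which would explain the ClassCastException on the
                        // cast below.
                        deserializer = new KafkaAvroDeserializer();
                        deserializer.configure(props, false);
                }
                return (CelloAvro) deserializer.deserialize("topic", message);
        }

        @Override
        public boolean isEndOfStream(CelloAvro nextElement) {
                return false;
        }

        @Override
        public TypeInformation<CelloAvro> getProducedType() {
                return TypeExtractor.getForClass(CelloAvro.class);
        }
}
```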





--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
