[ https://issues.apache.org/jira/browse/FLINK-20470?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jark Wu closed FLINK-20470.
---------------------------
    Fix Version/s: 1.12.1
                   1.13.0
       Resolution: Fixed

Fixed in
- master: 29e122d539caebe9dd5717439eda12b252e26d41
- release-1.12: 9a2d4c2e23de074e0aafebf72c23d5a95cf76614

> MissingNode can't be cast to ObjectNode when deserializing JSON
> -----------------------------------------------------------------
>
>                 Key: FLINK-20470
>                 URL: https://issues.apache.org/jira/browse/FLINK-20470
>             Project: Flink
>          Issue Type: Bug
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile), Table SQL / Ecosystem
>    Affects Versions: 1.12.0, 1.11.2
>            Reporter: Jark Wu
>            Assignee: zhuxiaoshang
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.13.0, 1.12.1
>
> {code}
> Caused by: java.io.IOException: Failed to deserialize JSON ''.
> 	at org.apache.flink.formats.json.JsonRowDataDeserializationSchema.deserialize(JsonRowDataDeserializationSchema.java:126) ~[flink-json-1.11.2.jar:1.11.2]
> 	at org.apache.flink.formats.json.JsonRowDataDeserializationSchema.deserialize(JsonRowDataDeserializationSchema.java:76) ~[flink-json-1.11.2.jar:1.11.2]
> 	at org.apache.flink.api.common.serialization.DeserializationSchema.deserialize(DeserializationSchema.java:81) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.connectors.kafka.internals.KafkaDeserializationSchemaWrapper.deserialize(KafkaDeserializationSchemaWrapper.java:56) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:181) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:141) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:755) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
> 	at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
> Caused by: java.lang.ClassCastException: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.MissingNode cannot be cast to org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
> {code}
> Currently, we only check {{jsonNode == null || jsonNode.isNull()}} for a nullable node; I think we should also take MissingNode into account.
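>
> For illustration, a minimal sketch of the broadened check suggested above, assuming Jackson's {{JsonNode}} API as shaded in Flink (the class and helper name {{isNullOrMissing}} are hypothetical, not the actual Flink code):
> {code}
> import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
>
> public final class JsonNodeChecks {
>
>     private JsonNodeChecks() {}
>
>     // Treat a null reference, an explicit JSON null, and a MissingNode
>     // (e.g. the result of parsing empty input) all as "no value",
>     // instead of blindly casting the node to ObjectNode.
>     public static boolean isNullOrMissing(JsonNode jsonNode) {
>         return jsonNode == null || jsonNode.isNull() || jsonNode.isMissingNode();
>     }
> }
> {code}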