Maneendra created FLINK-36765:
---------------------------------

             Summary: How to Handle Multi-Type Maps in Avro Schema with Flink 
Table API?
                 Key: FLINK-36765
                 URL: https://issues.apache.org/jira/browse/FLINK-36765
             Project: Flink
          Issue Type: Bug
          Components: API / Type Serialization System, Formats (JSON, Avro, 
Parquet, ORC, SequenceFile)
            Reporter: Maneendra


 


I have a map field in my Avro schema whose values are a union of multiple data 
types, and I am trying to use it with the Flink Table API to read data from 
Kafka. However, I'm encountering the exception below because the Flink 
AvroSchemaConverter does not support maps with mixed value types. Could someone 
assist me in parsing this schema using the Table API?

Flink Code:

String avroSchema = "................";

DataType s = AvroSchemaConverter.convertToDataType(avroSchema);
Schema schema1 = Schema.newBuilder().fromRowDataType(s).build();

TableDescriptor descriptor = TableDescriptor.forConnector("kafka")
        .schema(schema1)  // note: was .schema(schema), which does not compile; the variable is schema1
        .comment("simple comment")
        .option("topic", "****")
        .option("properties.application.id", "****")
        .option("properties.security.protocol", "********")
        .option("properties.bootstrap.servers", "********")
        .option("properties.group.id", "********")
        .option("properties.auto.offset.reset", "earliest")
        .option("format", "avro")
        .build();
Avro Schema:
{
   "name":"standByProperties",
   "type":[
      "null",
      {
         "type":"map",
         "values":[
            "null",
            "boolean",
            "int"
         ]
      }
   ]
},

Output:

standByProperties MAP<STRING NOT NULL, RAW('java.lang.Object', ?) NOT NULL>

Exception:

Exception in thread "main" java.lang.UnsupportedOperationException: Unsupported to derive Schema for type: RAW('java.lang.Object', ?) NOT NULL
    at org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.convertToSchema(AvroSchemaConverter.java:580)
    at org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.convertToSchema(AvroSchemaConverter.java:416)
    at org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.convertToSchema(AvroSchemaConverter.java:568)
    at org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.convertToSchema(AvroSchemaConverter.java:549)
    at org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.convertToSchema(AvroSchemaConverter.java:416)
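As a plain-Java illustration of why this happens (an assumption about the converter's behavior, not its exact internals): an Avro union of ["null", "boolean", "int"] has no common Java value type other than Object, so the derived Flink type falls back to RAW('java.lang.Object', ?), which convertToSchema cannot translate back into an Avro schema. The class and field names below are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

public class MixedMapDemo {
    public static void main(String[] args) {
        // The only Java map type that can hold both booleans and ints
        // is Map<String, Object> -- hence the RAW('java.lang.Object', ?) value type.
        Map<String, Object> standByProperties = new HashMap<>();
        standByProperties.put("charging", Boolean.TRUE);
        standByProperties.put("retries", 3); // autoboxed to Integer

        System.out.println(standByProperties.get("charging").getClass().getSimpleName()); // prints "Boolean"
        System.out.println(standByProperties.get("retries").getClass().getSimpleName());  // prints "Integer"
    }
}
```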

What I Tried:

I defined an Avro schema that includes a map field with values of mixed data 
types. I used the Flink Table API to read data from Kafka and attempted to use 
AvroSchemaConverter to map the schema to a Flink table. During execution, I 
encountered the exception above because the AvroSchemaConverter does not 
support maps with multiple value types.

What I Was Expecting:

I was expecting Flink to handle the map field and correctly parse the data 
into a table format, with proper support for the mixed data types within the 
map.
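One possible workaround, assuming the producer's Avro schema can be changed (the field names below are illustrative, not from the original schema), is to split the mixed-type map into two maps whose value unions each contain a single non-null type, which AvroSchemaConverter can translate to MAP<STRING, BOOLEAN> and MAP<STRING, INT> respectively:

{
   "name":"standByFlags",
   "type":[
      "null",
      { "type":"map", "values":["null","boolean"] }
   ]
},
{
   "name":"standByCounters",
   "type":[
      "null",
      { "type":"map", "values":["null","int"] }
   ]
}

Alternatively, if the schema cannot be changed, declaring the column manually in the Table API (rather than deriving it via convertToDataType) only helps if the chosen type is one the avro format can actually deserialize, so restructuring the schema upstream is likely the more robust option.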



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
