Re: Flink 1.12 cannot handle large schema

2020-10-02 Thread Lian Jiang
Appreciate Arvid for the JIRA and the workaround. I will monitor the JIRA status and retry when the fix is available. I can help test the fix when it is in a private branch. Thanks. Regards!

On Fri, Oct 2, 2020 at 3:57 AM Arvid Heise wrote:
> Hi Lian,
>
> Thank you for reporting. It looks like a

Re: Flink 1.12 cannot handle large schema

2020-10-02 Thread Arvid Heise
Hi Lian,

Thank you for reporting. It looks like a bug to me and I created a ticket [1]. You have two options: wait for the fix, or implement the fix yourself (copy AvroSerializerSnapshot and use another way to write/read the schema), then subclass AvroSerializer to use your snapshot. Of course, we
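For what it's worth, here is a minimal sketch of the suggested alternative write/read path, assuming the snapshot is free to choose its own encoding: write the schema as a 4-byte length prefix followed by raw UTF-8 bytes instead of DataOutputStream.writeUTF (which caps the encoded string at 65535 bytes). The helper names below are made up for illustration; wiring this into a copied AvroSerializerSnapshot and a subclassed AvroSerializer is left out.

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import org.apache.avro.Schema;

// Hypothetical helpers: length-prefixed raw bytes avoid writeUTF's 64 KB limit.
final class LargeSchemaIo {

    static void writeSchema(DataOutputStream out, Schema schema) throws IOException {
        byte[] bytes = schema.toString(false).getBytes(StandardCharsets.UTF_8);
        out.writeInt(bytes.length); // 4-byte length prefix instead of writeUTF's 2-byte length
        out.write(bytes);
    }

    static Schema readSchema(DataInputStream in) throws IOException {
        byte[] bytes = new byte[in.readInt()];
        in.readFully(bytes);
        return new Schema.Parser().parse(new String(bytes, StandardCharsets.UTF_8));
    }
}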

Flink 1.12 cannot handle large schema

2020-10-01 Thread Lian Jiang
Hi,

I am using a Flink 1.12 snapshot built on my machine. My job throws an exception when calling writeUTF on a schema from the schema registry.

Caused by: java.io.UTFDataFormatException: encoded string too long: 223502 bytes
    at java.io.DataOutputStream.writeUTF(DataOutputStream.java:364)
    at java.io.DataOutpu
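For context, DataOutputStream.writeUTF stores the encoded length in an unsigned 16-bit field, so any string whose modified UTF-8 encoding exceeds 65535 bytes fails with exactly this UTFDataFormatException; a 223502-byte schema is well past that. A standalone sketch (plain JDK, not the Flink code path) that reproduces the limit:

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WriteUtfLimitDemo {
    public static void main(String[] args) throws IOException {
        // Build a string whose UTF-8 encoding exceeds writeUTF's 65535-byte limit.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 70_000; i++) {
            sb.append('x');
        }
        DataOutputStream out = new DataOutputStream(new ByteArrayOutputStream());
        // Throws java.io.UTFDataFormatException: encoded string too long: 70000 bytes
        out.writeUTF(sb.toString());
    }
}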