Thanks, Arvid, for the JIRA ticket and the workaround. I will monitor the JIRA
status and retry once the fix is available. I can also help test the fix while
it is still on a private branch. Thanks again. Regards!
On Fri, Oct 2, 2020 at 3:57 AM Arvid Heise wrote:
Hi Lian,
Thank you for reporting this. It looks like a bug to me, and I created a ticket
[1].
You have two options: wait for the fix, or implement the fix yourself (copy
AvroSerializerSnapshot and use another way to write/read the schema, along the
lines of the sketch below), then subclass AvroSerializer to use your snapshot.
Of course, we
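To make that workaround concrete, here is a minimal sketch of the changed
write/read logic, assuming you have copied AvroSerializerSnapshot into your own
package. Only the Flink and Avro types are real; LargeSchemaIO and its method
names are hypothetical. The idea is to replace writeUTF, which is capped at
65535 bytes of modified UTF-8, with a length-prefixed raw byte array:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import org.apache.avro.Schema;
import org.apache.flink.core.memory.DataInputView;
import org.apache.flink.core.memory.DataOutputView;

final class LargeSchemaIO {

    // writeUTF fails for schemas whose JSON exceeds 65535 bytes, so write the
    // schema as an int length followed by the raw UTF-8 bytes instead.
    static void writeSchema(Schema schema, DataOutputView out) throws IOException {
        byte[] bytes = schema.toString(false).getBytes(StandardCharsets.UTF_8);
        out.writeInt(bytes.length);
        out.write(bytes);
    }

    // Mirror of writeSchema: read the length prefix, then the schema bytes,
    // and re-parse the JSON into an Avro Schema.
    static Schema readSchema(DataInputView in) throws IOException {
        byte[] bytes = new byte[in.readInt()];
        in.readFully(bytes);
        return new Schema.Parser().parse(new String(bytes, StandardCharsets.UTF_8));
    }
}

A copied snapshot would call these from its writeSnapshot/readSnapshot methods,
and a subclass of AvroSerializer would then override snapshotConfiguration() to
return that copied snapshot class.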
Hi,
I am using a Flink 1.12 snapshot built on my machine. My job throws an
exception when writeUTF is called on a schema fetched from the schema registry.
Caused by: java.io.UTFDataFormatException: encoded string too long: 223502 bytes
at java.io.DataOutputStream.writeUTF(DataOutputStream.java:364)
at java.io.DataOutpu
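For context, the limit comes from the JDK rather than from Flink or the schema
registry: DataOutputStream.writeUTF encodes at most 65535 bytes of modified
UTF-8, and this schema serializes to 223502 bytes. A standalone snippet that
reproduces the exact exception (class and variable names are illustrative):

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UTFDataFormatException;
import java.util.Arrays;

public class WriteUtfLimitDemo {
    public static void main(String[] args) throws IOException {
        // Build a 223502-character ASCII string; each char is one byte in
        // modified UTF-8, matching the size reported in the stack trace.
        char[] chars = new char[223502];
        Arrays.fill(chars, 'a');
        String schemaLikeString = new String(chars);

        try (DataOutputStream out = new DataOutputStream(new ByteArrayOutputStream())) {
            out.writeUTF(schemaLikeString); // limit is 65535 encoded bytes
        } catch (UTFDataFormatException e) {
            // Prints: encoded string too long: 223502 bytes
            System.out.println(e.getMessage());
        }
    }
}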