Hi Rodrigo,
For the connectors, PyFlink just wraps the Java implementation.
I am not an expert on Avro and the corresponding connectors, but as far as
I know, DataTypes cannot declare the kind of union type you mentioned.
Regarding the bytes encoding you mentioned, I actually have no good
suggestion.
Uploading the schema through Avro(avro_schema) worked, but I had to
select one type from the union to put in Schema.field(field_type)
inside t_env.connect(). If my dict has long and double values, and I declare
Schema.field(DataTypes.DOUBLE()), all the int values are cast to double.
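For reference, this is roughly the pattern I mean (PyFlink 1.11 descriptor
API); the Kafka connector, topic, servers, field name and Avro schema below
are made-up placeholders, and the exact descriptor signatures may differ a
bit between versions, so treat it as a sketch rather than working code:

# Sketch only: connector, topic, servers, field and schema are
# hypothetical placeholders illustrating a union ["long", "double"]
# field declared with a single concrete type (DOUBLE).
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment, DataTypes
from pyflink.table.descriptors import Kafka, Avro, Schema

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

avro_schema = """
{
  "type": "record",
  "name": "Event",
  "fields": [
    {"name": "value", "type": ["long", "double"]}
  ]
}
"""

t_env.connect(Kafka()
              .version("universal")
              .topic("events")
              .property("bootstrap.servers", "localhost:9092")) \
    .with_format(Avro().avro_schema(avro_schema)) \
    .with_schema(Schema()
                 .field("value", DataTypes.DOUBLE())) \
    .create_temporary_table("events")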
Thank you, Xingbo.
I've managed to get it working by adding the Avro jar and the three artifacts
from the *com.fasterxml.jackson.core* group [1]. Is it also required to add
the jackson-mapper-asl jar? As for joda-time, I suppose it won't be
required, since I won't use date types in my Avro schema.
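In case it helps, this is roughly how I make the extra jars visible to the
Python job; pointing pipeline.jars at local files is just one way to do it,
and the paths and versions below are placeholders for my local copies:

# Rough sketch: paths and versions are placeholders, not the exact
# set of artifacts everyone will need.
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

jars = ";".join([
    "file:///path/to/flink-avro-1.11.0.jar",
    "file:///path/to/avro-1.8.2.jar",
    "file:///path/to/jackson-core-2.10.1.jar",         # com.fasterxml.jackson.core
    "file:///path/to/jackson-databind-2.10.1.jar",
    "file:///path/to/jackson-annotations-2.10.1.jar",
])
t_env.get_config().get_configuration().set_string("pipeline.jars", jars)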
Hi Rodrigo,
Flink doesn't provide an Avro uber jar, so you need to add all the dependency
jars manually, such as avro, jackson-core-asl, jackson-mapper-asl and
joda-time in release-1.11.
However, I found that a JIRA [1] from a few days ago provides a default
Avro uber jar.
[1] ht