Hi,
thanks for your feedback. I agree that the current interfaces are
not flexible enough to fit every use case. The unified connector API
is a very recent feature that still needs some polishing. I'm working
on a design document to improve the situation there.
For now, you can simply implement a small utility method that
iterates over the column names and types of the TableSchema and calls
`schema.field(name, type)` for each pair.
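Something along these lines should work as a stopgap. This is only a sketch against the 1.6 Table API: the class and method names (`SchemaUtil`, `toSchemaDescriptor`) are placeholders of mine, and I'm assuming `TableSchema#getColumnNames()` and `#getTypes()` as the accessors.

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.descriptors.Schema;

// Hypothetical helper; names are placeholders, not part of Flink.
public final class SchemaUtil {

    // Copies every column of a TableSchema into a descriptor Schema
    // by calling schema.field(name, type) per column.
    public static Schema toSchemaDescriptor(TableSchema tableSchema) {
        Schema schema = new Schema();
        String[] names = tableSchema.getColumnNames();
        TypeInformation<?>[] types = tableSchema.getTypes();
        for (int i = 0; i < names.length; i++) {
            schema.field(names[i], types[i]);
        }
        return schema;
    }
}
```

The resulting `Schema` descriptor can then be passed to `withSchema(...)` on your table descriptor.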
I hope this helps.
Regards,
Timo
On 31.08.18 at 07:40, françois lacombe wrote:
Hi all,
Today I'm looking into deriving a Schema object from an Avro schema
JSON string.
In the overview at
https://ci.apache.org/projects/flink/flink-docs-release-1.6/dev/table/connect.html
Avro is only ever used as a format, never as a schema.
This was a topic in JIRA-9813
I can get a TableSchema with
`TableSchema schema = TableSchema.fromTypeInfo(AvroSchemaConverter.convertToTypeInfo(sch_csv.toString()));`
but I can't use it with `BatchTableDescriptor.withSchema()`.
How can I get a Schema from TableSchema, TypeInformation<?>[] or even
Avro json string?
It seems a little bridge is missing between TableSchema and
org.apache.flink.table.descriptors.Schema.
Thanks in advance for any useful hints.
François