[ https://issues.apache.org/jira/browse/FLINK-9813?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16555869#comment-16555869 ]
François Lacombe commented on FLINK-9813:
-----------------------------------------

Thank you [~twalthr], I'll wait until 1.6 is released.

One last question: I see the CsvValidator class in flink.table.descriptors. What is its purpose? Can it be used to check the content of the CSV, or only the validity of the format?

> Build xTableSource from Avro schemas
> ------------------------------------
>
>                 Key: FLINK-9813
>                 URL: https://issues.apache.org/jira/browse/FLINK-9813
>             Project: Flink
>          Issue Type: Wish
>          Components: Table API & SQL
>    Affects Versions: 1.5.0
>            Reporter: François Lacombe
>            Priority: Trivial
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> As Avro provides an efficient schema formalism, it would be great to be able
> to build Flink TableSources from such files.
>
> More info about Avro schemas: https://avro.apache.org/docs/1.8.1/spec.html#schemas
>
> For instance, with CsvTableSource:
>
>     Schema.Parser schemaParser = new Schema.Parser();
>     Schema tableSchema = schemaParser.parse(new File("avro.json"));
>     CsvTableSource.Builder bld = CsvTableSource.builder().schema(tableSchema);
>
> This would give me a fully configured CsvTableSource with the columns defined in
> avro.json.
>
> It may be possible to do so for every TableSource, since the Avro format is
> really common and versatile.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
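For reference, a minimal sketch of the kind of glue the description asks for, assuming Avro 1.8 and the Flink 1.5 CsvTableSource.Builder API. The fromAvroSchema helper and its primitive-type mapping are hypothetical illustration code, not an existing Flink API; a proper .schema(...) builder method as wished above does not exist yet.

    import java.io.File;
    import java.io.IOException;

    import org.apache.avro.Schema;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.table.sources.CsvTableSource;

    public class AvroSchemaCsvSource {

        // Builds a CsvTableSource whose columns are taken from an Avro record schema file.
        public static CsvTableSource fromAvroSchema(String csvPath, File avroSchemaFile) throws IOException {
            // Assumes the file contains a record schema, so getFields() is valid.
            Schema schema = new Schema.Parser().parse(avroSchemaFile);

            CsvTableSource.Builder builder = CsvTableSource.builder().path(csvPath);
            for (Schema.Field field : schema.getFields()) {
                builder = builder.field(field.name(), toFlinkType(field.schema().getType()));
            }
            return builder.build();
        }

        // Maps a few primitive Avro types to Flink TypeInformation; unions, records,
        // arrays and logical types are left out of this sketch.
        private static TypeInformation<?> toFlinkType(Schema.Type avroType) {
            switch (avroType) {
                case STRING:  return Types.STRING;
                case INT:     return Types.INT;
                case LONG:    return Types.LONG;
                case FLOAT:   return Types.FLOAT;
                case DOUBLE:  return Types.DOUBLE;
                case BOOLEAN: return Types.BOOLEAN;
                default:
                    throw new IllegalArgumentException("Unsupported Avro type: " + avroType);
            }
        }
    }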