[ 
https://issues.apache.org/jira/browse/FLINK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aljoscha Krettek closed FLINK-9964.
-----------------------------------
      Resolution: Fixed
    Release Note: 
This release introduces a new format descriptor for CSV files that is compliant
with RFC 4180. The new descriptor is available as
`org.apache.flink.table.descriptors.Csv`. For now, this can only be used
together with the Kafka connector. The old descriptor is available as
`org.apache.flink.table.descriptors.OldCsv` for use with file system connectors.
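
For illustration, a minimal sketch of wiring the new descriptor into the Kafka
connector via the Table API; the topic, properties, and schema are placeholders,
and {{tableEnv}} is an assumed {{StreamTableEnvironment}}:

{code:java}
import org.apache.flink.table.api.Types;
import org.apache.flink.table.descriptors.Csv;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;

// Registers a Kafka-backed table source using the new RFC 4180 compliant
// CSV format. All names below are placeholders for this sketch.
tableEnv
    .connect(
        new Kafka()
            .version("universal")
            .topic("input-topic")
            .property("bootstrap.servers", "localhost:9092"))
    .withFormat(
        new Csv()                  // the new org.apache.flink.table.descriptors.Csv
            .fieldDelimiter(';')   // optional: defaults to ','
            .ignoreParseErrors())  // optional: skip rows that cannot be parsed
    .withSchema(
        new Schema()
            .field("id", Types.LONG())
            .field("name", Types.STRING()))
    .inAppendMode()
    .registerTableSource("CsvKafkaSource");
{code}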


  was:The descriptor o.a.f.table.descriptors.Csv no longer describes Flink's old 
non-standard CSV table source/sink. Instead, it can be used when writing to 
Kafka. The old descriptor is still available as 
"org.apache.flink.table.descriptors.OldCsv" for stream/batch file system 
operations.


> Add a CSV table format factory
> ------------------------------
>
>                 Key: FLINK-9964
>                 URL: https://issues.apache.org/jira/browse/FLINK-9964
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>            Reporter: Timo Walther
>            Assignee: bupt_ljy
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.8.0
>
>          Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> We should add an RFC 4180 compliant CSV table format factory to read data from 
> and write data to Kafka and other connectors. This requires a 
> {{SerializationSchemaFactory}} and a {{DeserializationSchemaFactory}}. How we 
> want to represent all data types and nested types is still up for discussion. 
> For example, we could flatten and unflatten nested types as is done 
> [here|http://support.gnip.com/articles/json2csv.html] (see the sketch below). 
> We can also look at how tools such as the Avro to CSV tool perform the 
> conversion.
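
A minimal sketch of the dot-separated flattening idea referenced above; the
class, method, and field names are illustrative only and not part of Flink's API:

{code:java}
import java.util.LinkedHashMap;
import java.util.Map;

public final class FlattenSketch {

    // Flattens nested maps into a single level using dot-separated keys,
    // e.g. {user={id=1, address={city=Berlin}}} becomes
    // {user.id=1, user.address.city=Berlin}, so each leaf maps to one CSV column.
    @SuppressWarnings("unchecked")
    static Map<String, Object> flatten(String prefix, Map<String, Object> nested) {
        Map<String, Object> flat = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : nested.entrySet()) {
            String key = prefix.isEmpty() ? e.getKey() : prefix + "." + e.getKey();
            if (e.getValue() instanceof Map) {
                flat.putAll(flatten(key, (Map<String, Object>) e.getValue()));
            } else {
                flat.put(key, e.getValue());
            }
        }
        return flat;
    }

    public static void main(String[] args) {
        Map<String, Object> address = new LinkedHashMap<>();
        address.put("city", "Berlin");
        Map<String, Object> user = new LinkedHashMap<>();
        user.put("id", 1L);
        user.put("address", address);
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("user", user);
        System.out.println(flatten("", record)); // {user.id=1, user.address.city=Berlin}
    }
}
{code}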



