Hi Timo, David

Thanks for your quick answer.

BR
Jose

On Thu, 13 Sep 2018 at 12:41, Timo Walther <twal...@apache.org> wrote:

> Hi Jose,
>
> you have to add additional Maven modules depending on the connector/format
> you want to use. See this page [1] for more information.
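>
> For example, ORC support lives in the separate flink-orc Maven module, which
> as far as I can see provides an OrcTableSource (a source, not a sink) in 1.6.
> Once that dependency is added, a rough Java sketch would be (path and ORC
> schema are placeholders):
>
>     import org.apache.flink.orc.OrcTableSource;
>
>     // assumes the flink-orc module is on the classpath and tableEnv is an
>     // existing (batch) TableEnvironment
>     OrcTableSource orcSource = OrcTableSource.builder()
>         .path("file:///path/to/data.orc")             // placeholder path
>         .forOrcSchema("struct<name:string,cnt:int>")  // placeholder ORC schema
>         .build();
>
>     tableEnv.registerTableSource("OrcTable", orcSource);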
>
> Feel free to ask further questions if the description is not enough for
> you.
>
> Regards,
> Timo
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-release-1.6/dev/table/connect.html#further-tablesources-and-tablesinks
>
>
> On 13.09.18 at 11:50, jose farfan wrote:
>
> Hi
>
> I am checking the documentation
>
>
> https://ci.apache.org/projects/flink/flink-docs-release-1.6/dev/table/common.html#register-a-tablesink
>
> Register a TableSink
>
> A registered TableSink can be used to emit the result of a Table API or
> SQL query
> <https://ci.apache.org/projects/flink/flink-docs-master/dev/table/common.html#emit-a-table>
>  to
> an external storage system, such as a database, key-value store, message
> queue, or file system (in different encodings, e.g., CSV, Apache Parquet,
> Avro, ORC, …).
>
> Flink aims to provide TableSinks for common data formats and storage
> systems. Please see the documentation about Table Sources and Sinks
> <https://ci.apache.org/projects/flink/flink-docs-master/dev/table/sourceSinks.html>
>  page
> for details about available sinks and instructions for how to implement a
> custom TableSink.
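>
> The Java example in that section boils down to roughly this (path, field
> names and types are placeholders; tableEnv is an existing TableEnvironment
> and result is a Table produced by a query):
>
>     import org.apache.flink.api.common.typeinfo.TypeInformation;
>     import org.apache.flink.api.common.typeinfo.Types;
>     import org.apache.flink.table.sinks.CsvTableSink;
>
>     // create a TableSink that writes pipe-delimited CSV files
>     CsvTableSink csvSink = new CsvTableSink("/path/to/csvOutput", "|");
>
>     // schema under which the sink is registered
>     String[] fieldNames = {"a", "b", "c"};
>     TypeInformation<?>[] fieldTypes = {Types.INT, Types.STRING, Types.LONG};
>
>     // register the TableSink as table "CsvSinkTable"
>     tableEnv.registerTableSink("CsvSinkTable", fieldNames, fieldTypes, csvSink);
>
>     // emit a query result into the registered sink
>     result.insertInto("CsvSinkTable");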
>
>
> There you can read that we can define different encodings: CSV, ORC, etc.
>
>
> But in the source code, I can only find CsvTableSink.
>
> How can I get an OrcTableSink? Do I need to extend TableSinkBase, or is
> there another place to find that implementation?
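>
> Something like the sketch below is what I have in mind: a class implementing
> AppendStreamTableSink for the streaming case (the name OrcAppendTableSink is
> hypothetical, and the writeAsText call is only a stand-in for real ORC
> writing logic):
>
>     import org.apache.flink.api.common.typeinfo.TypeInformation;
>     import org.apache.flink.api.java.typeutils.RowTypeInfo;
>     import org.apache.flink.streaming.api.datastream.DataStream;
>     import org.apache.flink.table.sinks.AppendStreamTableSink;
>     import org.apache.flink.table.sinks.TableSink;
>     import org.apache.flink.types.Row;
>
>     public class OrcAppendTableSink implements AppendStreamTableSink<Row> {
>
>         private final String path;
>         private String[] fieldNames;
>         private TypeInformation<?>[] fieldTypes;
>
>         public OrcAppendTableSink(String path) {
>             this.path = path;
>         }
>
>         @Override
>         public void emitDataStream(DataStream<Row> dataStream) {
>             // stand-in: replace with a SinkFunction that converts each Row
>             // and writes it with an ORC writer of your choice
>             dataStream.writeAsText(path);
>         }
>
>         @Override
>         public TypeInformation<Row> getOutputType() {
>             return new RowTypeInfo(fieldTypes, fieldNames);
>         }
>
>         @Override
>         public String[] getFieldNames() {
>             return fieldNames;
>         }
>
>         @Override
>         public TypeInformation<?>[] getFieldTypes() {
>             return fieldTypes;
>         }
>
>         @Override
>         public TableSink<Row> configure(String[] names, TypeInformation<?>[] types) {
>             // called by the framework with the schema of the table
>             // under which the sink is registered
>             OrcAppendTableSink copy = new OrcAppendTableSink(path);
>             copy.fieldNames = names;
>             copy.fieldTypes = types;
>             return copy;
>         }
>     }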
>
>
> BR
>
> Jose
>
