Thanks for all your input. The design document covers the use cases we have in mind, and querying external sources may also be of interest to us for other uses not mentioned in the first mail.
I will wait for developments in this direction, because the expected result seems promising. :)

Thank you again,
Simone

2016-04-21 14:41 GMT+02:00 Fabian Hueske <fhue...@gmail.com>:

> Hi Simone,
>
> In Flink 1.0.x, the Table API does not support reading external data,
> i.e., it is not possible to read a CSV file directly from the Table API.
> Tables can only be created from a DataSet or DataStream, which means that
> the data has already been converted into "Flink types".
>
> However, the Table API is currently under heavy development as part of
> the efforts to add SQL support.
> This work is taking place on the master branch, and I am currently working
> on interfaces to scan external data sets or ingest external data streams.
> The interface will be quite generic, so it should be possible to define a
> table source that reads the first lines of a file to infer attribute
> names and types.
> You can have a look at the current state of the API design here [1].
>
> Feedback is welcome and can very easily be included in this phase of the
> development ;-)
>
> Cheers,
> Fabian
>
> [1] https://docs.google.com/document/d/1sITIShmJMGegzAjGqFuwiN_iw1urwykKsLiacokxSw0
>
> 2016-04-21 14:26 GMT+02:00 Simone Robutti <simone.robu...@radicalbit.io>:
>
>> Hello,
>>
>> I would like to know if it is possible to create a Flink Table from an
>> arbitrary CSV file (or any other form of tabular data) without doing
>> type-safe parsing with explicit type classes/POJOs.
>>
>> To my knowledge this is not possible, but I would like to know if I'm
>> missing something. My requirement is to be able to read a CSV file and
>> manipulate it, reading the field names from the file and inferring the
>> data types.
>>
>> Thanks,
>>
>> Simone
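For context, the 1.0.x workflow Fabian describes looks roughly like the sketch below: the schema is declared up front as a case class (or POJO), the CSV is parsed into a typed DataSet, and only then is it converted into a Table. This is a minimal Scala sketch, assuming the 1.0.x Scala Table API's implicit conversions (toTable / toDataSet); the Person type and the file path are made-up placeholders, not part of the thread.

    import org.apache.flink.api.scala._
    import org.apache.flink.api.scala.table._

    // Hypothetical record type: the schema must be declared explicitly,
    // which is exactly the step Simone would like to avoid.
    case class Person(name: String, age: Int, city: String)

    object CsvToTable {
      def main(args: Array[String]): Unit = {
        val env = ExecutionEnvironment.getExecutionEnvironment

        // 1) Parse the CSV into a typed DataSet (explicit types required here).
        val people = env.readCsvFile[Person]("/path/to/people.csv")

        // 2) Convert the DataSet into a Table and work with named fields.
        val result = people.toTable
          .filter('age > 18)
          .select('name, 'city)

        // 3) Convert back to a typed DataSet to print or write the result.
        result.toDataSet[(String, String)].print()
      }
    }

A table source along the lines of the design in [1] would instead read the header and a sample of rows to derive the attribute names and types itself, so the explicit case class above would no longer be needed.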