Thank you all for your answers.
It's ok with BatchTableSource
All the best
François
2018-08-26 17:40 GMT+02:00 Rong Rong :
Yes you should be able to use Row instead of Tuple in your
BatchTableSink.
There are sections in the Flink documentation on mapping data types to
table schemas [1], and a Table can be converted into various typed
DataStreams [2] as well. Hope these are helpful.
Thanks,
Rong
[1]
https://ci.apache
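To make the suggestion above concrete, here is a hedged sketch of a BatchTableSink parameterized with Row rather than a TupleN, written against the Flink 1.6-era Table API; the class name and the print-based emit are illustrative only, not anything from this thread:

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.table.sinks.BatchTableSink;
import org.apache.flink.table.sinks.TableSink;
import org.apache.flink.types.Row;

// Illustrative sink: Row has no fixed arity, so it avoids the
// 25-field limit of the TupleN classes.
public class RowSinkSketch implements BatchTableSink<Row> {
    private String[] fieldNames;
    private TypeInformation<?>[] fieldTypes;

    @Override
    public void emitDataSet(DataSet<Row> dataSet) {
        try {
            dataSet.print(); // a real sink would write to an external system
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public TypeInformation<Row> getOutputType() {
        return new RowTypeInfo(fieldTypes, fieldNames);
    }

    @Override
    public String[] getFieldNames() {
        return fieldNames;
    }

    @Override
    public TypeInformation<?>[] getFieldTypes() {
        return fieldTypes;
    }

    // The Table API calls configure() with the schema of the table
    // being emitted; it should return a configured copy of the sink.
    @Override
    public TableSink<Row> configure(String[] fieldNames,
                                    TypeInformation<?>[] fieldTypes) {
        RowSinkSketch copy = new RowSinkSketch();
        copy.fieldNames = fieldNames;
        copy.fieldTypes = fieldTypes;
        return copy;
    }
}
```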
Hi Timo,
Thanks for your answer
I was looking for a Tuple to feed into a BatchTableSink subclass, but
could this be achieved with a Row instead?
All the best
François
2018-08-24 10:21 GMT+02:00 Timo Walther :
Hi,
tuples are just a subcategory of rows, because tuple arity is
limited to 25 fields. I think the easiest solution would be to write
your own converter that maps rows to tuples if you know that you will
not need more than 25 fields. Otherwise it might be easier to just use a
TextInputF
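A minimal, self-contained sketch of the converter described above. Flink's real classes are org.apache.flink.types.Row and org.apache.flink.api.java.tuple.Tuple (created via Tuple.newInstance(arity) and filled with setField); to keep this runnable without Flink on the classpath, tiny stand-ins with the same method names are defined inline:

```java
public class RowToTupleSketch {
    // Stand-in for org.apache.flink.types.Row: positional, untyped fields.
    static final class Row {
        private final Object[] fields;
        Row(Object... fields) { this.fields = fields; }
        int getArity() { return fields.length; }
        Object getField(int i) { return fields[i]; }
    }

    // Stand-in for Flink's abstract Tuple; real code would call
    // Tuple.newInstance(arity) instead of this constructor.
    static final class Tuple {
        static final int MAX_ARITY = 25; // Flink's Tuple classes stop at Tuple25
        private final Object[] fields;
        Tuple(int arity) { this.fields = new Object[arity]; }
        void setField(Object value, int pos) { fields[pos] = value; }
        Object getField(int pos) { return fields[pos]; }
        int getArity() { return fields.length; }
    }

    // The converter idea: copy each Row field into the Tuple slot with
    // the same position. Only valid while arity stays within 25 fields.
    static Tuple convert(Row row) {
        if (row.getArity() > Tuple.MAX_ARITY) {
            throw new IllegalArgumentException(
                "Row arity " + row.getArity() + " exceeds the 25-field Tuple limit");
        }
        Tuple tuple = new Tuple(row.getArity());
        for (int i = 0; i < row.getArity(); i++) {
            tuple.setField(row.getField(i), i);
        }
        return tuple;
    }

    public static void main(String[] args) {
        Tuple t = convert(new Row("alice", 42, true));
        System.out.println(t.getArity() + " fields, first = " + t.getField(0));
    }
}
```

In a real job the loop body stays the same; only the two stand-in classes are replaced by the Flink types.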
Hi all,
I'm looking for best practices regarding the creation of Tuple instances.
I have a TypeInformation object produced by
AvroSchemaConverter.convertToTypeInfo("{...}");
Is it possible to define a corresponding Tuple instance from it? (i.e.,
get the T from the TypeInformation)
Example :
{
"type": "re
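On the original question: AvroSchemaConverter.convertToTypeInfo(...) yields a RowTypeInfo for Avro record schemas, so the field names and types can be inspected at runtime even though there is no compile-time TupleN type to instantiate. A sketch using a hypothetical record schema (the real schema is elided above):

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
import org.apache.flink.types.Row;

public class AvroTypeInfoSketch {
    public static void main(String[] args) {
        // Hypothetical record schema, stand-in for the one elided in the mail.
        String schema = "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"product\",\"type\":\"string\"}]}";

        TypeInformation<Row> info = AvroSchemaConverter.convertToTypeInfo(schema);

        // For record schemas this is a RowTypeInfo, so the field layout
        // is available even without a compile-time tuple class.
        if (info instanceof RowTypeInfo) {
            RowTypeInfo rowInfo = (RowTypeInfo) info;
            for (int i = 0; i < rowInfo.getArity(); i++) {
                System.out.println(
                    rowInfo.getFieldNames()[i] + ": " + rowInfo.getTypeAt(i));
            }
        }
    }
}
```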