Thanks @Ruben for providing that information. It will be helpful for me.
It seems you have written your own InputSplit for this task (KuduInputSplit).
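A custom split like that is essentially a small serializable descriptor of
one chunk of the input. A rough sketch against Flink's
org.apache.flink.core.io.InputSplit interface could look like the following
(the class and field names are hypothetical, not taken from the Flink-Kudu
code):

import org.apache.flink.core.io.InputSplit;

// Hypothetical split describing one row range of a table; each parallel
// reader task is handed one or more of these descriptors.
public class AnalyticsTableSplit implements InputSplit {

    private static final long serialVersionUID = 1L;

    private final int splitNumber;   // index of this split
    private final String tableName;  // table this split reads from
    private final long startRow;     // first row of the chunk (inclusive)
    private final long endRow;       // last row of the chunk (exclusive)

    public AnalyticsTableSplit(int splitNumber, String tableName,
                               long startRow, long endRow) {
        this.splitNumber = splitNumber;
        this.tableName = tableName;
        this.startRow = startRow;
        this.endRow = endRow;
    }

    @Override
    public int getSplitNumber() {
        return splitNumber;
    }

    public String getTableName() { return tableName; }
    public long getStartRow() { return startRow; }
    public long getEndRow() { return endRow; }
}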
On Thu, Jan 19, 2017 at 2:58 PM, wrote:
Hi,
Just in case it could be useful, we are working on a Flink-Kudu integration [1].
It is still a work in progress, but we had to implement an InputFormat to read
from Kudu tables, so maybe the code is useful for you [2].
Best
[1] https://github.com/rubencasado/Flink-Kudu
[2] https://github.com/ru
Hi,
When we are implementing that InputFormat interface, if we already have the
input-split part in our data analytics server APIs, can we directly go to the
second phase that you described earlier?
Since our data source has a database-table architecture, I am thinking of
following the 'JDBCInputFormat' approach.
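Something along these lines is what I have in mind (a rough sketch assuming
roughly the Flink 1.2 flink-jdbc API; the driver, URL and query are
placeholders for our own data source):

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.types.Row;

public class JdbcReadJob {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Type information for the columns the query selects.
        RowTypeInfo rowType = new RowTypeInfo(
                BasicTypeInfo.INT_TYPE_INFO,
                BasicTypeInfo.STRING_TYPE_INFO);

        JDBCInputFormat jdbcInput = JDBCInputFormat.buildJDBCInputFormat()
                .setDrivername("com.mysql.jdbc.Driver")            // placeholder driver
                .setDBUrl("jdbc:mysql://localhost:3306/analytics") // placeholder URL
                .setQuery("SELECT id, name FROM analytics_table")  // placeholder query
                .setRowTypeInfo(rowType)
                .finish();

        DataSet<Row> rows = env.createInput(jdbcInput);
        rows.print(); // print() triggers execution of the batch job
    }
}

As far as I can tell, unless a parameterized query is configured,
JDBCInputFormat reads the whole result set as a single split, so its
split-generation phase is trivial in that case.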
Hi Fabian,
Thanks for providing that information.
On Mon, Jan 16, 2017 at 2:36 PM, Fabian Hueske wrote:
Hi Pawan,
this sounds like you need to implement a custom InputFormat [1].
An InputFormat is basically executed in two phases. In the first phase it
generates InputSplits. An InputSplit references a chunk of data that
needs to be read. Hence, InputSplits define how the input data is split to
be read in parallel.
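To make the two phases concrete, a minimal self-contained sketch built on
Flink's RichInputFormat could look like this; the static TABLE array just
stands in for a real data source, while the overridden methods are the
hooks a custom format implements:

import org.apache.flink.api.common.io.DefaultInputSplitAssigner;
import org.apache.flink.api.common.io.RichInputFormat;
import org.apache.flink.api.common.io.statistics.BaseStatistics;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.io.GenericInputSplit;
import org.apache.flink.core.io.InputSplitAssigner;

// Toy format that "reads" a static in-memory table; a real connector
// would open a connection to the data source instead.
public class AnalyticsTableInputFormat extends RichInputFormat<String, GenericInputSplit> {

    private static final String[] TABLE = {"row-0", "row-1", "row-2", "row-3"};

    private int current; // next row index within this task's split
    private int end;     // exclusive end index of this task's split

    @Override
    public void configure(Configuration parameters) {
        // connection settings would be read here
    }

    @Override
    public BaseStatistics getStatistics(BaseStatistics cachedStatistics) {
        return cachedStatistics; // no statistics available
    }

    // Phase 1: executed once to decide how the input is partitioned.
    @Override
    public GenericInputSplit[] createInputSplits(int minNumSplits) {
        GenericInputSplit[] splits = new GenericInputSplit[minNumSplits];
        for (int i = 0; i < minNumSplits; i++) {
            splits[i] = new GenericInputSplit(i, minNumSplits);
        }
        return splits;
    }

    @Override
    public InputSplitAssigner getInputSplitAssigner(GenericInputSplit[] splits) {
        return new DefaultInputSplitAssigner(splits);
    }

    // Phase 2: each parallel task opens its assigned split and reads
    // only the chunk of data that the split references.
    @Override
    public void open(GenericInputSplit split) {
        int chunk = (TABLE.length + split.getTotalNumberOfSplits() - 1)
                / split.getTotalNumberOfSplits();
        current = Math.min(split.getSplitNumber() * chunk, TABLE.length);
        end = Math.min(current + chunk, TABLE.length);
    }

    @Override
    public boolean reachedEnd() {
        return current >= end;
    }

    @Override
    public String nextRecord(String reuse) {
        return TABLE[current++];
    }

    @Override
    public void close() {
        // release scanners / connections here
    }
}

Such a format can then be plugged into a batch job with
ExecutionEnvironment#createInput(new AnalyticsTableInputFormat()).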
Hi,
we have a data analytics server that has analytics data tables, so I need
to write a custom *Java* implementation to read data from that data source
and do *batch* processing using Apache Flink. Basically it's like a new
client connector for Flink.
So it would be great if you can give me some guidance on how to implement this.