Hi,
Thanks a lot, Fabian and Flavio. That information was really helpful.
On Tue, Jan 24, 2017 at 3:36 PM, Flavio Pompermaier
wrote:
If the column on which you want to perform the split is numeric, you can
use NumericBetweenParametersProvider, which automatically computes the
splits for you. This is an example of its usage (in windows of 1000 items
at a time) taken from the test class *JDBCInputFormatTest*:
final in
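Since the code sample above was cut off, here is a standalone sketch (not the actual Flink test code; class and method names are invented for illustration) of the range computation that NumericBetweenParametersProvider performs: given a minimum value, a maximum value, and a fetch size, it produces one [start, end] parameter pair per split.

```java
import java.util.ArrayList;
import java.util.List;

public class NumericSplitSketch {
    // Computes [start, end] pairs covering [min, max] in chunks of fetchSize,
    // mirroring what NumericBetweenParametersProvider does for a numeric column.
    static List<long[]> computeSplits(long min, long max, long fetchSize) {
        List<long[]> splits = new ArrayList<>();
        for (long start = min; start <= max; start += fetchSize) {
            long end = Math.min(start + fetchSize - 1, max);
            splits.add(new long[] {start, end});
        }
        return splits;
    }

    public static void main(String[] args) {
        // ids 1..2500 in windows of 1000 -> [1,1000], [1001,2000], [2001,2500]
        for (long[] s : computeSplits(1, 2500, 1000)) {
            System.out.println(s[0] + " - " + s[1]);
        }
    }
}
```

Each pair then becomes the BETWEEN bounds of one parameterized query, so each parallel source instance reads a disjoint slice of the table.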
Hi,
JdbcInputFormat implements the InputFormat interface and is handled exactly
like any other InputFormat.
In contrast to file-based input formats, users must explicitly specify the
input splits by providing an array of parameter values which are injected
into a parameterized query.
This is done
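To make the "array of parameter values injected into a parameterized query" idea concrete, here is a hedged sketch (the table name, bounds, and helper are hypothetical, not Flink code): each row of the parameter matrix fills the placeholders of the query for one input split.

```java
import java.io.Serializable;

public class ParameterizedSplitsSketch {
    // Hypothetical parameter matrix: each row supplies the values for the
    // two '?' placeholders of one input split.
    static Serializable[][] buildParameters() {
        return new Serializable[][] {
            {1, 1000},
            {1001, 2000},
            {2001, 3000},
        };
    }

    public static void main(String[] args) {
        String query = "SELECT * FROM books WHERE id BETWEEN ? AND ?";
        for (Serializable[] params : buildParameters()) {
            // One parallel source instance handles one row, binding its values
            // to the query's placeholders before executing it.
            System.out.println(query + "  with params " + params[0] + ", " + params[1]);
        }
    }
}
```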
Hi,
Thanks for your help. Since our data source has a database-table
architecture, I am thinking of following the 'JDBCInputFormat' approach in
Flink. It would be great if you could provide some information on how
JDBCInputFormat execution happens.
Thanks,
Pawan
On Mon, Jan 23, 2017 at 4:18 PM, F
Hi Pawan,
I don't think this works. The InputSplits are generated by the JobManager,
i.e., not in parallel but by a single process.
After the parallel InputFormats have been started on the TaskManagers, they
request InputSplits and open() them. If there are no InputSplits there is
no work to be done an
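The lifecycle described above can be modeled with a toy sketch (this is my own analogy, not Flink's API; all names are invented): one central provider owns the split list, and each parallel reader keeps requesting splits until none remain.

```java
import java.util.Iterator;
import java.util.List;

public class SplitAssignmentSketch {
    // Toy stand-in for the JobManager's role: it owns the splits and hands
    // them out one at a time to whichever reader asks next.
    static class SplitProvider {
        private final Iterator<String> splits;
        SplitProvider(List<String> splits) { this.splits = splits.iterator(); }
        synchronized String nextSplit() { return splits.hasNext() ? splits.next() : null; }
    }

    public static void main(String[] args) {
        SplitProvider provider = new SplitProvider(List.of("split-0", "split-1", "split-2"));
        // A reader requests a split, open()s it, reads it, and asks again;
        // with no splits left, it simply has no work to do.
        String split;
        while ((split = provider.nextSplit()) != null) {
            System.out.println("open() and read " + split);
        }
    }
}
```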
Hi,
When we are implementing the Flink *InputFormat* interface, if we have the
*input split creation* part in our data analytics server APIs, can we go
directly to the second phase of the Flink InputFormat execution?
Basically, I need to know whether we can read those InputSplits directly,