Hi,
Thanks @Fabian and @Xingcan for the explanation.
@Xingcan Here I mean that I have a data analytics server that holds *data
tables*. So my initial requirement is to build a client connector for Flink
to access those *data tables*. Then I started by implementing the Flink
InputFormat interface and that was
Hi Pawan,
@Fabian was right; I thought it was a stream environment. Sorry for that.
What do you mean by `read the available records of my datasource`? How do
you implement the nextRecord() method in DASInputFormat?
Best,
Xingcan
On Wed, Mar 1, 2017 at 4:45 PM, Fabian Hueske wrote:
> Hi Pawa
Hi Pawan,
in the DataSet API, DataSet.print() will trigger the execution (you do not
need to call ExecutionEnvironment.execute()).
The DataSet will be printed on the standard out of the process that submits
the program. This only works for small DataSets.
In general print() should only be used
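To illustrate this behaviour in plain Java (the class and method names below are hypothetical stand-ins, not Flink's actual API): in this sketch, map() only records an operator, while print() behaves like execute() plus writing each result to the submitting process's standard out.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical stand-in for a lazily evaluated DataSet: map() only records
// an operator; both execute() and print() trigger the actual run, and
// print() additionally writes the result to the client's stdout.
class MiniDataSet {
    private final List<Integer> source;
    private final List<UnaryOperator<Integer>> plan = new ArrayList<>();

    MiniDataSet(List<Integer> source) {
        this.source = source;
    }

    MiniDataSet map(UnaryOperator<Integer> op) {
        plan.add(op);  // recorded in the plan, not executed yet
        return this;
    }

    List<Integer> execute() {
        List<Integer> out = new ArrayList<>();
        for (int v : source) {
            for (UnaryOperator<Integer> op : plan) {
                v = op.apply(v);
            }
            out.add(v);
        }
        return out;
    }

    // Analogous to DataSet.print(): triggers the execution itself and
    // prints on the client side -- practical for small results only.
    void print() {
        for (int v : execute()) {
            System.out.println(v);
        }
    }

    public static void main(String[] args) {
        new MiniDataSet(List.of(1, 2, 3)).map(x -> x + 1).print();
        // prints 2, 3, 4 -- no separate execute() call was needed
    }
}
```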
Hi,
So how can I read the available records of my datasource? I saw in some
examples that the print() method will print the available data of that
datasource (e.g., files).
Thanks,
Pawan
On Wed, Mar 1, 2017 at 11:30 AM, Xingcan Cui wrote:
> Hi Pawan,
>
> in Flink, most of the methods for DataSet
Hi Pawan,
in Flink, most of the methods for DataSet (including print()) will just add
operators to the plan but not really run it. If the DASInputFormat has no
error, you can run the plan by calling environment.execute().
Best,
Xingcan
On Wed, Mar 1, 2017 at 12:17 PM, Pawan Manishka Gunarathna <
Hi,
I have implemented the Flink InputFormat interface for my datasource.
It has its own data type, *Record*. So my class looks as follows:
public class DASInputFormat implements InputFormat<Record, InputSplit> {
}
So when I execute the print() method, my console shows the Flink execution,
but nothing will
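For reference, the runtime drives an InputFormat with a simple open / reachedEnd / nextRecord loop, which is also what Xingcan's nextRecord() question is about. Below is a minimal, Flink-free sketch of that contract; the SimpleInputFormat interface, the DasRecord type, and the stubbed server fetch are illustrative assumptions, not Flink's actual classes (the real interface also has configure(), createInputSplits(), and a split-aware open()).

```java
import java.util.List;

// Illustrative stand-in for the iteration part of Flink's InputFormat contract.
interface SimpleInputFormat<T> {
    void open();          // prepare to read, e.g. query the backing server
    boolean reachedEnd(); // true once all records have been emitted
    T nextRecord();       // next record, or null if the input is exhausted
    void close();         // release resources
}

// Hypothetical stand-in for the data analytics server's Record type.
class DasRecord {
    final String payload;
    DasRecord(String payload) { this.payload = payload; }
}

class DASInputFormat implements SimpleInputFormat<DasRecord> {
    private List<DasRecord> buffered;  // records fetched from the server
    private int position;

    @Override
    public void open() {
        // A real connector would fetch from the server here; stubbed out.
        buffered = List.of(new DasRecord("row-1"), new DasRecord("row-2"));
        position = 0;
    }

    @Override
    public boolean reachedEnd() {
        return position >= buffered.size();
    }

    @Override
    public DasRecord nextRecord() {
        return reachedEnd() ? null : buffered.get(position++);
    }

    @Override
    public void close() {
        buffered = null;
    }

    public static void main(String[] args) {
        // Essentially the loop the Flink runtime runs on your behalf.
        DASInputFormat format = new DASInputFormat();
        format.open();
        while (!format.reachedEnd()) {
            System.out.println(format.nextRecord().payload);
        }
        format.close();
    }
}
```

If nextRecord() never returns records (or reachedEnd() returns true immediately), the job completes with empty output, which would match the "nothing printed" symptom described above.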