Hello,
Since ExecutionEnvironment#execute() blocks until the job is finished, you should be able to just do this:
data.writeAsText("...");   // only declares the sink, nothing is written yet
env.execute();             // blocks until the file has been written
// now run the Map-Job that reads the file
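
For completeness, a rough end-to-end sketch of that pattern; the paths, the job name and the runMapJob() helper are placeholders I made up, not something from your program:

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
// preprocessing pipeline (placeholder source)
DataSet<String> data = env.readTextFile("hdfs:///input");
// declare the sink; nothing is written until execute() is called
data.writeAsText("hdfs:///preprocessed");
// blocks until the job is done and the file exists
env.execute("preprocessing");
// hypothetical helper that starts the follow-up Map-only job on the file
runMapJob("hdfs:///preprocessed");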
Note that your current solution is wrong, as it translates to this:
DataSet result = ...
result.writeAsText();
if (result.count()
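
For reference, the relevant DataSet API behaviour: writeAsText() only declares a sink, and the file is produced when the program is executed, whereas count() triggers an execution of the program on its own. A minimal sketch of the two behaviours (the path and sample data are placeholders):

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
DataSet<String> result = env.fromElements("a", "b", "c");  // stand-in for the preprocessing result
result.writeAsText("/tmp/result");   // lazy: only registers the sink
env.execute();                       // the file is actually written here
long n = result.count();             // eager: count() triggers an execution by itself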
Hi,
I have a program that does some preprocessing with Flink and at the end writes the result with result.writeAsText("...").
After that I call a method that is basically a MapReduce job (actually only a Map job) and that depends on the written file.
So what is the smartest way to deal with this?