We solved this issue (of reading the value of an accumulator) by calling a
REST endpoint after the job ends, in order to store the value associated
with the accumulator in a database.
This is awful, but I didn't find any better solution..
This is the code that runs the job (of course it's not complete).
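A minimal sketch of the idea (not our actual code), assuming a hypothetical
http://example.com/accumulators endpoint, an IntCounter named "records",
and a placeholder sink path:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import org.apache.flink.api.common.JobExecutionResult;
import org.apache.flink.api.common.accumulators.IntCounter;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;

public class AccumulatorToRest {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("a", "b", "c")
            .map(new RichMapFunction<String, String>() {
                private final IntCounter counter = new IntCounter();

                @Override
                public void open(Configuration parameters) {
                    // register the accumulator under the name "records"
                    getRuntimeContext().addAccumulator("records", counter);
                }

                @Override
                public String map(String value) {
                    counter.add(1);
                    return value;
                }
            })
            .writeAsText("/tmp/out"); // placeholder sink path

        // execute() blocks until the job finishes, so the accumulator
        // value is available on the client right after it returns
        JobExecutionResult result = env.execute("accumulator job");
        Integer records = result.getAccumulatorResult("records");

        // POST the value to a hypothetical endpoint that stores it in a database
        HttpURLConnection conn = (HttpURLConnection)
            new URL("http://example.com/accumulators").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(("records=" + records).getBytes("UTF-8"));
        }
        conn.getResponseCode(); // actually sends the request
        conn.disconnect();
    }
}

The important bit is that env.execute() blocks until the job finishes, so
getAccumulatorResult() can be called on the client right after it returns.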
Oh god, if we have some code using an Accumulator after env.execute(), will
that not be executed on the JobManager either?
Thanks, I would be interested indeed!
--
Bastien DINE
Data Architect / Software Engineer / Sysadmin
bastiendine.io
On Fri, Nov 23, 2018 at 16:37, Flavio Pompe
The problem is that the REST API blocks on env.execute().
If you want to run your Flink job you have to submit it using the CLI
client.
As a workaround we wrote a Spring REST API that, to run a job, opens an SSH
connection to the job manager and executes the bin/flink run command..
If you're interested I can share it.
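In outline it looks something like this (a rough sketch of the workaround
using the JSch SSH library, not the actual code; host, credentials, and
paths are placeholders):

import com.jcraft.jsch.ChannelExec;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SshFlinkSubmitter {

    // Runs "bin/flink run" on the job manager host over SSH.
    public static void submit(String jarPath) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession("flink", "jobmanager-host", 22);
        session.setPassword("secret"); // use key-based auth in practice
        session.setConfig("StrictHostKeyChecking", "no");
        session.connect();

        ChannelExec channel = (ChannelExec) session.openChannel("exec");
        channel.setCommand("/opt/flink/bin/flink run -d " + jarPath);
        channel.connect();

        // wait for the remote command to finish
        while (!channel.isClosed()) {
            Thread.sleep(100);
        }
        int exitCode = channel.getExitStatus();
        channel.disconnect();
        session.disconnect();

        if (exitCode != 0) {
            throw new RuntimeException("flink run exited with " + exitCode);
        }
    }
}

With the -d (detached) flag the CLI returns as soon as the job is
submitted, so the HTTP request doesn't stay blocked for the whole job
duration.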
Hello,
I need to chain processing steps in the DataSet API, so I am launching
several jobs, with multiple env.execute() calls:
topology1.define();
env.execute();
topology2.define();
env.execute();
This works fine when I run it within IntelliJ.
But when I deploy it to my cluster, it only launches the first job.
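For reference, a self-contained sketch of what I mean, with two DataSet
topologies executed back to back in the same main() (paths are
placeholders):

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class ChainedJobs {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // first topology
        DataSet<Integer> first = env.fromElements(1, 2, 3)
            .map(new MapFunction<Integer, Integer>() {
                @Override
                public Integer map(Integer value) {
                    return value * 2;
                }
            });
        first.writeAsText("/tmp/job1-output"); // placeholder path
        env.execute("job 1"); // blocks until job 1 finishes

        // second topology, defined on the same environment
        DataSet<Integer> second = env.fromElements(4, 5, 6)
            .map(new MapFunction<Integer, Integer>() {
                @Override
                public Integer map(Integer value) {
                    return value + 1;
                }
            });
        second.writeAsText("/tmp/job2-output"); // placeholder path
        env.execute("job 2"); // only reached if job 1's execute() returns
    }
}

Each env.execute() call submits everything defined since the previous call
as a separate job.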