Re: Strange exception in UnitTest when migrating Flink v1.16.2 to v1.17.1

2023-09-06 Thread Teunissen, F.G.J. (Fred) via user
…1.16 to 1.17, and that workaround resolved the test failures. Hope this helps. Regards Aniket From: Teunissen, F.G.J. (Fred) via user Sent: Wednesday, September 6, 2023 6:45 AM To: user@flink.apache.org Subject: Strange exception in UnitTest when migrating Flink v1.16.2 to v1.17.1

Strange exception in UnitTest when migrating Flink v1.16.2 to v1.17.1

2023-09-06 Thread Teunissen, F.G.J. (Fred) via user
Hi community, I would like to ask for some help in solving a strange failure in a unit test when code coverage (JaCoCo) is enabled. We have a project with a custom UDF that uses the MiniClusterExtension in a unit test. The unit test works fine when built for Flink v1.16.2, but it fails when built for Flink v1.17.1. …
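[Editorial note: the thread preview does not include the test code itself. For context, a minimal sketch of this kind of setup is shown below, assuming JUnit 5 and Flink's flink-test-utils on the classpath; the UDF name MyUpperCaseUdf and the test contents are illustrative placeholders, not the poster's actual code.]

// Minimal JUnit 5 test of a custom UDF on an embedded Flink MiniCluster.
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.runtime.testutils.MiniClusterResourceConfiguration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.test.junit5.MiniClusterExtension;
import org.apache.flink.util.CloseableIterator;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import static org.junit.jupiter.api.Assertions.assertEquals;

class MyUpperCaseUdfTest {

    // Starts a lightweight in-JVM Flink cluster shared by all tests in this class.
    @RegisterExtension
    static final MiniClusterExtension MINI_CLUSTER = new MiniClusterExtension(
            new MiniClusterResourceConfiguration.Builder()
                    .setNumberTaskManagers(1)
                    .setNumberSlotsPerTaskManager(2)
                    .build());

    // Placeholder for the poster's real custom UDF.
    static class MyUpperCaseUdf implements MapFunction<String, String> {
        @Override
        public String map(String value) {
            return value.toUpperCase();
        }
    }

    @Test
    void udfUppercasesInput() throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        List<String> results = new ArrayList<>();
        // executeAndCollect() runs the job on the MiniCluster started by the extension.
        try (CloseableIterator<String> it = env.fromElements("a", "b")
                .map(new MyUpperCaseUdf())
                .executeAndCollect()) {
            it.forEachRemaining(results::add);
        }

        assertEquals(Arrays.asList("A", "B"), results);
    }
}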

Should the StreamingFileSink mark the files "finished" when all bounded input sources are depleted?

2020-09-07 Thread Teunissen, F.G.J. (Fred)
Hi All, my Flink job uses bounded input sources and writes the results to a StreamingFileSink. When it has processed all the input, the job finishes and closes. But the output files still have their in-progress names (“…-0-0..inprogress.…”); I expected them to have the finished names (“…-0-0.…”). Did I forget some setting or something? …
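[Editorial note: the usual explanation for this behaviour is that the StreamingFileSink of that era only moved part files from in-progress to finished when a checkpoint completed, and a bounded streaming job that ends before any checkpoint fires leaves them in progress. A minimal sketch of the typical setup is shown below; the /tmp/output path and the fromElements source are illustrative stand-ins, not the poster's job.]

// Sketch: bounded input written through a row-format StreamingFileSink.
import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;

public class BoundedToFileJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Part files are only committed on checkpoints; without checkpointing,
        // a short bounded job can finish before anything is finalized.
        env.enableCheckpointing(10_000);

        StreamingFileSink<String> sink = StreamingFileSink
                .forRowFormat(new Path("/tmp/output"), new SimpleStringEncoder<String>("UTF-8"))
                .build();

        env.fromElements("a", "b", "c")   // stand-in for the bounded sources
           .addSink(sink);

        env.execute("bounded-to-file");
    }
}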

State Processor API not working with Scala based Flink Jobs

2019-10-02 Thread Teunissen, F.G.J. (Fred)
Hi All, we have built a Flink job using Scala. In one specific operator (CoProcessFunction-based) we store data in a MapState. The input streams are keyed by a value of type ‘Seq[(String, CustomClassHierarchy)]’. When I try to read a savepoint with the State Processor API I get some ‘Incompatib…’ errors. …
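[Editorial note: the preview cuts off before the full error, but reading keyed state written by a Scala job commonly runs into TypeInformation/serializer mismatches between the Scala-derived key type and what the Java-based State Processor API infers. Below is a sketch of the 1.9-era read path; the uid, state name, savepoint path, and the simplified String key (standing in for the poster's Seq[(String, CustomClassHierarchy)] key) are all placeholders.]

// Sketch: reading MapState from a savepoint with the State Processor API (DataSet-based).
import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.runtime.state.memory.MemoryStateBackend;
import org.apache.flink.state.api.ExistingSavepoint;
import org.apache.flink.state.api.Savepoint;
import org.apache.flink.state.api.functions.KeyedStateReaderFunction;
import org.apache.flink.util.Collector;

public class ReadMapStateExample {

    // Emits one record per key; "my-map-state" is a placeholder state name.
    static class MapStateReader extends KeyedStateReaderFunction<String, String> {
        private transient MapState<String, String> mapState;

        @Override
        public void open(Configuration parameters) {
            mapState = getRuntimeContext().getMapState(
                    new MapStateDescriptor<>("my-map-state", Types.STRING, Types.STRING));
        }

        @Override
        public void readKey(String key, Context ctx, Collector<String> out) throws Exception {
            out.collect(key + " -> " + mapState.entries());
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        ExistingSavepoint savepoint =
                Savepoint.load(env, "file:///tmp/savepoint-xyz", new MemoryStateBackend());

        // The explicit key/output TypeInformation must produce serializers compatible
        // with those the original (Scala) job registered for its key type; a mismatch
        // is the usual cause of "incompatible" errors when reading keyed state.
        DataSet<String> state = savepoint.readKeyedState(
                "my-operator-uid",
                new MapStateReader(),
                Types.STRING,        // placeholder; the real job keys by a Scala Seq
                Types.STRING);

        state.print();
    }
}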