Hi Filipe,

The problem you're encountering most likely stems from the fact that Flink
serializes all operators before running them in the (local) cluster. During
this process, all outside references inside your sink are lost.
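To illustrate the pitfall with plain Java serialization (this is only a sketch of the general mechanism, not Flink's actual serialization path, and `CollectingSink` is a made-up stand-in for a user-defined sink): an object reference held in a field is copied during serialization, so mutations after deserialization never reach the original, while a static field is not serialized at all and stays shared within the same JVM, which is exactly why the static-variable trick works on a local cluster.

```java
import java.io.*;
import java.util.*;

// Hypothetical stand-in for a user-defined sink that Flink would serialize
// before shipping it to the (local) cluster.
class CollectingSink implements Serializable {
    // Static field: belongs to the class, NOT the serialized object, so it
    // survives the serialize/deserialize round trip within the same JVM.
    static final List<String> results = new ArrayList<>();

    // Instance field: serialized by value, so the deserialized sink holds a
    // *copy* and no longer points at the caller's original list.
    List<String> outerReference = new ArrayList<>();

    void invoke(String value) {
        results.add(value);        // visible to the driver/test afterwards
        outerReference.add(value); // only mutates the deserialized copy
    }
}

public class SerializationPitfall {
    // Serialize and immediately deserialize, mimicking what happens when an
    // operator is shipped to a task slot.
    static CollectingSink roundTrip(CollectingSink sink) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(sink);
        oos.flush();
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        return (CollectingSink) in.readObject();
    }

    public static void main(String[] args) throws Exception {
        List<String> myLocalList = new ArrayList<>();
        CollectingSink sink = new CollectingSink();
        sink.outerReference = myLocalList;

        CollectingSink shipped = roundTrip(sink); // what the "cluster" runs
        shipped.invoke("record-1");

        // The original outside reference never sees the record...
        System.out.println(myLocalList.isEmpty());   // true
        // ...but the static field does.
        System.out.println(CollectingSink.results);  // [record-1]
    }
}
```

Note this only holds when everything runs in one JVM, which is why it works for local test clusters but not for a distributed deployment.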

The thread you've found offers two solutions for this. The first is to use
the collect sink in contrib/streaming/DataStreamUtils.java
<https://github.com/apache/flink/blob/master/flink-contrib/flink-streaming-contrib/src/main/java/org/apache/flink/contrib/streaming/DataStreamUtils.java>,
which provides you with an iterator over the results. I don't know whether
this works with the current version of Flink, or with the Scala API's
DataStream class.

The second is Nick's solution, which collects the results directly into a
static variable: https://gist.github.com/ndimiduk/5f3b4757eb772feed6e6
As you've discovered, you have to set the parallelism to 1 for this to work.
(I haven't tested this either.)

The third solution would of course be to use flink-spector, which takes care
of these issues, but I currently have no time to finish the Scala support.
You can find an example and the current state here:
https://github.com/lofifnc/flink-spector/blob/scala_api/flinkspector-datastream-scala_2.11/src/test/scala/org/flinkspector/scala/datastream/DataStreamSpec.scala
(This works, but hasn't really been tested either.)

Best,
Alex

--
View this message in context: 
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Unit-testing-support-for-flink-application-tp4130p4189.html
Sent from the Apache Flink User Mailing List archive at Nabble.com.