Hi,
I'm currently working on improving the testing process of flink streaming
applications.
I have written a test runtime that takes care of execution, collecting the
output and applying a verifier to it. The runtime is able to provide test
sources and sinks that run in parallel.
On top of that I
Hi Nick,
This is easily achievable using the framework I provide.
createDataStream(Input input) does actually return a DataStreamSource.
The call to assertStream(DataStream datastream, OutputMatcher
matcher) just attaches a TestSink to the DataStream, but you can also create
the test sink manually
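For illustration, a rough Scala sketch of the two variants. Apart from createDataStream and assertStream, which are quoted above, the package paths, the createTestSink helper, and the placeholder input/matcher are assumptions from memory and may not match the actual flink-spector API:

import org.flinkspector.core.input.Input
import org.flinkspector.core.quantify.OutputMatcher
import org.flinkspector.datastream.DataStreamTestBase
import org.junit.Test

class ManualSinkSketch extends DataStreamTestBase {

  // Placeholders: build these with flink-spector's input builders and matchers.
  def input: Input[Integer] = ???
  def matcher: OutputMatcher[Integer] = ???

  @Test
  def viaAssertStream(): Unit = {
    // createDataStream(Input) returns a DataStreamSource, so the stream can be
    // transformed like any other; assertStream then attaches the TestSink.
    val stream = createDataStream(input)
    assertStream(stream, matcher)
  }

  @Test
  def attachingTheSinkManually(): Unit = {
    val stream = createDataStream(input)
    // Hypothetical manual variant: create the test sink yourself and add it
    // like a regular sink (the helper's exact name may differ).
    stream.addSink(createTestSink(matcher))
  }
}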
Hi,
I'm getting an error when using .fromElements() of the
StreamExecutionEnvironment or ExecutionEnvironment of the Scala API:
Error:(53, 54) could not find implicit value for evidence parameter of type
org.apache.flink.api.common.typeinfo.TypeInformation[Int]
val source : DataSet[Int] = en
Thanks!
I should've mentioned that I've seen the FAQ but I didn't notice IntelliJ
deleting the import immediately.
For anyone encountering a similar behavior:
http://stackoverflow.com/questions/11154912/how-to-prevent-intellij-idea-from-deleting-unused-packages
Note that you have to uncheck "Optimize imports on the fly".
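For anyone else hitting the "could not find implicit value for evidence parameter" error: the import IntelliJ kept removing is the Scala API package object, which provides the implicit TypeInformation. A minimal example:

import org.apache.flink.api.scala._ // provides the implicit TypeInformation needed by fromElements

object FromElementsExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    // Without the wildcard import above, this line fails with:
    // "could not find implicit value for evidence parameter of type TypeInformation[Int]"
    val source: DataSet[Int] = env.fromElements(1, 2, 3)
    source.print()
  }
}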
Hi,
If you want to play with it, you can find the source and basic documentation
here: https://github.com/ottogroup/flink-spector.
The framework is feature complete for now. At the moment I'm working on
exposing some more functionality to the user, making the DSL more intuitive,
and adding ScalaTest support.
Hi,
I'm currently updating and improving the documentation[1] of flink-spector.
Regarding missing examples: I'm planning to include small examples showing
the behaviour of output matchers, as the documentation already includes
several demonstrating how to assemble test cases. Please let me know, I
DataStreamSpec: https://github.com/lofifnc/flink-spector/blob/scala_api/flinkspector-datastream-scala_2.11/src/test/scala/org/flinkspector/scala/datastream/DataStreamSpec.scala
(This works but has also not really been tested.)
Best,
Alex
Hi,
I'm trying to figure out what graph the execution plan represents when you
call env.getExecutionPlan on the StreamExecutionEnvironment. From my
understanding the StreamGraph is what you call an APIGraph, which will be
used to create the JobGraph.
So, is the ExecutionPlan a full representatio
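For context, a minimal sketch of the call I mean; getExecutionPlan dumps, as JSON, the graph the environment has assembled on the client side before anything is submitted:

import org.apache.flink.streaming.api.scala._

object PlanDump {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // A tiny pipeline, just so there is something to plan.
    env.fromElements(1, 2, 3)
      .map(_ * 2)
      .print()

    // JSON description of the graph built so far (call it before execute()).
    println(env.getExecutionPlan)
  }
}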
Hi Márton,
Thanks for your answer. But now I'm even more confused as it somehow
conflicts with the documentation. ;)
According to the wiki and the Stratosphere paper, the JobGraph will be
submitted to the JobManager, and the JobManager will then translate it into
the ExecutionGraph.
> In order to
Hi,
I have a setup where I'm feeding a rolling window with event time:
https://gist.github.com/lofifnc/dd946fef6f4b3eb25ef1 (Obviously I'm using
Flinkspector.)
The first case behaves as expected: I'm emitting three records which are all
in the time frame of the first window tri
I should add that I'm using version 0.10.1.
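For context, the kind of rolling (tumbling) event-time window I mean, sketched without the Flinkspector parts. The (name, category, value) shape matches my records, but the timestamps, key choice, and one-minute size are placeholders, and the snippet targets the current Scala API rather than 0.10.1 exactly:

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

object RollingWindowSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    // (name, category, value, timestampMillis); in the actual test the
    // timestamps and watermarks come from Flinkspector's event-time input.
    val records = env.fromElements(
      ("fritz", "arctic", 10, 15000L),
      ("grace", "arctic", 25, 90000L)
    ).assignAscendingTimestamps(_._4)

    records
      .keyBy(_._2)                 // key by the category field (placeholder choice)
      .timeWindow(Time.minutes(1)) // rolling (tumbling) one-minute event-time window
      .sum(2)                      // sum the value field per window
      .print()

    env.execute("rolling window sketch")
  }
}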
Hi,
You're right, except that ("grace", "arctic", 25) is emitted with a timestamp
of 90 seconds, along with a watermark for 90 seconds.
I followed your advice and implemented a simple window function printing the
start and end of a window along with its content. You can see that a window
from minute 1 t
completely into 5
different windows, and will thus be evaluated 5 times.
I have created another gist, where the start and end time of each window are
shown along with its content.
The single record I'm emitting into the data flow will fit into 5 different
overlapping windows.
https://gist.github.c
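For illustration, the same kind of pipeline with a sliding event-time window whose size is five times its slide shows exactly this behaviour; the concrete sizes are placeholders, not necessarily the ones from the gist:

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

object OverlappingWindowsSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    // A single record carrying a 90 second timestamp.
    val records = env.fromElements(("grace", "arctic", 25, 90000L))
      .assignAscendingTimestamps(_._4)

    records
      .keyBy(_._2)
      // Size 5 minutes, slide 1 minute: every element belongs to 5 overlapping
      // windows, so this one record is evaluated in 5 window firings.
      .timeWindow(Time.minutes(5), Time.minutes(1))
      .sum(2)
      .print()

    env.execute("overlapping windows sketch")
  }
}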
Hi,
Some shameless self promotion:
You can also check out:
https://github.com/ottogroup/flink-spector
which has the goal of removing such hurdles when testing Flink programs.
Best,
Alex
Hi,
Flinkspector is indeed a good choice to circumvent this problem as it
specifically has several mechanisms to deal with these synchronization
problems. Unfortunately, I'm still looking for a reasonable solution to
support checking of Scala types.
Maybe I will provide a version in which you can