Hi Min,

I recently published a small repository [1] containing examples of how to
test Flink applications on different levels of the testing pyramid. It also
contains one integration test, which spins up an embedded Flink cluster
[2]. Unlike your setup, this test uses dedicated testing sources/sinks. To
include your Kafka sources/sinks in the test, I suggest combining it with a
JUnit rule for Kafka (e.g. [3]). In that case your sources are not finite,
so you will need to submit the job from a separate thread and terminate it
manually once your assertions have passed.
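To illustrate that last point, here is a minimal plain-Java sketch of the
submit-and-cancel control flow, without pulling in Flink dependencies. The
class and method names are made up, and the unbounded "job" loop is just a
placeholder for the blocking env.execute() call against the embedded
cluster (which, in a real test, you would cancel via the cluster client):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

public class StreamingJobCancellationSketch {

    // Stand-in for a streaming job with unbounded (e.g. Kafka) sources:
    // it runs until told to stop, just as env.execute() would block forever.
    static void runUnboundedJob(AtomicBoolean running) {
        while (running.get()) {
            try {
                Thread.sleep(10);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }

    static boolean submitAndCancel() throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        AtomicBoolean running = new AtomicBoolean(true);

        // 1) Submit from a separate thread; with unbounded sources the job
        //    never returns on its own, so the test thread stays free.
        Future<?> jobHandle = executor.submit(() -> runUnboundedJob(running));

        // 2) Here the test would write records to Kafka, poll the sink,
        //    and assert on the results.

        // 3) Terminate the job manually once the assertions are done. In a
        //    real Flink test this would be a cancel call against the
        //    embedded cluster rather than a shared flag.
        running.set(false);
        jobHandle.get(5, TimeUnit.SECONDS); // returns promptly once cancelled
        executor.shutdown();
        return jobHandle.isDone();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("job cancelled: " + submitAndCancel());
    }
}
```

The key point is simply that the submission must not happen on the test's
main thread, because it would block there indefinitely.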

Cheers,

Konstantin

[1] https://github.com/knaufk/flink-testing-pyramid
[2]
https://github.com/knaufk/flink-testing-pyramid/blob/master/src/test/java/com/github/knaufk/testing/java/StreamingJobIntegrationTest.java
[3] https://github.com/charithe/kafka-junit



On Thu, Jun 13, 2019 at 10:24 AM <min....@ubs.com> wrote:

> Hi,
>
>
>
> I am new to Flink, at least to the testing part.
>
>
>
> We need an end-to-end integration test for a Flink job.
>
>
>
> Where can I find documentation for this?
>
>
>
> I am envisaging a test similar to that:
>
> 1) Start a local job instance in an IDE or Maven test
>
> 2) Fire event JSONs at the data source (i.e. a Kafka topic)
>
> 3) Retrieve result JSONs from the data sink (i.e. a Kafka topic or
> an Elasticsearch index)
>
> 4) Compare result JSONs with the expected ones
>
>
>
> Since our Flink job is a streaming one, how can we tear down the Flink
> job instance running in an IDE?
>
>
>
> Regards,
>
>
>
> Min
>
>
>


-- 

Konstantin Knauf | Solutions Architect

+49 160 91394525


Planned Absences: 20. - 21.06.2019, 10.08.2019 - 31.08.2019, 05.09. -
06.09.2019


--

Data Artisans GmbH | Invalidenstrasse 115, 10115 Berlin, Germany

--
Data Artisans GmbH
Registered at Amtsgericht Charlottenburg: HRB 158244 B
Managing Directors: Dr. Kostas Tzoumas, Dr. Stephan Ewen
