Hi Niels,

Sorry for the late response.

You can launch a Kafka broker inside the JVM and use it for testing purposes.
Flink's Kafka connector uses this approach extensively for its integration
tests. Here is the code that starts the Kafka server:
https://github.com/apache/flink/blob/770f2f83a81b2810aff171b2f56390ef686f725a/flink-streaming-connectors/flink-connector-kafka-0.8/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaTestEnvironmentImpl.java#L358
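
In case it helps, here is a rough, untested sketch of what starting such an
embedded broker can look like (assuming the Kafka 0.8 APIs and Curator's
TestingServer for ZooKeeper; the ports and directories are just placeholders):

import java.io.File;
import java.util.Properties;

import org.apache.curator.test.TestingServer;

import kafka.server.KafkaConfig;
import kafka.server.KafkaServer;
import kafka.utils.Time;

public class EmbeddedKafka {

    public static void main(String[] args) throws Exception {
        // The 0.8 broker needs ZooKeeper; Curator's TestingServer embeds one.
        TestingServer zookeeper = new TestingServer(2181, new File("/tmp/zk"));

        Properties props = new Properties();
        props.setProperty("broker.id", "0");
        props.setProperty("host.name", "localhost");
        props.setProperty("port", "9092");
        props.setProperty("log.dir", "/tmp/kafka-logs");
        props.setProperty("zookeeper.connect", zookeeper.getConnectString());

        // kafka.utils.Time is a Scala trait; a trivial Java implementation
        // is enough (Flink's tests use a similar helper class).
        Time time = new Time() {
            public long milliseconds() { return System.currentTimeMillis(); }
            public long nanoseconds()  { return System.nanoTime(); }
            public void sleep(long ms) {
                try { Thread.sleep(ms); } catch (InterruptedException ignored) { }
            }
        };

        KafkaServer broker = new KafkaServer(new KafkaConfig(props), time);
        broker.startup();

        // ... produce the test messages, run the job, assert on the output ...

        broker.shutdown();
        zookeeper.stop();
    }
}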


In our tests we also start a Flink mini cluster to submit the test jobs to.
Here is an example:
https://github.com/apache/flink/blob/f46ca39188dce1764ee6615eb6697588fdc04a2a/flink-streaming-connectors/flink-connector-kafka-base/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaConsumerTestBase.java#L1408
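
As a simpler starting point than the full mini cluster, you can also run the
job in a local environment inside the same JVM. A rough, untested sketch
(assuming the Flink 1.1.x APIs with the 0.8 Kafka connector; the topic name,
ports and group id are placeholders for whatever your embedded broker uses):

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaFlowTest {

    public static void main(String[] args) throws Exception {
        // Embedded Flink environment in the same JVM; no external cluster.
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.createLocalEnvironment(1);

        // Point the consumer at the embedded broker and ZooKeeper started
        // for the test.
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("zookeeper.connect", "localhost:2181"); // needed by the 0.8 consumer
        props.setProperty("group.id", "flow-test");

        DataStream<String> input = env.addSource(
            new FlinkKafkaConsumer08<>("test-input", new SimpleStringSchema(), props));

        // Plug in your application's real processing here and assert on the
        // output, e.g. via a sink that collects the records.
        input.print();

        env.execute("Kafka flow test");
    }
}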


I hope this points you to the right pieces of code to copy for your purposes.

Regards,
Robert


On Fri, Oct 21, 2016 at 4:00 PM, Niels Basjes <ni...@basjes.nl> wrote:

> Hi,
>
> In addition to having unit tests for the individual components (map,
> flatmap, reduce, etc) of my application I would like to write unit tests
> for the entire flow of my Flink application.
>
> My application reads from Kafka, does various processing, and writes
> output to both Kafka and files.
>
> This means I need a controlled (mock) Kafka broker into which I can insert
> a specific sequence of messages, so that my application can read from it
> and then write the output somewhere else.
>
> What is the recommended way of doing this?
> What tools/example projects are available for building such tests?
>
> Thanks.
>
> --
> Best regards / Met vriendelijke groeten,
>
> Niels Basjes
>
