Hi Aissa,

Flink supports reading from multiple sources in one job. You have to call
`StreamExecutionEnvironment.addSource()` multiple times, each with the
respective `SourceFunction`. Flink does not come with a ready-made MongoDB
connector. However, there is a project which attempted to implement one
[1]. You might be able to take that work and wrap it in a
`SourceFunction`.

[1]
https://ci.apache.org/projects/flink/flink-docs-stable/dev/batch/connectors.html#access-mongodb
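To make the idea concrete, here is a minimal sketch of a job with two sources, assuming a recent Flink (1.10-ish) with the Kafka connector on the classpath. The `MongoSource` class is a hypothetical stub showing where the converted connector code would go; the topic name, broker address, and class names are placeholders, not part of any existing connector.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class TwoSourcesJob {

    // Hypothetical MongoDB source: the run() body would use a MongoDB
    // client to read documents and emit them via ctx.collect(...).
    public static class MongoSource implements SourceFunction<String> {
        private volatile boolean running = true;

        @Override
        public void run(SourceContext<String> ctx) throws Exception {
            while (running) {
                // ctx.collect(documentAsString); // emit enrichment records here
            }
        }

        @Override
        public void cancel() {
            running = false;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka:9092");
        props.setProperty("group.id", "sensors");

        // First source: real-time sensor messages from Kafka.
        DataStream<String> sensors = env.addSource(
                new FlinkKafkaConsumer<>("sensor-topic",
                        new SimpleStringSchema(), props));

        // Second source: enrichment data from MongoDB.
        DataStream<String> enrichment = env.addSource(new MongoSource());

        // Connect the two streams, e.g. to enrich sensor events in a
        // CoProcessFunction keyed by sensor id.
        sensors.connect(enrichment); // further processing goes here

        env.execute("two-sources-job");
    }
}
```

For the enrichment itself you would typically key both streams by the sensor id and join them in a `CoProcessFunction` or use broadcast state if the MongoDB data is small and slowly changing.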

Cheers,
Till

On Wed, May 27, 2020 at 11:17 AM Aissa Elaffani <aissaelaff...@gmail.com>
wrote:

> Hello everyone,
> I hope you are all doing well. I am reading from a Kafka topic some
> real-time messages produced by some sensors, and in order to do some
> aggregations, I need to enrich the stream with other data that is stored
> in a MongoDB. So, I want to know if it is possible to work with two
> sources in one job? If yes, how to do so?
> Best,
> Aissa
>
