Thanks for sharing ~ that's great!



Original Message
Sender: Wouter Zorgdrager w.d.zorgdra...@tudelft.nl
Recipient: hai...@magicsoho.com
Cc: user u...@flink.apache.org
Date: Monday, Apr 29, 2019 20:05
Subject: Re: Read mongo datasource in Flink


For a framework I'm working on, we actually implemented a (basic) Mongo source 
[1]. It's written in Scala and uses Json4s [2] to parse the data into a case 
class. It uses a Mongo observer to iterate over a collection and emit each 
document into the Flink context.
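
If it helps, here is a minimal sketch of that pattern as a Flink SourceFunction. 
It is not the codefeedr implementation: it uses the synchronous MongoDB Java 
driver instead of the observer-based Scala driver, and the case class, URI, and 
field names are purely illustrative.

import com.mongodb.client.MongoClients
import org.apache.flink.streaming.api.functions.source.SourceFunction
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods.parse

// Illustrative schema; adapt the fields to your collection's documents.
case class Event(id: String, value: Double)

class SimpleMongoSource(uri: String, db: String, coll: String)
    extends SourceFunction[Event] {

  @volatile private var running = true

  override def run(ctx: SourceFunction.SourceContext[Event]): Unit = {
    implicit val formats: DefaultFormats.type = DefaultFormats
    val client = MongoClients.create(uri) // e.g. "mongodb://localhost:27017"
    try {
      val cursor = client.getDatabase(db).getCollection(coll).find().iterator()
      // Iterate over the collection and emit each document into the Flink context.
      while (running && cursor.hasNext) {
        val json = cursor.next().toJson()
        ctx.collect(parse(json).extract[Event])
      }
    } finally {
      client.close()
    }
  }

  override def cancel(): Unit = running = false
}

Note this only does a one-shot scan of the collection; a production streaming 
source would also need to handle reconnects and, for continuous data, something 
like change streams or a tailable cursor.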


Cheers,
Wouter



[1]: https://github.com/codefeedr/codefeedr/blob/develop/codefeedr-plugins/codefeedr-mongodb/src/main/scala/org/codefeedr/plugins/mongodb/BaseMongoSource.scala
[2]: http://json4s.org/


On Mon, Apr 29, 2019 at 1:57 PM Flavio Pompermaier pomperma...@okkam.it wrote:

I'm not aware of an official source/sink... if you want, you could try to exploit 
the Mongo HadoopInputFormat as in [1].
The linked project uses a pretty old version of Flink, but it shouldn't be a big 
problem to update the Maven dependencies and the code to a newer version.
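
For illustration, a rough sketch of that approach, wrapping the mongo-hadoop 
MongoInputFormat in Flink's Hadoop compatibility layer (the URI and the final 
mapping are placeholders; see [1] for the full setup):

import com.mongodb.hadoop.MongoInputFormat
import org.apache.flink.api.scala._
import org.apache.flink.api.scala.hadoop.mapreduce.HadoopInputFormat
import org.apache.hadoop.mapreduce.Job
import org.bson.BSONObject

object MongoHadoopJob {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // mongo-hadoop reads its connection settings from the Hadoop configuration.
    val job = Job.getInstance()
    job.getConfiguration.set("mongo.input.uri",
      "mongodb://localhost:27017/mydb.mycollection")

    // Wrap the Hadoop MongoInputFormat so Flink can use it as a DataSet source.
    val input = new HadoopInputFormat[Object, BSONObject](
      new MongoInputFormat, classOf[Object], classOf[BSONObject], job)

    // Each record is a (key, value) pair: the document _id and the BSON document.
    val docs: DataSet[(Object, BSONObject)] = env.createInput(input)
    docs.map(_._2.toString).print()
  }
}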


Best,
Flavio



[1]: https://github.com/okkam-it/flink-mongodb-test


On Mon, Apr 29, 2019 at 6:15 AM Hai h...@magicsoho.com wrote:

Hi,


Can anyone give me a clue about how to read MongoDB data as a batch/streaming 
data source in Flink? I can't find a MongoDB connector in the recent release 
version.


Many thanks
