[ https://issues.apache.org/jira/browse/FLINK-6573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17443278#comment-17443278 ]
Arvid Heise commented on FLINK-6573:
------------------------------------

I have assigned the ticket to you. Please note that we are currently transitioning connectors from Flink's main repo into https://github.com/apache/flink-connectors . I can't give you an ETA for when that repo will be ready, but we can use your contribution as the litmus test. The advantage of this repo is that your Mongo connector would be available as soon as it's done, and users wouldn't need to wait for 1.15.

Please also note that we only accept new connectors written against the unified [source|https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/dev/datastream/sources/] and [sink|https://cwiki.apache.org/confluence/display/FLINK/FLIP-143%3A+Unified+Sink+API] interfaces.

> Flink MongoDB Connector
> -----------------------
>
>                 Key: FLINK-6573
>                 URL: https://issues.apache.org/jira/browse/FLINK-6573
>             Project: Flink
>          Issue Type: New Feature
>          Components: Connectors / Common
>    Affects Versions: 1.2.0
>        Environment: Linux Operating System, Mongo DB
>           Reporter: Nagamallikarjuna
>           Assignee: ZhuoYu Chen
>           Priority: Not a Priority
>             Labels: stale-assigned
>  Original Estimate: 672h
>  Remaining Estimate: 672h
>
> Hi Community,
>
> Currently we are using Flink in our project. We have a huge amount of data to process with Flink, and that data resides in MongoDB. We need parallel data connectivity between Flink and MongoDB for both reads and writes. We are planning to create this connector and contribute it to the community.
>
> I will share further details once I receive your feedback. Please let us know if you have any concerns.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)
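Editor's note: the comment above requires new connectors to be built on the unified Source/Sink interfaces. Below is a minimal, hypothetical sketch of what a MongoDB sink could look like against the sink2 interfaces that later shipped with Flink 1.15+ (FLIP-143's successor), using the MongoDB Java driver. The class name MongoDbSink, the constructor parameters, and the at-least-once insert-per-record strategy are illustrative assumptions, not the design of the eventual official connector; the source side would analogously implement the FLIP-27 Source interface.

{code:java}
import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.api.connector.sink2.SinkWriter;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.io.IOException;

/** Hypothetical sketch of a MongoDB sink on the unified Sink API (not the official connector). */
public class MongoDbSink implements Sink<Document> {

    private final String connectionUri;   // e.g. "mongodb://localhost:27017"
    private final String database;
    private final String collection;

    public MongoDbSink(String connectionUri, String database, String collection) {
        this.connectionUri = connectionUri;
        this.database = database;
        this.collection = collection;
    }

    @Override
    public SinkWriter<Document> createWriter(InitContext context) throws IOException {
        return new MongoDbSinkWriter(connectionUri, database, collection);
    }

    /** Writer that inserts each incoming record into the configured collection (at-least-once). */
    private static class MongoDbSinkWriter implements SinkWriter<Document> {

        private final MongoClient client;
        private final MongoCollection<Document> coll;

        MongoDbSinkWriter(String uri, String database, String collection) {
            this.client = MongoClients.create(uri);
            this.coll = client.getDatabase(database).getCollection(collection);
        }

        @Override
        public void write(Document element, Context context) throws IOException {
            coll.insertOne(element);   // a production writer would batch writes instead
        }

        @Override
        public void flush(boolean endOfInput) throws IOException {
            // nothing is buffered in this sketch; a real writer would flush its batch here
        }

        @Override
        public void close() throws Exception {
            client.close();
        }
    }
}
{code}

A job would then attach the sink with something like {{stream.sinkTo(new MongoDbSink("mongodb://localhost:27017", "mydb", "mycollection"))}}; exactly-once delivery would additionally require a committer or idempotent upserts, which this sketch does not attempt.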