[ 
https://issues.apache.org/jira/browse/FLINK-6573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17443565#comment-17443565
 ] 

ZhuoYu Chen edited comment on FLINK-6573 at 11/15/21, 6:43 AM:
---------------------------------------------------------------

[~arvid] Thank you for the reminder. I have completed most of the work, but a 
few items are still outstanding (a rough sketch follows below the attached 
screenshot):
1. When the data written to MongoDB contains an array, MongoDB throws an 
exception; the array has to be converted to a List before the sink writes it.
2. The source needs a way to specify the primary key.
3. The code still needs to be reorganized to meet the Flink PR guidelines.
!image-2021-11-15-14-41-07-514.png!
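
Point 1 above is essentially an array-to-List conversion problem. Below is a 
minimal sketch of how a sink could normalize values before inserting them; it 
is not the code from the screenshot, and the class name (SimpleMongoSink) and 
its constructor parameters are placeholders of my own, not the actual design:

{code:java}
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.bson.Document;

import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

/** Hypothetical sink that converts Java arrays to Lists before inserting into MongoDB. */
public class SimpleMongoSink extends RichSinkFunction<Map<String, Object>> {

    private final String uri;            // e.g. "mongodb://localhost:27017"
    private final String database;
    private final String collectionName;

    private transient MongoClient client;
    private transient MongoCollection<Document> collection;

    public SimpleMongoSink(String uri, String database, String collectionName) {
        this.uri = uri;
        this.database = database;
        this.collectionName = collectionName;
    }

    @Override
    public void open(Configuration parameters) {
        client = MongoClients.create(uri);
        collection = client.getDatabase(database).getCollection(collectionName);
    }

    @Override
    public void invoke(Map<String, Object> value, Context context) {
        Document doc = new Document();
        value.forEach((k, v) -> doc.append(k, normalize(v)));
        collection.insertOne(doc);
    }

    /** The default codec registry has no codec for raw Java object arrays
     *  (the exception mentioned in point 1), so turn them into Lists. */
    private static Object normalize(Object v) {
        if (v instanceof Object[]) {
            return Arrays.stream((Object[]) v)
                    .map(SimpleMongoSink::normalize)
                    .collect(Collectors.toList());
        }
        return v;
    }

    @Override
    public void close() {
        if (client != null) {
            client.close();
        }
    }
}
{code}

A real connector would of course also batch writes, handle primitive arrays, 
and cover the primary-key handling from point 2, which this sketch skips.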



> Flink MongoDB Connector
> -----------------------
>
>                 Key: FLINK-6573
>                 URL: https://issues.apache.org/jira/browse/FLINK-6573
>             Project: Flink
>          Issue Type: New Feature
>          Components: Connectors / Common
>    Affects Versions: 1.2.0
>         Environment: Linux Operating System, Mongo DB
>            Reporter: Nagamallikarjuna
>            Assignee: ZhuoYu Chen
>            Priority: Not a Priority
>              Labels: stale-assigned
>         Attachments: image-2021-11-15-14-41-07-514.png
>
>   Original Estimate: 672h
>  Remaining Estimate: 672h
>
> Hi Community,
> We are currently using Flink in our project, and we have a huge amount of 
> data residing in MongoDB that we need to process with Flink. We require 
> parallel data connectivity between Flink and MongoDB for both reads and 
> writes. We are planning to create this connector and contribute it to the 
> community.
> I will share further details once I receive your feedback.
> Please let us know if you have any concerns.
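
Regarding the parallel read requirement described above, a very rough sketch 
of how each parallel subtask could read its own share of a collection is shown 
below. The class name (SimpleMongoSource) and the hash-based split are 
illustrative assumptions only; a proper connector would push the split into 
the MongoDB query (for example by _id ranges) rather than scan the whole 
collection in every subtask:

{code:java}
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction;
import org.bson.Document;

/** Hypothetical bounded source: every parallel subtask scans the collection
 *  and emits only the documents whose _id hashes to its subtask index. */
public class SimpleMongoSource extends RichParallelSourceFunction<Document> {

    private final String uri;
    private final String database;
    private final String collectionName;

    private transient MongoClient client;
    private volatile boolean running = true;

    public SimpleMongoSource(String uri, String database, String collectionName) {
        this.uri = uri;
        this.database = database;
        this.collectionName = collectionName;
    }

    @Override
    public void open(Configuration parameters) {
        client = MongoClients.create(uri);
    }

    @Override
    public void run(SourceContext<Document> ctx) {
        MongoCollection<Document> collection =
                client.getDatabase(database).getCollection(collectionName);
        int subtask = getRuntimeContext().getIndexOfThisSubtask();
        int parallelism = getRuntimeContext().getNumberOfParallelSubtasks();

        try (MongoCursor<Document> cursor = collection.find().iterator()) {
            while (running && cursor.hasNext()) {
                Document doc = cursor.next();
                // Naive split: each subtask keeps only "its" documents.
                if (Math.floorMod(doc.get("_id").hashCode(), parallelism) == subtask) {
                    synchronized (ctx.getCheckpointLock()) {
                        ctx.collect(doc);
                    }
                }
            }
        }
    }

    @Override
    public void cancel() {
        running = false;
    }

    @Override
    public void close() {
        if (client != null) {
            client.close();
        }
    }
}
{code}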



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
