[
https://issues.apache.org/jira/browse/FLINK-38782?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18042999#comment-18042999
]
Xiqian Yu commented on FLINK-38782:
-----------------------------------
Hi [~Joekwal], thanks for reporting this issue. Could you please share your
connector configuration? Also, did this exception occur during the snapshot
stage or the incremental streaming stage? How large is the offending document
in MongoDB?
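The document-size question above can be checked directly against MongoDB. Below is a minimal sketch, assuming the MongoDB Java sync driver (4.x) and a placeholder database/collection ({{app.orders}}), that uses the {{$bsonSize}} aggregation operator (available since MongoDB 4.4) to list documents approaching the 16 MB BSON limit; the connection string, names, and threshold are illustrative only.
{code:java}
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.util.Arrays;

public class FindOversizedDocuments {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            // "app" / "orders" are placeholder names; point this at the collection
            // the CDC source is capturing.
            MongoCollection<Document> coll =
                    client.getDatabase("app").getCollection("orders");

            // Project each document's BSON size, keep only documents above ~15 MB
            // (the ones most likely to blow past the 16 MB limit once they are
            // wrapped in a change-stream event), and sort largest first.
            coll.aggregate(Arrays.asList(
                    new Document("$project",
                            new Document("size", new Document("$bsonSize", "$$ROOT"))),
                    new Document("$match",
                            new Document("size", new Document("$gt", 15 * 1024 * 1024))),
                    new Document("$sort", new Document("size", -1))
            )).forEach(doc -> System.out.println(doc.toJson()));
        }
    }
}
{code}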
> MongoDB CDC needs a config to filter oversized columns
> ------------------------------------------------------
>
> Key: FLINK-38782
> URL: https://issues.apache.org/jira/browse/FLINK-38782
> Project: Flink
> Issue Type: Improvement
> Components: Flink CDC
> Affects Versions: cdc-3.5.0
> Reporter: Joekwal
> Priority: Major
> Fix For: cdc-3.5.0
>
>
> An error occurred:
> {color:#ff3333}com.mongodb.MongoQueryException: Query failed with error code
> 10334 and error message 'Executor error during getMore :: caused by ::
> BSONObj size: 27661090 (0x1A61322) is invalid. Size must be between 0 and
> 16793600(16MB) First element: _id: { _data: "xxx" }' on server{color}
> I've solved the source-side problem in MongoDB, but it would help if the
> connector offered a config to skip oversized columns.
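For reference, a minimal sketch of how the incremental MongoDB source is typically built with the Flink CDC 3.x DataStream API, with a commented-out, purely hypothetical option marking where the oversized-column filter requested by this ticket could plug in. Host, credentials, database, and collection names are placeholders, the {{skipFieldsLargerThan}} option does not exist in the current connector, and package names may differ in older releases (com.ververica.cdc.* instead of org.apache.flink.cdc.*).
{code:java}
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.cdc.connectors.mongodb.source.MongoDBSource;
import org.apache.flink.cdc.debezium.JsonDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MongoCdcLargeFieldSketch {
    public static void main(String[] args) throws Exception {
        MongoDBSource<String> source = MongoDBSource.<String>builder()
                .hosts("localhost:27017")            // placeholder host
                .databaseList("app")                 // placeholder database
                .collectionList("app.orders")        // placeholder collection
                .username("flink")
                .password("secret")
                .deserializer(new JsonDebeziumDeserializationSchema())
                // Hypothetical knob this ticket asks for (NOT an existing option):
                // skip or truncate any field whose BSON size exceeds a threshold,
                // so one oversized column does not fail the whole change stream.
                // .skipFieldsLargerThan("1mb")
                .build();

        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "MongoDB CDC Source")
                .print();
        env.execute("mongodb-cdc-large-field-sketch");
    }
}
{code}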
--
This message was sent by Atlassian Jira
(v8.20.10#820010)