[ https://issues.apache.org/jira/browse/FLINK-38782?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18043026#comment-18043026 ]
Joekwal commented on FLINK-38782:
---------------------------------
Well, it occurred during the incremental streaming stage. The document is 27 MB
in size, as mentioned in the exception.
Config:
'hosts' = '${ext_source.mongodb.aiagent.hosts}',
'collection' = 'aiagent.agent_runs_[0-9]{4}_[0-9]{2}',
'password' = '${ext_source.mongodb.aiagent.password}',
'database' = 'aiagent',
'connector' = 'mongodb-cdc',
'username' = '${ext_source.mongodb.aiagent.username}',
'connection.options' = '${ext_source.mongodb.aiagent.connection_options}'
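For reference, a minimal sketch of the complete DDL this fragment belongs to;
the table name and column list are placeholders, not taken from the actual job:
{code:sql}
-- Placeholder schema; the real job captures the aiagent.agent_runs_YYYY_MM collections.
CREATE TABLE agent_runs (
  _id STRING,
  payload STRING,
  PRIMARY KEY (_id) NOT ENFORCED  -- mongodb-cdc expects _id as a non-enforced primary key
) WITH (
  'connector' = 'mongodb-cdc',
  'hosts' = '${ext_source.mongodb.aiagent.hosts}',
  'username' = '${ext_source.mongodb.aiagent.username}',
  'password' = '${ext_source.mongodb.aiagent.password}',
  'database' = 'aiagent',
  'collection' = 'aiagent.agent_runs_[0-9]{4}_[0-9]{2}',
  'connection.options' = '${ext_source.mongodb.aiagent.connection_options}'
);
{code}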
> MongoDB CDC needs a config option to filter oversized columns
> -------------------------------------------------------------
>
> Key: FLINK-38782
> URL: https://issues.apache.org/jira/browse/FLINK-38782
> Project: Flink
> Issue Type: Improvement
> Components: Flink CDC
> Affects Versions: cdc-3.5.0
> Reporter: Joekwal
> Priority: Major
> Fix For: cdc-3.5.0
>
>
> An error occurred:
> com.mongodb.MongoQueryException: Query failed with error code
> 10334 and error message 'Executor error during getMore :: caused by ::
> BSONObj size: 27661090 (0x1A61322) is invalid. Size must be between 0 and
> 16793600(16MB) First element: _id: { _data: "xxx" }' on server
> I've solved the source problem in MongoDB, but the developers may need to add
> a config option to skip oversized columns.
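> A rough sketch of what such an option could look like; the name
> 'scan.document.max-size.bytes' is invented here purely for illustration and
> is not an existing mongodb-cdc option:
> {code:sql}
> -- Hypothetical addition to the WITH clause of the DDL shown above:
> -- documents (or fields) whose BSON size exceeds the limit would be
> -- skipped instead of failing the change stream with error 10334.
> 'scan.document.max-size.bytes' = '16793600'
> {code}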
--
This message was sent by Atlassian Jira
(v8.20.10#820010)