[ https://issues.apache.org/jira/browse/SPARK-51706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17940549#comment-17940549 ]
Subham Singhal commented on SPARK-51706:
----------------------------------------

I think adding writeStream would work. Resolving this issue.

> Ratelimit kafka publish in spark
> --------------------------------
>
>                 Key: SPARK-51706
>                 URL: https://issues.apache.org/jira/browse/SPARK-51706
>             Project: Spark
>          Issue Type: Improvement
>          Components: Structured Streaming
>    Affects Versions: 3.5.5
>            Reporter: Subham Singhal
>            Priority: Minor
>
> Currently there is no rate limiting applied on Kafka publishes. When reading
> from a streaming source we can control the fetch size and write smaller
> batches to Kafka, but if we are reading in batch mode and the batch is huge,
> we end up publishing too many messages to Kafka at once, which may cause
> problems on the broker side. It would be better to have a rate-limiting
> feature on the Kafka publisher.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
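The workaround suggested in the comment (routing the data through writeStream so each micro-batch to Kafka is bounded, instead of one huge batch write) could look roughly like the sketch below. This is a non-authoritative sketch, not the resolution adopted in the issue: it assumes a file-based (Parquet) input, Spark 3.3+ for Trigger.AvailableNow(), and placeholder paths, schema, topic, and broker addresses.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger
import org.apache.spark.sql.types._

val spark = SparkSession.builder().appName("RateLimitedKafkaPublish").getOrCreate()

// File streaming sources require an explicit schema; this one is hypothetical.
val inputSchema = StructType(Seq(
  StructField("key", StringType),
  StructField("value", StringType)))

// Instead of spark.read followed by one huge batch write to Kafka,
// read the same data as a stream so each micro-batch is bounded in size.
val df = spark.readStream
  .format("parquet")
  .option("maxFilesPerTrigger", 1)   // cap how much data enters each micro-batch
  .schema(inputSchema)
  .load("/path/to/input")            // placeholder path

df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
  .writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")    // placeholder broker
  .option("topic", "events")                           // placeholder topic
  .option("checkpointLocation", "/path/to/checkpoint") // required by the Kafka sink
  .trigger(Trigger.AvailableNow())   // drain all available input in bounded batches, then stop
  .start()
  .awaitTermination()
```

If the micro-batches still arrive at Kafka too quickly, Trigger.ProcessingTime("30 seconds") can be used instead of Trigger.AvailableNow() to space them out in time as well as in size.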