[ 
https://issues.apache.org/jira/browse/SPARK-51706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Subham Singhal reopened SPARK-51706:
------------------------------------

I'm not sure whether writeStream would work to limit the number of records written per second.
I wanted to understand whether this is the right way to do flow control.
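One way this could be done today, without any new Spark feature, is to throttle inside a `foreachBatch` sink. The sketch below is illustrative only: `TokenBucket` and `publish_throttled` are hypothetical helper names, and `send` stands in for an actual Kafka producer call (e.g. `producer.send`); it is not a Spark or Kafka API.

```python
# Hypothetical sketch: cap records/sec from a foreachBatch sink with a
# token bucket. TokenBucket and publish_throttled are illustrative names,
# not Spark or Kafka APIs.
import time


class TokenBucket:
    """Allow up to `rate` records per second, refilling continuously."""

    def __init__(self, rate):
        self.rate = float(rate)
        self.tokens = float(rate)          # start with a full bucket
        self.last = time.monotonic()

    def acquire(self, n=1):
        while True:
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at `rate`.
            self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= n:
                self.tokens -= n
                return
            # Sleep just long enough for the missing tokens to accumulate.
            time.sleep((n - self.tokens) / self.rate)


def publish_throttled(records, send, bucket):
    """Publish each record via `send`, pacing calls through the bucket."""
    for rec in records:
        bucket.acquire()
        send(rec)
```

Inside `foreachBatch` one would iterate the micro-batch's rows and call `publish_throttled` with a real producer; the bucket then bounds the publish rate per task regardless of batch size.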

> Ratelimit kafka publish in spark
> --------------------------------
>
>                 Key: SPARK-51706
>                 URL: https://issues.apache.org/jira/browse/SPARK-51706
>             Project: Spark
>          Issue Type: Improvement
>          Components: Structured Streaming
>    Affects Versions: 3.5.5
>            Reporter: Subham Singhal
>            Priority: Minor
>
> Currently, no rate limiting is applied to Kafka publishes. When reading from
> a streaming source we can control fetch sizes and write smaller batches to
> Kafka, but when reading in batch mode with a huge batch size we end up
> publishing too many messages to Kafka at once, which can cause problems on
> the brokers. It would be better to have a rate-limiting feature on the Kafka
> publisher.
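As an interim workaround (not part of this proposal), Kafka itself supports broker-side producer quotas, which throttle producers regardless of what Spark does. A sketch using the stock `kafka-configs.sh` tool; the broker address and the 1 MiB/s figure are illustrative assumptions:

```shell
# Throttle producers that have no explicit client-level quota to ~1 MiB/s.
# localhost:9092 and the byte rate are placeholders for a real deployment.
kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --add-config 'producer_byte_rate=1048576' \
  --entity-type clients --entity-default
```

A broker-enforced quota makes the producer back off (via produce-request throttling) once it exceeds the byte rate, which addresses the "too many messages at once" failure mode without client changes.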



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
