See the backpressure and rate-limit settings in the Spark configuration docs:
http://spark.apache.org/docs/latest/configuration.html

"This rate is upper bounded by the values
spark.streaming.receiver.maxRate and
spark.streaming.kafka.maxRatePerPartition if they are set (see
below)."
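
As a sketch (values and jar name below are illustrative, not from the docs): `spark.streaming.kafka.maxRatePerPartition` caps records per partition per second for the direct Kafka stream, so each batch holds at most roughly rate × batch interval × number of partitions records. `spark.streaming.receiver.maxRate` is the equivalent cap for receiver-based streams. A possible spark-submit invocation:

```shell
# Cap each Kafka partition at 10,000 records/sec. With a 5 s batch
# interval and, say, 8 partitions, one batch then holds at most
# 10,000 * 5 * 8 = 400,000 records instead of the whole backlog.
spark-submit \
  --conf spark.streaming.backpressure.enabled=true \
  --conf spark.streaming.receiver.maxRate=10000 \
  --conf spark.streaming.kafka.maxRatePerPartition=10000 \
  your-streaming-app.jar   # hypothetical application jar
```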

On Tue, Oct 11, 2016 at 10:57 AM, Samy Dindane <s...@dindane.com> wrote:
> Hi,
>
> Is it possible to limit the size of the batches returned by the Kafka
> consumer for Spark Streaming?
> I am asking because the first batch I get has hundreds of millions of
> records, and it takes ages to process and checkpoint them.
>
> Thank you.
>
> Samy
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
