It will add scheduling delay for the new batch. The new batch will only be
processed after the previous batch finishes, so the delay accumulates; when
the delay grows too large, you can sometimes see fetch failures because the
batch data may have been evicted from memory by then.
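To make the accumulation concrete, here is a minimal sketch (plain Python, not Spark code; the function name and timing model are illustrative assumptions) of how scheduling delay grows when processing time exceeds the batch interval:

```python
# Illustrative simulation (assumption: simplified model, not Spark internals).
# Each batch becomes ready every `batch_interval` seconds but cannot start
# until the previous batch has finished processing, so the scheduling delay
# grows whenever processing_time > batch_interval.

def scheduling_delays(batch_interval, processing_time, num_batches):
    delays = []
    finish = 0.0  # time the previous batch finished processing
    for i in range(num_batches):
        arrival = i * batch_interval   # when this batch's data is ready
        start = max(arrival, finish)   # must wait for the prior batch
        delays.append(start - arrival) # scheduling delay for this batch
        finish = start + processing_time
    return delays

print(scheduling_delays(1.0, 1.5, 5))
# delay grows by 0.5s per batch: [0.0, 0.5, 1.0, 1.5, 2.0]
```

With a 1s interval and 1.5s processing time, each batch waits 0.5s longer than the one before it; the backlog grows without bound, which is why the guide recommends keeping processing time below the batch interval.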

Thanks
Best Regards

On Wed, Apr 1, 2015 at 11:35 AM, <luohui20...@sina.com> wrote:

> hi guys:
>
>           I got a question when reading
> http://spark.apache.org/docs/latest/streaming-programming-guide.html#setting-the-right-batch-interval
> .
>
>
>
>          What will happen to the streaming data if the batch processing
> time is bigger than the batch interval? Will the next batch be delayed,
> or will the unfinished processing job be discarded?
>
>
>
>         Thanks for any ideas shared.
>
> --------------------------------
>
> Thanks & Best regards!
> 罗辉 San.Luo
>
