By default only a single job runs at a time (you can also configure
spark.streaming.concurrentJobs to run jobs in parallel, but that is not
recommended in production).
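A minimal sketch of where that setting goes (the app name and the value 2 are illustrative; the default for spark.streaming.concurrentJobs is 1, i.e. one job at a time):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Illustrative config: allow two streaming jobs to run concurrently.
// As noted above, raising this beyond 1 is not recommended in production.
val conf = new SparkConf()
  .setAppName("concurrent-jobs-sketch")
  .set("spark.streaming.concurrentJobs", "2")

// 1-second batch interval, matching the scenario in the question below.
val ssc = new StreamingContext(conf, Seconds(1))
```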

Now, with your batch duration being 1 sec and processing time being 2
minutes: if you are using receiver-based streaming, the receivers will keep
receiving data while the job is running. That data accumulates in memory,
and if you set the StorageLevel to MEMORY_ONLY you can end up with
block-not-found exceptions, because Spark drops blocks that are not yet
processed to make room for new ones. If you are using a non-receiver-based
(direct) approach, you will not have this problem of dropped blocks.
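To make the two approaches concrete, here is a sketch using the Kafka integration as an example (topic names, the consumer group, and the ZooKeeper/broker addresses are placeholders, and `ssc` is assumed to be an existing StreamingContext):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.kafka.KafkaUtils

// Receiver-based: a long-running receiver buffers blocks between batches.
// With MEMORY_ONLY, unprocessed blocks can be dropped under memory
// pressure; a replicated, disk-backed level is the safer choice.
val receiverStream = KafkaUtils.createStream(
  ssc, "zookeeper:2181", "consumer-group", Map("mytopic" -> 1),
  StorageLevel.MEMORY_AND_DISK_SER_2)

// Direct (non-receiver): each batch reads its own offset range from
// Kafka, so no blocks accumulate between batches and none get dropped.
val directStream = KafkaUtils.createDirectStream[
    String, String, StringDecoder, StringDecoder](
  ssc, Map("metadata.broker.list" -> "broker:9092"), Set("mytopic"))
```

In the direct case a slow job just means later offset ranges wait in the queue; in the receiver case data keeps arriving regardless of whether jobs keep up.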

Ideally, if your data is small and you have enough memory to hold it, then
it will run smoothly without any issues.

Thanks
Best Regards

On Tue, May 19, 2015 at 1:23 PM, Shushant Arora <shushantaror...@gmail.com>
wrote:

> What happens if, in a streaming application, one job is not yet finished
> and the stream interval is reached? Does it start the next job, or wait
> for the first to finish while the rest of the jobs keep accumulating in a
> queue?
>
>
> Say I have a streaming application with stream interval of 1 sec, but my
> job takes 2 min to process 1 sec stream , what will happen ?  At any time
> there will be only one job running or multiple ?
>
>
