Yes, your interpretation is correct. For every batch there will be two
jobs, and they will run one after the other.
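
To make that concrete, here is a minimal sketch of the setup you describe
(the socket source, app name, and 10-second batch interval are just
assumptions for illustration, not taken from your mail). Two output
operations on the same DStream become two jobs per batch, and with the
default spark.streaming.concurrentJobs = 1 the JobScheduler's
single-threaded pool runs them sequentially:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("TwoJobsPerBatch")
    // Undocumented knob: raising this lets the JobScheduler run a batch's
    // jobs in parallel instead of one after the other. Use with care.
    // conf.set("spark.streaming.concurrentJobs", "2")

    val ssc = new StreamingContext(conf, Seconds(10))
    val viewDStream = ssc.socketTextStream("localhost", 9999)

    // Two output operations on the same DStream => two jobs per batch.
    viewDStream.foreachRDD(rdd => rdd.collect())   // job 1
    viewDStream.foreachRDD(rdd => rdd.collect())   // job 2

    ssc.start()
    ssc.awaitTermination()

Collecting every batch to the driver here just mirrors your pseudo-code;
in practice you would normally process the RDD inside foreachRDD instead.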

TD


On Fri, Jul 18, 2014 at 3:06 AM, Haopu Wang <hw...@qilinsoft.com> wrote:

>  By looking at the code of JobScheduler, I find a parameter of below:
>
>   private val numConcurrentJobs = ssc.conf.getInt("spark.streaming.concurrentJobs", 1)
>   private val jobExecutor = Executors.newFixedThreadPool(numConcurrentJobs)
>
> Does that mean each App can have only one active stage?
>
> In my pseudo-code below:
>
>          S1 = viewDStream.foreachRDD(rdd => rdd.collect())
>
>          S2 = viewDStream.foreachRDD(rdd => rdd.collect())
>
> There should be two “collect()” jobs for each batch interval, right? Are
> they running in parallel?
>
> Thank you!
>
