Thanks Shixiong, I'll give it a try and report back
Cheers
On 26 Jan 2016 6:10 p.m., "Shixiong(Ryan) Zhu" wrote:
The number of concurrent Streaming jobs is controlled by
"spark.streaming.concurrentJobs". It's 1 by default. However, you need to
keep in mind that setting it to a bigger number will allow jobs of several
batches to run at the same time. It's hard to predict the behavior and it
may sometimes surprise you.
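A minimal sketch of where that setting goes (the app name, master, batch
interval, and the value 2 are illustrative):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Allow jobs from up to 2 batches to run at the same time (default is 1).
val conf = new SparkConf()
  .setMaster("local[2]")            // illustrative; usually set via spark-submit
  .setAppName("ConcurrentJobsDemo") // illustrative name
  .set("spark.streaming.concurrentJobs", "2")
val ssc = new StreamingContext(conf, Seconds(10))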
Hi,
I'm trying to get *FAIR* scheduling to work in a Spark Streaming app
(1.6.0).
I've found a previous mailing list thread where it is suggested to do:
dstream.foreachRDD { rdd =>
  rdd.sparkContext.setLocalProperty("spark.scheduler.pool", "pool1") // set the pool
  rdd.count() // or whatever job
}
This
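Note that setLocalProperty only selects a pool; FAIR mode itself has to be
enabled and the pool declared in an allocation file. A sketch, assuming an
illustrative file path:

import org.apache.spark.SparkConf

// Enable the FAIR scheduler and point Spark at a pool definition file.
// The path is illustrative; fairscheduler.xml would declare "pool1", e.g.
//   <pool name="pool1"><schedulingMode>FAIR</schedulingMode></pool>
val conf = new SparkConf()
  .set("spark.scheduler.mode", "FAIR")
  .set("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml")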