On Fri, Apr 24, 2015 at 4:56 PM, Laeeq Ahmed wrote:
> Thanks Dragos,
>
> Earlier test shows spark.streaming.concurrentJobs has worked.
>
Glad to hear it worked!
iulian
>
> Regards,
> Laeeq
>
>
>
>
> On Friday, April 24, 2015 11:58 AM, Iulian Dragoș <
> iulian.dra...@typesafe.com> wrote:
>
>
It looks like you’re creating 23 actions in your job (one per DStream). As
far as I know, by default Spark Streaming executes only one job at a time,
so your 23 actions are executed one after the other. Try setting
spark.streaming.concurrentJobs to something higher than one.
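For illustration, a minimal sketch of how that property can be set when
building the streaming context; the app name, batch interval, and the
value 23 are placeholders, not tested settings:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Allow up to 23 streaming jobs to be scheduled concurrently instead
    // of the default 1 (undocumented property; tune the value with care).
    val conf = new SparkConf()
      .setAppName("MultiTopicStreaming") // placeholder app name
      .set("spark.streaming.concurrentJobs", "23")
    val ssc = new StreamingContext(conf, Seconds(10))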
iulian
You can probably try the Low Level Consumer from spark-packages
(http://spark-packages.org/package/dibbhatt/kafka-spark-consumer).
How many partitions are there for your topics? Say you have 10 topics,
each with 3 partitions; ideally you can create at most 30 parallel
Receivers and 30 streams.
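The low-level consumer has its own API, so purely as an illustration,
here is a rough sketch of the same fan-out using the built-in high-level
receiver from spark-streaming-kafka instead; the ZooKeeper quorum, group
id, and topic names are all placeholders:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.kafka.KafkaUtils
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("ParallelKafkaReceivers")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Placeholder connection details and topic names.
    val zkQuorum = "zkhost:2181"
    val groupId = "my-consumer-group"
    val topics = (1 to 10).map(i => s"topic$i")

    // One receiver per topic partition: 10 topics x 3 partitions = 30 receivers.
    val streams = for {
      topic <- topics
      _     <- 1 to 3
    } yield KafkaUtils.createStream(ssc, zkQuorum, groupId, Map(topic -> 1))

    // Union the receivers so downstream processing sees a single DStream.
    val unified = ssc.union(streams)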
Hi,
Any comments, please?
Regards,
Laeeq
On Friday, April 17, 2015 11:37 AM, Laeeq Ahmed
wrote:
Hi,
I am working with multiple Kafka streams (23 in total) and currently
process them separately, receiving one stream from each topic. I have the
following questions.
1. Spark