>> sc.set("spark.streaming.concurrentJobs", "2")
>>
>> Refer to TD's answer here
>> <http://stackoverflow.com/questions/23528006/how-jobs-are-assigned-to-executors-in-spark-streaming#answers-header>
>> for more information.
>>
>>
>> Thanks
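For context, that property is normally set on the SparkConf before the
StreamingContext is created. A minimal sketch, assuming the app name and
batch interval (both are illustrative, not from the original thread):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Allow up to 2 streaming output jobs to run concurrently (default is 1).
    val conf = new SparkConf()
      .setAppName("ConcurrentJobsExample") // assumed app name
      .set("spark.streaming.concurrentJobs", "2")
    val ssc = new StreamingContext(conf, Seconds(10)) // assumed batch interval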
Hi,
I have a use case wherein I have to join multiple Kafka topics in
parallel. So if there are 2n topics, there is a one-to-one mapping
between them, and each pair of topics needs to be joined.
val arr = ...
for (condition) {
  val dStream1 = KafkaUtils.createDirectStream[String, String,
    StringDecoder, StringDecoder](...)
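The fragment above breaks off in the archive. Below is a minimal,
self-contained sketch of the pattern it appears to describe: one direct
stream per topic in each pair, joined pairwise inside the loop. The broker
address, batch interval, and the contents of topicPairs are assumptions
(the real pairs would come from the questioner's arr), not part of the
original message:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    val conf = new SparkConf()
      .setAppName("PairwiseTopicJoin") // assumed app name
      .set("spark.streaming.concurrentJobs", "4") // per the quoted advice above
    val ssc = new StreamingContext(conf, Seconds(10)) // assumed batch interval

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092") // assumed broker

    // The one-to-one mapping of topics to join; illustrative values only.
    val topicPairs = Seq(("topicA1", "topicB1"), ("topicA2", "topicB2"))

    for ((left, right) <- topicPairs) {
      val dStream1 = KafkaUtils.createDirectStream[String, String,
        StringDecoder, StringDecoder](ssc, kafkaParams, Set(left))
      val dStream2 = KafkaUtils.createDirectStream[String, String,
        StringDecoder, StringDecoder](ssc, kafkaParams, Set(right))
      // Join the paired streams on their Kafka record keys.
      dStream1.join(dStream2).print()
    }

    ssc.start()
    ssc.awaitTermination()

Note that all n joins are registered on a single StreamingContext; raising
spark.streaming.concurrentJobs, as suggested in the quoted reply, is what
lets the resulting output jobs run in parallel rather than one at a time.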