Thanks,
Udbhav

*From:* ayan guha [mailto:guha.a...@gmail.com]
*Sent:* Friday, September 16, 2016 3:01 AM
*To:* Udbhav Agarwal
*Cc:* user
*Subject:* RE: Spark processing Multiple Streams from a single stream

You may consider writing back to Kafka from the main stream and then have
downstream consumers.
This will keep things modular and independent.
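
A minimal sketch of that write-back pattern, assuming Spark Streaming and
the plain kafka-clients producer; the broker address, the output topic,
and the DStream[String] element type are placeholders rather than details
from this thread:

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.streaming.dstream.DStream

    // Push each processed record to an intermediate topic; downstream
    // jobs then consume that topic independently of the main stream.
    def writeBackToKafka(processed: DStream[String], outTopic: String): Unit = {
      processed.foreachRDD { rdd =>
        rdd.foreachPartition { records =>
          val props = new Properties()
          props.put("bootstrap.servers", "localhost:9092") // assumed broker
          props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer")
          props.put("value.serializer",
            "org.apache.kafka.common.serialization.StringSerializer")
          // One producer per partition keeps the sketch simple; pooling
          // or broadcasting the producer is the usual optimization.
          val producer = new KafkaProducer[String, String](props)
          records.foreach { r =>
            producer.send(new ProducerRecord[String, String](outTopic, r))
          }
          producer.close()
        }
      }
    }

Each downstream consumer then subscribes to the intermediate topic under
its own group.id, so the pipelines stay decoupled from the main job.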

On 15 Sep 2016 23:29, "Udbhav Agarwal" wrote:

So the source has to be something like kafka etc.? Source cannot be some
rdd in spark or some external file?

Thanks,
Udbhav
*From:* ayan guha [mailto:guha.a...@gmail.com]
*Sent:* Thursday, September 15, 2016 6:43 PM
*To:* Udbhav Agarwal
*Cc:* user
*Subject:* Re: Spark processing Multiple Streams from a single stream

Depending on the source. For example, if the source is Kafka then you can
write 4 streaming consumers.
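
A minimal sketch of the four-consumer idea, assuming Spark 2.x with the
spark-streaming-kafka-0-10 integration; the broker address, the topic
name, the batch interval, and the per-pipeline processing are all
placeholders:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    object FourConsumers {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("four-consumers"), Seconds(5))

        def params(groupId: String) = Map[String, Object](
          "bootstrap.servers"  -> "localhost:9092", // assumed broker
          "key.deserializer"   -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id"           -> groupId,
          "auto.offset.reset"  -> "latest")

        // A distinct group.id per stream means each of the four streams
        // independently receives every message on the topic.
        (1 to 4).foreach { i =>
          val stream = KafkaUtils.createDirectStream[String, String](
            ssc, PreferConsistent,
            Subscribe[String, String](Seq("events"), params(s"group-$i")))
          stream.map(_.value).foreachRDD { rdd =>
            // stand-in for the i-th way of processing the same message
            println(s"pipeline $i saw ${rdd.count()} records")
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }

Because each stream has its own consumer group, all four see every
message and can process it differently, in parallel.
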
On 15 Sep 2016 20:11, "Udbhav Agarwal" wrote:

Hi All,

I have a scenario where I want to process a message in various ways in
parallel. For instance a message is coming inside spark stream (DStream