RE: Spark processing Multiple Streams from a single stream

2016-09-16 Thread ayan guha
> … source like Kafka etc.? Source cannot be some RDD in Spark or some external file?
>
> Thanks,
> Udbhav
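
For context, Spark Streaming does ship with non-Kafka inputs, for example a monitored directory of files or a queue of existing RDDs. A minimal sketch in Scala, assuming the plain StreamingContext API; the directory path and the RDD contents are made up for illustration:

    import scala.collection.mutable
    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object NonKafkaSources {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("non-kafka-sources").setMaster("local[*]")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // External files: every file dropped into this directory becomes a batch.
        val fileStream = ssc.textFileStream("/tmp/incoming")   // illustrative path
        fileStream.foreachRDD(rdd => println(s"file batch: ${rdd.count()} lines"))

        // Existing RDDs: feed a queue of RDDs into the stream, one per batch.
        val rddQueue = mutable.Queue[RDD[String]]()
        rddQueue += ssc.sparkContext.parallelize(Seq("a", "b", "c"))
        val rddStream = ssc.queueStream(rddQueue)
        rddStream.foreachRDD(rdd => println(s"queue batch: ${rdd.count()} records"))

        ssc.start()
        ssc.awaitTermination()
      }
    }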

RE: Spark processing Multiple Streams from a single stream

2016-09-15 Thread Udbhav Agarwal
ayan guha wrote:
> You may consider writing back to Kafka from main stream and then have downstream consumers. This will keep things modular and independent.
>
> On 15 Sep 2016 23:29, "Udbhav Agarwal" wrote: …
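
A rough sketch of that "write back to Kafka" pattern in Scala, assuming the Kafka new-producer API; the broker address and the "enriched-messages" topic name are made up for illustration. The main stream does the shared work once and republishes, and each downstream consumer then subscribes to the intermediate topic as its own independent job:

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.streaming.dstream.DStream

    object WriteBackToKafka {
      // One producer per partition; the broker list is illustrative.
      private def newProducer(): KafkaProducer[String, String] = {
        val props = new Properties()
        props.put("bootstrap.servers", "broker1:9092")
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        new KafkaProducer[String, String](props)
      }

      // Publish the processed main stream to an intermediate topic.
      def publish(mainStream: DStream[String], topic: String = "enriched-messages"): Unit = {
        mainStream.foreachRDD { rdd =>
          rdd.foreachPartition { records =>
            val producer = newProducer()   // created on the executor, not the driver
            records.foreach(msg => producer.send(new ProducerRecord[String, String](topic, msg)))
            producer.close()
          }
        }
      }
    }

Creating the producer inside foreachPartition keeps it on the executors rather than serializing it from the driver; pooling or lazily reusing a producer per executor would be the usual refinement.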

RE: Spark processing Multiple Streams from a single stream

2016-09-15 Thread ayan guha
> From ayan guha, Thursday, September 15, 2016 6:43 PM:
> Depending on source. For example, if source is Kafka then you can write 4 streaming consumers.

RE: Spark processing Multiple Streams from a single stream

2016-09-15 Thread Udbhav Agarwal
…?

Thanks,
Udbhav

On Thursday, September 15, 2016 6:43 PM, ayan guha wrote:
> Depending on source. For example, if source is Kafka then you can write 4 streaming consumers.

Re: Spark processing Multiple Streams from a single stream

2016-09-15 Thread ayan guha
Depending on source. For example, if source is Kafka then you can write 4 streaming consumers.

On 15 Sep 2016 20:11, "Udbhav Agarwal" wrote:
> Hi All,
>
> I have a scenario where I want to process a message in various ways in parallel. For instance a message is coming inside spark stream (DStream) …
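
A minimal sketch of the "several streaming consumers" idea in Scala, assuming the spark-streaming-kafka 0.8 direct API; the broker, topic, and the four example transformations are illustrative, and the four paths could just as well be four separate applications:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.dstream.DStream
    import org.apache.spark.streaming.kafka.KafkaUtils

    object MultiConsumerExample {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(new SparkConf().setAppName("multi-consumer"), Seconds(5))
        val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")  // illustrative
        val topics      = Set("incoming-messages")                       // illustrative

        // One direct stream per processing path; each tracks its own offsets.
        def newStream(): DStream[String] =
          KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
            ssc, kafkaParams, topics).map(_._2)

        // Four independent ways of handling the same messages.
        newStream().foreachRDD(rdd => rdd.foreach(m => println(s"path 1: $m")))  // runs on executors
        newStream().filter(_.nonEmpty).foreachRDD(rdd => println(s"path 2: ${rdd.count()} non-empty"))
        newStream().map(_.toUpperCase).print()
        newStream().window(Seconds(30)).count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

Because the direct API does not use a consumer group, each of the four streams reads the full topic independently, so a slow path does not hold back the others.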