That depends on the source. For example, if the source is Kafka, you can write 4 streaming consumers, one per task.

On 15 Sep 2016 20:11, "Udbhav Agarwal" <udbhav.agar...@syncoms.com> wrote:
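Within a single Spark Streaming application you can also fan one DStream out to several independent transformations; each output operation registers its own job, so all of them run on every batch. A minimal Scala sketch (the socket source on localhost:9999 and the four sample transformations are illustrative assumptions, not from the thread):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object FanOutExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FanOutExample")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Assumed source for illustration; substitute your Kafka input stream.
    val lines = ssc.socketTextStream("localhost", 9999)

    // Four independent transformations on the same DStream.
    // Each print() is an output operation, so all four branches
    // stay active and process every incoming batch in parallel.
    lines.map(_.toUpperCase).print()
    lines.filter(_.contains("error")).print()
    lines.flatMap(_.split(" ")).print()
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

If the branches are expensive, consider calling `lines.cache()` so the source batch is not recomputed for each of the four transformations.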
> Hi All,
>
> I have a scenario where I want to process a message in various ways in
> parallel. For instance, a message arrives in a Spark stream (DStream) and
> I want to send it to 4 different tasks in parallel. I want these 4 tasks
> to be separate streams within the original Spark stream that are always
> active and waiting for input. Can I implement such a process with Spark
> Streaming? How?
>
> Thanks in advance.
>
> Thanks,
> Udbhav Agarwal