If you have enough cores/resources, run them as separate applications;
whether that is worthwhile depends on your use case.
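
Each application submitted with spark-submit gets its own driver and set of
executors, so the cluster manager only needs enough total capacity for all
of them. A rough sketch of running two streaming apps side by side (the jar
name, topic names, and resource numbers below are illustrative, not from
this thread):

    # One streaming app per Kafka topic, each with its own resource slice
    spark-submit --master yarn --deploy-mode cluster \
      --num-executors 4 --executor-cores 2 --executor-memory 4g \
      my-streaming-app.jar orders-topic

    spark-submit --master yarn --deploy-mode cluster \
      --num-executors 2 --executor-cores 2 --executor-memory 2g \
      my-streaming-app.jar clicks-topic

The upside of splitting is isolation: one app can be restarted, scaled, or
fail without touching the others, at the cost of some per-app driver
overhead.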

On Thursday 15 December 2016, Divya Gehlot <divya.htco...@gmail.com> wrote:

> It depends on the use case.
> Spark always depends on resource availability.
> As long as you have the resources to accommodate them, you can run as many
> Spark / Spark Streaming applications as you like.
>
>
> Thanks,
> Divya
>
> On 15 December 2016 at 08:42, shyla deshpande <deshpandesh...@gmail.com>
> wrote:
>
>> How many Spark streaming applications can be run at a time on a Spark
>> cluster?
>>
>> Is it better to have 1 spark streaming application to consume all the
>> Kafka topics or have multiple streaming applications when possible to keep
>> it simple?
>>
>> Thanks
>>
>>
>
