You can try running Spark on Mesos or YARN, since they have much better support
for scheduling and sharing cluster resources across applications.
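As a rough sketch (the master URL, class name, and jar are placeholders, not from your setup), you can cap a standalone app's share with spark.cores.max, while on YARN each app requests its own executors and the YARN scheduler arbitrates between concurrent apps:

```shell
# Standalone mode: cap this app so other apps can get cores too.
# (spark://master:7077, com.example.StreamingToHBase, and app.jar are hypothetical.)
spark-submit \
  --master spark://master:7077 \
  --conf spark.cores.max=2 \
  --class com.example.StreamingToHBase \
  app.jar

# YARN mode: per-app resources are explicit, and YARN's fair or capacity
# scheduler shares the cluster among several running applications.
spark-submit \
  --master yarn-cluster \
  --num-executors 2 \
  --executor-cores 1 \
  --class com.example.StreamingToHBase \
  app.jar
```

With YARN's fair scheduler, a lone application can grab the whole cluster and then give resources back as more applications arrive, which is closer to the behavior you're after.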

Thanks
Best Regards

On Thu, Sep 25, 2014 at 4:50 AM, Subacini B <subac...@gmail.com> wrote:

> hi All,
>
> How to run concurrently multiple requests on same cluster.
>
> I have a program using a *Spark Streaming context* which reads *streaming
> data* and writes it to HBase. It works fine; the problem is that when multiple
> requests are submitted to the cluster, only the first request is processed,
> because the entire cluster is used for that request. The rest of the requests
> wait.
>
> I have set spark.cores.max to 2 or less so that another request can be
> processed, but if there is only one request the cluster is not utilized fully.
>
> Is there any way for the Spark cluster to process streaming requests
> concurrently while effectively utilizing the cluster, something like the
> Shark server?
>
> Thanks
> Subacini
>
