Simply configure the spark.cores.max property in the SparkConf you use to
create the Spark context in your application.
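
For illustration, a rough sketch of how that property could be set when
building the streaming context for your Kafka-to-HDFS job (the host names,
topic, batch interval, and core cap below are placeholders, not from your
setup):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    // Cap this application at 4 of the 6 cores so Shark can use the rest
    // (numbers and names here are placeholders for your cluster).
    val conf = new SparkConf()
      .setAppName("KafkaToHdfs")
      .set("spark.cores.max", "4")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Placeholder ZooKeeper quorum, consumer group, and topic map.
    val messages = KafkaUtils.createStream(
      ssc, "zk-host:2181", "kafka-to-hdfs-group", Map("events" -> 1))

    // Minor processing on the message values, then persist to HDFS.
    messages.map(_._2)
      .filter(_.nonEmpty)
      .saveAsTextFiles("hdfs://namenode:8020/data/events/batch")

    ssc.start()
    ssc.awaitTermination()

With the cap in place, the remaining cores stay free for Shark queries on
the same cluster.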

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Mon, May 19, 2014 at 11:13 PM, anishs...@yahoo.co.in <
anishs...@yahoo.co.in> wrote:

> Hi All
>
> I am new to Spark. I was trying to use Spark Streaming and Shark at the
> same time.
>
> I was receiving messages from Kafka and pushing them to HDFS after minor
> processing.
>
> It was working fine, but it was taking all the CPUs, and at the same time
> on another terminal I tried to access Shark, but it kept waiting until I
> stopped the listener.
>
> On the web console it showed that all 6 CPUs were taken by the Spark
> Streaming listener and Shark had zero CPUs.
>
> (I have 3 node test cluster)
>
> Please suggest
>
> Thanks & regards
> --
> Anish Sneh
> http://in.linkedin.com/in/anishsneh
>
