My bad,

Adding the following to zeppelin-env.sh did work for me:

    export ZEPPELIN_INTP_JAVA_OPTS="-Dspark.executor.instances=100"
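For the archives, the relevant part of my zeppelin-env.sh looks roughly like the
sketch below; the MASTER line and the spark.executor.cores property are
assumptions about a typical Spark-on-YARN setup, not something I can confirm
you need:

    # zeppelin-env.sh -- minimal sketch, adjust values for your cluster
    # Point the Spark interpreter at YARN (deploy mode is an assumption)
    export MASTER=yarn-client
    # Any spark.* property can be passed to the interpreter JVM this way;
    # 100 executors is what my job needed, 4 cores per executor is illustrative
    export ZEPPELIN_INTP_JAVA_OPTS="-Dspark.executor.instances=100 -Dspark.executor.cores=4"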


From: Sambit Tripathy (RBEI/EDS1) [mailto:sambit.tripa...@in.bosch.com]
Sent: Monday, July 06, 2015 2:48 PM
To: users@zeppelin.incubator.apache.org
Subject: Spark jobs on YARN are using 3 virtual cores

Hi,

I upgraded to Spark 1.3.1, and after running a job on the interpreter I see
that only 3 virtual cores are used, even after specifying the number of cores
in zeppelin-env.sh via "export ZEPPELIN_JAVA_OPTS".
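For reference, the line I have looks roughly like this (the property name and
value are a paraphrase of my setting, not the exact line):

    export ZEPPELIN_JAVA_OPTS="-Dspark.executor.cores=4"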

Is there a place where I can set the number correctly? I am using Spark on YARN.


Thanks in advance for your pointers.


Regards,
Sambit.
