Hi All,

I am using Spark in YARN cluster mode.
When I run a YARN application, it creates multiple executors on the Hadoop
datanodes for processing.

Is it possible for me to create a local Spark context (master=local) on
these executors, so that a SparkContext is available inside them?

Theoretically, since each executor is a Java process, this should be
doable, shouldn't it?
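
To make the question concrete, here is a rough sketch of the pattern I
have in mind (the object name, app names, and dummy data are just
placeholders, and I haven't verified that this actually works):

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalContextOnExecutor {
      def main(args: Array[String]): Unit = {
        // Outer context, submitted in YARN cluster mode as usual.
        val sc = new SparkContext(new SparkConf().setAppName("outer-yarn-app"))

        val results = sc.parallelize(1 to 4, 4).mapPartitions { part =>
          // This closure runs inside an executor JVM on a datanode.
          // Can a second, local-mode context be created here?
          val localConf = new SparkConf()
            .setMaster("local[*]")
            .setAppName("inner-local-context")
          val localSc = new SparkContext(localConf)
          try {
            val n = localSc.parallelize(part.toSeq).count()
            Iterator(n)
          } finally {
            localSc.stop()
          }
        }.collect()

        println(results.mkString(","))
        sc.stop()
      }
    }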

Thanks,
Shashank
