What is your use case? I am sure there must be a better way to solve it.

On Wed, Mar 22, 2017 at 9:34 AM, Shashank Mandil <mandil.shash...@gmail.com>
wrote:

> Hi All,
>
> I am using Spark in YARN cluster mode.
> When I run a YARN application, it creates multiple executors on the Hadoop
> datanodes for processing.
>
> Is it possible for me to create a local SparkContext (master=local) on
> these executors?
>
> Theoretically, since each executor is a Java process, this should be
> doable, shouldn't it?
>
> Thanks,
> Shashank
>
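For reference, a minimal Scala sketch of what the quoted question proposes. This is hypothetical and generally unsupported: Spark expects a SparkContext to exist only on the driver, so a context created inside a task may fail or misbehave at runtime; the app name and partition counts below are illustrative only.

import org.apache.spark.{SparkConf, SparkContext}

// Driver side: the normal YARN-mode context.
val sc = new SparkContext(new SparkConf().setAppName("nested-context-sketch"))

// What the question proposes: each task starts its own local-mode
// context inside the executor JVM. This compiles, but SparkContext is
// intended to be driver-only, so expect runtime failures.
val counts = sc.parallelize(1 to 4, 4).mapPartitions { _ =>
  val localSc = new SparkContext(
    new SparkConf().setMaster("local[1]").setAppName("per-executor-local"))
  try {
    Iterator(localSc.parallelize(1 to 100).count())
  } finally {
    localSc.stop()  // always release the per-executor context
  }
}.collect()

If the goal is merely to run per-partition logic with some library on each executor, plain JVM code inside mapPartitions usually suffices and needs no nested context, which is likely why the use case matters here.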



-- 
Best Regards,
Ayan Guha
