You mentioned the resourcemanager but not the nodemanagers.

I think you need Spark available on the nodes running nodemanagers, since that is where the YARN containers (and hence the Spark executors) are launched.
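For what it's worth, that error usually means the YARN containers can't find the Spark assembly jar on their classpath. Besides installing Spark on every node, another common fix in Spark 1.x is to stage the assembly on HDFS and point spark.yarn.jar at it, so YARN ships it to the executors itself. A rough sketch with hypothetical paths and jar versions (adjust for your cluster):

```shell
# Hypothetical paths/versions -- adjust to match your install.
# Stage the Spark assembly on HDFS so every nodemanager container
# can load org.apache.spark.deploy.yarn.ExecutorLauncher:
hdfs dfs -mkdir -p /user/spark/share
hdfs dfs -put /opt/spark/lib/spark-assembly-1.5.1-hadoop2.6.0.jar \
    /user/spark/share/

# Tell spark-submit where the assembly lives; YARN localizes it
# into each container instead of expecting a local Spark install:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.jar=hdfs:///user/spark/share/spark-assembly-1.5.1-hadoop2.6.0.jar \
  --class com.example.MyApp \
  myapp.jar
```

Caching the jar on HDFS also avoids re-uploading it from the client on every submission.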

Cheers

On Fri, Nov 6, 2015 at 1:32 PM, Kayode Odeyemi <drey...@gmail.com> wrote:

> Hi,
>
> I have a YARN Hadoop setup of 8 nodes (7 datanodes, 1 namenode and
> resourcemanager). I have Spark set up only on the namenode/resourcemanager.
>
> Do I need to have Spark installed on the datanodes?
>
> I ask because I'm getting the error below when I run a Spark job through
> spark-submit:
>
> Error: Could not find or load main class 
> org.apache.spark.deploy.yarn.ExecutorLauncher
>
> I appreciate your help.
>
> Many thanks
>
>