Hi,

I have a YARN Hadoop setup of 8 nodes (7 datanodes, plus 1 node acting as
namenode and resource manager). I have Spark set up only on the
namenode/resource manager node.

Do I need to have Spark installed on the datanodes?

I ask because I'm getting the error below when I run a Spark job through
spark-submit:

Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorLauncher
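
For context, the job is submitted with a command of roughly this shape
(the main class and jar path are placeholders, not my actual application):

# master yarn, cluster deploy mode; MyApp and the jar path are placeholders
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  /path/to/myapp.jar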

I appreciate your help.

Many thanks
