>
> -- Original Message --
> *From:* "Yash Sharma";
> *Date:* Wednesday, June 22, 2016, 2:46 PM
> *To:* "另一片天" <958943...@qq.com>;
> *Cc:* "Saisai Shao"; "user";
> *Subject:* Re: Could not find or load main class
> org.apache.spark.deploy.yarn.ExecutorLauncher
>
> Try with: --master yarn-cluster
>
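For completeness, a full submission with that flag might look like the sketch below. This is illustrative only: the jar path is a placeholder (not the thread's actual path), and the resource sizes mirror the yarn-client example later in this thread.

```shell
# Sketch of a yarn-cluster submission; jar path and resource sizes are
# placeholders -- adjust to your cluster.
# In yarn-cluster mode the driver runs inside the YARN ApplicationMaster,
# which is why the ExecutorLauncher/ApplicationMaster classes must be
# reachable on the cluster side.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  --driver-memory 512m \
  --num-executors 2 \
  --executor-memory 512m \
  --executor-cores 2 \
  /path/to/spark-examples.jar   # placeholder jar path
```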
> On Wed, Jun 22, 2016 at 4:30 PM, 另一片天 <958943...@qq.com> wrote:
>> -- Original Message --
>> *From:* "Yash Sharma";
>> *Date:* Wednesday, June 22, 2016, 2:28 PM
>> *To:* "另一片天" <958943...@qq.com>;
>> *Cc:* "Saisai Shao"; "user";
>> *Subject:* Re: Could not find or load main class
>> org.apache.spark.deploy.yarn.ExecutorLauncher
>
>> Is it able to run in local mode?
>>
>> What do you mean? Standalone mode?
>>
>>
>> -- Original Message --
>> *From:* "Yash Sharma";
>> *Date:* Wednesday, June 22, 2016, 2:18 PM
>> *To:* "Saisai Shao";
>> *Cc:* "另一片天" <958943...@qq.com>; "user";
>> *Subject:* Re: Could not find or load main class
>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>
>> Try providing the jar with the hdfs prefix. It's probably just because
>> it's not able to find the jar.
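A sketch of that suggestion follows. The namenode host/port and the completed jar name are assumptions for illustration, not values taken from this thread:

```shell
# Reference the application jar by a fully qualified HDFS URI so the YARN
# containers can localize it; host, port, and jar name are placeholders.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-client \
  hdfs://namenode:8020/user/shihj/spark_lib/spark-examples.jar
```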
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> I get the error immediately.
>>
> -- Original Message --
> *From:* "Yash Sharma";
> *Date:* Wednesday, June 22, 2016, 2:04 PM
> *To:* "另一片天" <958943...@qq.com>;
> *Cc:* "user";
> *Subject:* Re: Could not find or load main class
> org.apache.spark.deploy.yarn.ExecutorLauncher
Make sure you built Spark with -Pyarn, and check whether you have the
class ExecutorLauncher in your Spark assembly jar.
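Those two checks could be run roughly like this. The Maven profile flags and the assembly-jar path are assumptions for a Spark 1.x source tree and may differ on your build:

```shell
# Rebuild Spark with the YARN profile enabled; the Hadoop version profile
# is a placeholder -- match it to your cluster.
./build/mvn -Pyarn -Phadoop-2.6 -DskipTests clean package

# Confirm that ExecutorLauncher made it into the assembly jar
# (assembly path is an assumption for a 1.x build).
jar tf assembly/target/scala-2.10/spark-assembly-*.jar \
  | grep org/apache/spark/deploy/yarn/ExecutorLauncher
```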
On Wed, Jun 22, 2016 at 2:04 PM, Yash Sharma wrote:
> How about supplying the jar directly in spark submit -
>
> ./bin/spark-submit \
> --class org.apache.spark.examples.SparkPi \
How about supplying the jar directly in spark submit -
./bin/spark-submit \
> --class org.apache.spark.examples.SparkPi \
> --master yarn-client \
> --driver-memory 512m \
> --num-executors 2 \
> --executor-memory 512m \
> --executor-cores 2 \
> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6
Thank you. That seems to resolve it.
On Fri, Nov 6, 2015 at 11:46 PM, Ted Yu wrote:
> You mentioned resourcemanager but not nodemanagers.
>
> I think you need to install Spark on nodes running nodemanagers.
>
> Cheers
>
> On Fri, Nov 6, 2015 at 1:32 PM, Kayode Odeyemi wrote:
>
>> Hi,
>>
>> I have a YARN hadoop setup of 8 nodes (7 datanodes, 1 namenode and
>> resourcemanager). I have Spark setup
You mentioned resourcemanager but not nodemanagers.
I think you need to install Spark on nodes running nodemanagers.
Cheers
On Fri, Nov 6, 2015 at 1:32 PM, Kayode Odeyemi wrote:
> Hi,
>
> I have a YARN hadoop setup of 8 nodes (7 datanodes, 1 namenode and
> resourcemanager). I have Spark setup
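Ted's point above amounts to making the same Spark installation available on every NodeManager host. A minimal sketch, where the hostnames and install path are purely illustrative:

```shell
# Copy the Spark distribution to each NodeManager host; the host list and
# install path are placeholders for this 8-node cluster.
for host in node1 node2 node3 node4 node5 node6 node7; do
  rsync -a /opt/spark/ "${host}:/opt/spark/"
done
```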