Please let me know how to proceed from the code and how to execute it via
spark-submit from a Windows machine.

Please provide me sample code if you have any.
-Naveen
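A minimal sketch of what a spark-submit invocation could look like from a
Windows command prompt, assuming the Spark distribution is unpacked locally
and HADOOP_CONF_DIR points at the YARN cluster configuration; the install
path, main class, and jar below are placeholders, not anything from this
thread:

    cd C:\spark-1.1.0-bin-hadoop2.4
    bin\spark-submit.cmd ^
      --class com.example.MySparkJob ^
      --master yarn-cluster ^
      --num-executors 4 ^
      --executor-memory 2g ^
      C:\jobs\my-spark-job.jar

The ^ character is the cmd.exe line continuation; on a single line it can be
dropped.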
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, November 26, 2014 10:03 PM
To: Naveen Kumar Pokala
Cc: user@spark.apache.org
Subject: Re: Spark Job submit
I think that actually would not work - yarn-cluster mode expects a specific
deployment path that uses SparkSubmit. Setting master as yarn-client should
work.
-Sandy
On Wed, Nov 26, 2014 at 8:32 AM, Akhil Das wrote:
> How about?
>
> - Create a SparkContext
> - setMaster as *yarn-cluster*
> - Create a JavaSparkContext with the above SparkContext
How about?
- Create a SparkContext
- setMaster as *yarn-cluster*
- Create a JavaSparkContext with the above SparkContext
And that will submit it to the yarn cluster.
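A minimal Java sketch of the steps above, assuming a Spark 1.x client machine
where HADOOP_CONF_DIR/YARN_CONF_DIR point at the cluster configuration; per
Sandy's note earlier in the thread the master is set to yarn-client rather
than yarn-cluster, the app name and the tiny job are placeholders, and the
JavaSparkContext is built straight from the SparkConf (equivalent to wrapping
a separately created SparkContext):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import java.util.Arrays;

    public class SubmitFromCode {
        public static void main(String[] args) {
            // yarn-client mode: the driver runs in this JVM and asks YARN for
            // executors, so no SparkSubmit deployment path is needed.
            // (HADOOP_CONF_DIR / YARN_CONF_DIR must be set -- assumption.)
            SparkConf conf = new SparkConf()
                    .setAppName("spark-job-submit-example")  // placeholder name
                    .setMaster("yarn-client");

            JavaSparkContext jsc = new JavaSparkContext(conf);

            // Placeholder job, just to show the context is usable.
            long count = jsc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
            System.out.println("count = " + count);

            jsc.stop();
        }
    }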
Thanks
Best Regards
On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala <npok...@spcapitaliq.com> wrote:
> Hi.
>
> Is ther