You just need to specify your master as local and run the main class that
creates the SparkContext object in Eclipse.
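
For example, a minimal PySpark script along these lines (the app name and
input path are just placeholders) can be launched directly from an Eclipse
run configuration, provided pyspark is on its PYTHONPATH:

from pyspark import SparkConf, SparkContext

if __name__ == "__main__":
    # Local master: everything runs on the machine Eclipse launches it on,
    # no cluster needed while developing.
    conf = SparkConf().setAppName("eclipse-local-test").setMaster("local[*]")
    sc = SparkContext(conf=conf)

    # Any small job will do; this one counts words in a local text file.
    counts = (sc.textFile("/tmp/input.txt")
                .flatMap(lambda line: line.split())
                .map(lambda word: (word, 1))
                .reduceByKey(lambda a, b: a + b))
    print(counts.take(10))

    sc.stop()

A rough sketch of the cluster variant Akhil describes is below the quoted
thread.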

On Mon, Apr 20, 2015 at 12:18 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Why not build the project and submit the built jar with spark-submit?
>
> If you want to run it within Eclipse, all you have to do is create a
> SparkContext pointing to your cluster, call
> sc.addJar("/path/to/your/project/jar"), and then hit the run button to run
> the job (note that the network shouldn't be a problem between your
> driver and the cluster).
>
> Thanks
> Best Regards
>
> On Mon, Apr 20, 2015 at 12:14 PM, sandeep vura <sandeepv...@gmail.com>
> wrote:
>
>> Hi Sparkers,
>>
>> I have written code in Python in Eclipse, and that code should now run on
>> a Spark cluster, much like MapReduce jobs run on a Hadoop cluster. Can
>> anyone please help me with instructions?
>>
>> Regards,
>> Sandeep.v
>>
>
>
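
For completeness, a rough Python analogue of the cluster approach Akhil
describes (the master URL and file path below are placeholders, not values
from this thread). His sc.addJar() suggestion applies to JVM projects; for a
Python script the closest equivalent is sc.addPyFile(), which ships your
dependent .py/.zip files to the executors:

from pyspark import SparkConf, SparkContext

# Point the driver (started from Eclipse) at the standalone cluster master.
conf = (SparkConf()
        .setAppName("eclipse-to-cluster")
        .setMaster("spark://master-host:7077"))
sc = SparkContext(conf=conf)

# Ship any modules your job imports so the executors can find them.
sc.addPyFile("/path/to/your/project/helpers.py")

# Trivial sanity check that work actually runs on the cluster.
print(sc.parallelize(range(1000)).sum())
sc.stop()

The same script can also be handed to the cluster without Eclipse at all,
e.g. spark-submit --master spark://master-host:7077 your_script.py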
