Why not build the project and submit the built jar with spark-submit?
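For example, a spark-submit invocation could look like this (the class
name, master URL, and jar path below are placeholders -- substitute your
own):

    spark-submit \
      --class com.example.YourSparkJob \
      --master spark://your-master-host:7077 \
      /path/to/your/project/jar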
If you want to run it from within Eclipse, all you have to do is create a
SparkContext pointing to your cluster, call
sc.addJar("/path/to/your/project/jar"), and then you can hit the run button
to run the job (note that the network shouldn't be a problem between your
driver and the cluster).
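Here's a minimal sketch of what that could look like in Scala (the app
name, master URL, and example job are assumptions -- adjust them for your
setup):

    import org.apache.spark.{SparkConf, SparkContext}

    object RunFromEclipse {
      def main(args: Array[String]): Unit = {
        // Point the driver at your cluster instead of local mode.
        val conf = new SparkConf()
          .setAppName("EclipseDriverExample")         // hypothetical app name
          .setMaster("spark://your-master-host:7077") // placeholder master URL
        val sc = new SparkContext(conf)

        // Ship your project's jar to the executors so your classes resolve.
        sc.addJar("/path/to/your/project/jar")

        // A trivial job just to verify the setup end to end.
        println(sc.parallelize(1 to 100).map(_ * 2).count())

        sc.stop()
      }
    }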
Thanks
Best Regards
On Mon, Apr 20, 2015 at 12:14 PM, sandeep vura <[email protected]>
wrote:
> Hi Sparkers,
>
> I have written some code in Python in Eclipse; now that code should run on
> a Spark cluster, just like MapReduce jobs run on a Hadoop cluster. Can
> anyone please help me with instructions?
>
> Regards,
> Sandeep.v
>