As far as I know, in order to deploy and execute jobs on EC2 you need to
assemble your project, copy the jar onto the cluster, log in over ssh,
and submit the job.
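For reference, the manual workflow looks roughly like this. This is only a sketch: the hostname, key path, jar name, and main class are placeholders, and it assumes the sbt-assembly plugin and a standalone Spark master on the EC2 cluster:

```shell
# Build a fat jar locally (assumes the sbt-assembly plugin is configured)
sbt assembly

# Copy the jar to the cluster master (placeholder host and key)
scp -i ~/.ssh/my-key.pem target/scala-2.10/my-job-assembly.jar \
    ec2-user@ec2-master.example.com:~/

# Log in and submit the job with spark-submit (placeholder class and master URL)
ssh -i ~/.ssh/my-key.pem ec2-user@ec2-master.example.com \
    "spark-submit --class com.example.MyJob \
        --master spark://ec2-master.example.com:7077 \
        my-job-assembly.jar"
```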

To avoid these manual steps I've been prototyping an sbt plugin(1) that
lets you create and submit Spark jobs to an Amazon EC2 cluster directly
from your local machine using sbt.

It's a simple plugin that actually relies on spark-ec2 and spark-submit,
but I'd like to get feedback and see whether this plugin makes sense
before going ahead with the final implementation, or whether there is an
easier way to do this.

(1) https://github.com/felixgborrego/sbt-spark-ec2-plugin

Thanks,
