Totally agree. There is also a class, SparkSubmit, that you can call
directly to replace the shell script.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Using-Spark-as-web-app-backend-tp8163p8248.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
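A minimal sketch of what calling SparkSubmit directly could look like. Note that `org.apache.spark.deploy.SparkSubmit` is an internal class, so its argument handling may change between releases; the main class and jar path below are made-up placeholders.

```scala
// Sketch: invoking spark-submit programmatically rather than through
// the bin/spark-submit shell script. Requires Spark on the classpath.
import org.apache.spark.deploy.SparkSubmit

object ProgrammaticSubmit {
  def main(args: Array[String]): Unit = {
    // Same arguments the shell script would forward.
    SparkSubmit.main(Array(
      "--class", "com.example.MyJob",   // assumed main class
      "--master", "local[*]",
      "/path/to/my-job.jar"             // assumed application jar
    ))
  }
}
```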
Yeah I agree with Koert, it would be the lightest solution. I have
used it quite successfully and it just works.
There aren't many Spark specifics here; you can follow this example
https://github.com/jacobus/s4 on how to build your Spray service.
Then the easy solution would be to have a SparkContext
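The approach suggested above (a Spray service holding a long-lived SparkContext) might be sketched as follows. The route, port, and sample computation are illustrative assumptions, not part of the original discussion.

```scala
// Sketch: one shared SparkContext behind a spray-routing HTTP service.
// Assumes spark-core and spray-routing/spray-can on the classpath.
import akka.actor.ActorSystem
import spray.routing.SimpleRoutingApp
import org.apache.spark.{SparkConf, SparkContext}

object SparkWebService extends App with SimpleRoutingApp {
  implicit val system = ActorSystem("spark-web-service")

  // Create the SparkContext once, at startup; it is reused by every
  // HTTP request rather than recreated per call.
  val sc = new SparkContext(
    new SparkConf().setAppName("web-backend").setMaster("local[*]"))

  startServer(interface = "localhost", port = 8080) {
    get {
      path("count") {          // hypothetical endpoint
        complete {
          // Trivial job standing in for real application logic.
          sc.parallelize(1 to 1000).count().toString
        }
      }
    }
  }
}
```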
Hi all,
Thank you for the reply. Is there any example of Spark running in client
mode with Spray? I think I will choose this approach.
On Tue, Jun 24, 2014 at 4:55 PM, Koert Kuipers wrote:
> run your Spark app in client mode together with a Spray REST service
> that the front end can talk to
run your Spark app in client mode together with a Spray REST service
that the front end can talk to
On Tue, Jun 24, 2014 at 3:12 AM, Jaonary Rabarisoa
wrote:
> Hi all,
>
> So far, I run my spark jobs with spark-shell or spark-submit command. I'd
> like to go further and I wonder how to use spa
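Launching in client mode, as suggested above, keeps the driver (and any embedded REST service) on the submitting machine. A hypothetical invocation (the class name, master URL, and jar path are placeholders):

```shell
# Run the driver locally in client mode so the front end can reach
# the embedded REST service directly on this host.
spark-submit \
  --class com.example.SparkWebService \
  --master spark://master:7077 \
  --deploy-mode client \
  /path/to/spark-web-service.jar
```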
Hi,
You could use sock.js / websockets on the front end, so you can notify the
user when the job is finished. You can regularly poll the URL of the job to
check its status from your node.js app - at the moment I do not know of an
out-of-the-box solution.
Nicer would be if your job sends a message v
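The polling idea above could be sketched like this. The status endpoint and the "FINISHED" marker are assumptions (Spark 1.0 has no stable status API), so the fetch step is injected as a function, which also makes the loop easy to test without a network.

```scala
// Sketch of polling a job-status URL until the job reports completion.
object StatusPoller {
  // `fetch` returns the current status string, e.g. by reading a
  // status URL: () => scala.io.Source.fromURL(statusUrl).mkString.trim
  def pollUntilDone(fetch: () => String,
                    maxAttempts: Int = 10,
                    delayMs: Long = 1000L): Boolean = {
    var attempt = 0
    while (attempt < maxAttempts) {
      if (fetch() == "FINISHED") return true  // assumed status marker
      attempt += 1
      if (attempt < maxAttempts) Thread.sleep(delayMs)
    }
    false  // gave up before the job finished
  }
}
```

From a node.js front end the same loop would be a `setInterval` over an HTTP GET; the Scala version above is the service-side equivalent.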