Hi,
I tried the Spark(1.0.0)+Play(2.3.3) example from the Knoldus blog -
http://blog.knoldus.com/2014/06/18/play-with-spark-building-apache-spark-with-play-framework/
and
it worked for me. The project is here -
https://github.com/knoldus/Play-Spark-Scala
Regards,
Manu
On Sat, Aug 16, 2014 at 11
Hi
I am trying to connect to Spark from the Play framework and am getting the
following Akka error...
[ERROR] [08/16/2014 17:12:05.249]
[spark-akka.actor.default-dispatcher-3] [ActorSystem(spark)] Uncaught
fatal error from thread [spark-akka.actor.default-dispatcher-3]
shutting down ActorSystem [spark]
jav
http://spark.apache.org/docs/latest/running-on-yarn.html
Spark is just a YARN application.
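For reference, this is roughly what submitting Spark as a YARN application looks like (a minimal sketch; the class name, jar, and executor count are assumptions, not from this thread):

```shell
# Submit a Spark job to YARN in cluster mode (Spark 1.x style).
# YARN then allocates containers for the driver and executors.
spark-submit \
  --master yarn-cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --class com.example.MyApp \
  myapp.jar
```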
> On Aug 14, 2014, at 11:12, 牛兆捷 wrote:
>
> Dear all:
>
> Can Spark acquire resources from, and give resources back to,
> YARN dynamically?
>
>
> --
> *Regards,*
> *Zhaojie*
---
Hi Xiangrui,
I actually tried branch-1.1 and master and it resulted in the job being
stuck at the TaskSetManager:
14/08/16 06:55:48 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0
with 2 tasks
14/08/16 06:55:48 INFO scheduler.TaskSetManager: Starting task 1.0:0 as TID
2 on executor 8: ip-10-2
Hi Stephen,
Have you tried the --jars option (with jars separated by commas)? It
should make the given jars available both to the driver and the executors.
I believe one current caveat is that if you pass it a directory, it won't
pick up the jars inside; each jar has to be listed explicitly.
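To illustrate, a hedged sketch of the --jars usage described above (the master URL, class, and jar paths are placeholders, not from this thread):

```shell
# List each dependency jar explicitly, separated by commas.
# spark-submit ships them to both the driver and the executors.
spark-submit \
  --class com.example.MyApp \
  --master spark://master:7077 \
  --jars /opt/libs/dep1.jar,/opt/libs/dep2.jar \
  myapp.jar
```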
-Sandy
On Fri, Aug 15, 2014 at 4:07