To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
In our case, Play libraries are not required to run Spark jobs. Hence they are
available only on the master, and Play runs as a regular Scala application. I can't ...
To: Mohammed Guller; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
The remaining dependencies (Spark libraries) are available for the context from
the sparkhome. I have installed Spark such that all the slaves have the same
sparkhome. The code looks like this:
val conf = new ...
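A minimal sketch of the kind of configuration being described, assuming the app
sets the master URL, spark home, and its own jar explicitly (all values below
are placeholders, not taken from the original message):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder values for illustration only.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077")      // standalone master URL
      .setAppName("play-spark-app")
      .setSparkHome("/opt/spark")                 // same sparkhome on every slave
      .setJars(Seq("target/scala-2.10/play-spark-app_2.10-1.0.jar")) // the app jar only

    val sc = new SparkContext(conf)

With the Spark libraries coming from sparkhome on each worker, only the
application jar itself has to be shipped with the job.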
Sent: Thursday, October 16, 2014 4:00 PM
To: US Office Admin; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: RE: Play framework
Thanks, Suren and Raju.
Raju – if I remember correctly, the Play package command just creates a jar for
your app. That jar file will not include the dependencies.
To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
Hi,
Below is the link for a simple Play + SparkSQL example -
http://blog.knoldus.com/2014/07/14/play-with-spark-building-apache-spark-with-play-framework-part-3/
https://github.com/knoldus/Play-Spark-Scala
Manu
On Thu, Oct 16, 2014 at 1:00 PM, Mohammed Guller wrote:
> Thanks, Suren and Raju.
Subject: Re: Play framework
We integrated Spark into Play and use SparkSQL extensively on an EC2 Spark
cluster on Hadoop HDFS 1.2.1 and Tachyon 0.4.
Step 1: Create a Play Scala application as usual.
Step 2: In build.sbt, put all your Spark dependencies. What works for us is Play
2.2.3, Scala 2.10.4, Spark 1.1.
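A build.sbt for that combination might look roughly like the sketch below; the
dependency list is an assumption, since the original message does not show it:

    name := "play-spark-app"          // placeholder project name

    version := "1.0-SNAPSHOT"

    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.1.0",
      "org.apache.spark" %% "spark-sql"  % "1.1.0"
    )

    play.Project.playScalaSettings    // Play 2.2.x project settings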
From: Surendranauth Hiraman
Sent: Thursday, October 16, 2014 12:42 PM
To: Mohammed Guller
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
Mohammed,
Jumping in for Daniel, we actually address the configuration issue by pulling
values from environment variables.
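One way that pattern can look in Scala (the environment variable names here are
made up for illustration; the message does not list the actual ones):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical environment variables driving the Spark configuration.
    val master  = sys.env.getOrElse("SPARK_MASTER_URL", "local[*]")
    val appName = sys.env.getOrElse("SPARK_APP_NAME", "play-app")

    val conf = new SparkConf().setMaster(master).setAppName(appName)
    sys.env.get("SPARK_HOME").foreach(h => conf.setSparkHome(h))

    val sc = new SparkContext(conf)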
> ... cannot get rid of Akka-related exceptions. I suspect that
> the settings in the build.scala file for my Play project are incorrect.
>
> Mohammed
>
> *From:* Daniel Siegmann [mailto:daniel.siegm...@velos.io]
> *Sent:* Thursday, October 16, 2014 7:15 AM
> *To:* Mohammed Guller
> *Cc:* user@spark.apache.org
> *Subject:* Re: Play framework
Cc: user@spark.apache.org
Subject: Re: Play framework
We execute Spark jobs from a Play application but we don't use
spark-submit. I don't know if you really want to use spark-submit, but if
not you can just create a SparkContext programmatically in your app.
In development I typically run Spark locally. Creating the Spark context is
pretty trivial:
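A minimal local-mode version of that idea (the app name is a placeholder) would
be along these lines:

    import org.apache.spark.{SparkConf, SparkContext}

    // "local[*]" runs Spark in-process using all available cores.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("play-dev")   // placeholder app name

    val sc = new SparkContext(conf)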
Hi -
Has anybody figured out how to integrate a Play application with Spark and run
it on a Spark cluster using the spark-submit script? I have seen some blogs about
creating a simple Play app and running it locally on a dev machine with the sbt
run command. However, those steps don't work for spark-submit.
Hi,
I tried the Spark(1.0.0)+Play(2.3.3) example from the Knoldus blog -
http://blog.knoldus.com/2014/06/18/play-with-spark-building-apache-spark-with-play-framework/
and
it worked for me. The project is here -
https://github.com/knoldus/Play-Spark-Scala
Regards,
Manu
On Sat, Aug 16, 2014 at 11:04 PM, Sujee Maniyam wrote:
Hi
I am trying to connect to Spark from the Play framework and am getting the
following Akka error...
[ERROR] [08/16/2014 17:12:05.249]
[spark-akka.actor.default-dispatcher-3] [ActorSystem(spark)] Uncaught
fatal error from thread [spark-akka.actor.default-dispatcher-3]
shutting down ActorSystem [spark]