To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
In our case, Play libraries are not required to run Spark jobs. Hence they are
available only on the master, and Play runs as a regular Scala application. I can't
To: Mohammed Guller; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
The remaining dependencies (Spark libraries) are available for the context from
the sparkhome. I have installed Spark such that all the slaves have the same
sparkhome. Code looks like this.
val conf = new
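A minimal sketch of a SparkConf/SparkContext set up along those lines, assuming
the Spark 1.x API (the app name, master URL, spark home path, and jar name are
placeholders, not the values from the original setup):

import org.apache.spark.{SparkConf, SparkContext}

// Placeholders: point these at the cluster master, the sparkhome installed at
// the same path on every slave, and the application jar.
val conf = new SparkConf()
  .setAppName("play-spark-app")
  .setMaster("spark://master-host:7077")
  .setSparkHome("/opt/spark")
  .setJars(Seq("target/scala-2.10/play-spark-app_2.10-1.0.jar"))

val sc = new SparkContext(conf)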
Sent: Thursday, October 16, 2014 4:00 PM
To: US Office Admin; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: RE: Play framework
Thanks, Suren and Raju.
Raju – if I remember correctly, the Play package command just creates a jar for
your app. That jar file will not include the dependencies.
To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
Hi,
Below is the link for a simple Play + SparkSQL example -
http://blog.knoldus.com/2014/07/14/play-with-spark-building-apache-spark-with-play-framework-part-3
> *To:* Surendranauth Hiraman; Mohammed Guller
> *Cc:* Daniel Siegmann; user@spark.apache.org
> *Subject:* Re: Play framework
>
> We integrated Spark into Play and use SparkSQL extensively on an ec2
> spark cluster on Hadoop hdfs 1.2.1 and tachyon 0.4.
>
>
>
> Step 1: Create a play scala app
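For reference, a rough sketch of what the build definition for such a Play +
Spark app could look like, assuming a Play 2.2-style project/Build.scala (the
app name and the Spark version here are assumptions, not the ones used on that
cluster):

import sbt._
import Keys._
import play.Project._

object ApplicationBuild extends Build {

  val appName    = "play-spark-app"   // placeholder
  val appVersion = "1.0-SNAPSHOT"

  // Spark added as ordinary library dependencies; versions are assumptions.
  val appDependencies = Seq(
    "org.apache.spark" %% "spark-core" % "1.1.0",
    "org.apache.spark" %% "spark-sql"  % "1.1.0"
  )

  val main = play.Project(appName, appVersion, appDependencies)
}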
From: Surendranauth Hiraman [mailto:suren.hira...@velos.io]
Sent: Thursday, October 16, 2014 12:42 PM
To: Mohammed Guller
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
Mohammed,
Jumping in for Daniel, we actually address the configuration issue by pulling
values from environment variables.
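As an illustration of that approach, a small sketch in Scala, assuming
hypothetical variable names such as SPARK_MASTER and SPARK_EXECUTOR_MEMORY
(not necessarily the ones actually used):

import org.apache.spark.SparkConf

// Read cluster settings from the environment, with local-mode defaults.
val master  = sys.env.getOrElse("SPARK_MASTER", "local[*]")
val execMem = sys.env.getOrElse("SPARK_EXECUTOR_MEMORY", "1g")

val conf = new SparkConf()
  .setMaster(master)
  .setAppName("play-spark-app")            // placeholder
  .set("spark.executor.memory", execMem)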
> I still cannot get rid of Akka related exceptions. I suspect that
> the settings in the build.scala file for my play project are incorrect.
>
> Mohammed
>
> *From:* Daniel Siegmann [mailto:daniel.siegm...@velos.io]
> *Sent:* Thursday, October 16, 2014 7:15 AM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: Play framework
We execute Spark jobs from a Play application but we don't use
spark-submit. I don't know if you really want to use spark-submit, but if
not you can just create a SparkContext programmatically in your app.
In development I typically run Spark locally. Creating the Spark context is
pretty trivial:
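A minimal sketch of such a local-mode context, assuming the Spark 1.x API (the
app name is a placeholder):

import org.apache.spark.{SparkConf, SparkContext}

// Run Spark inside the application's own JVM, using all local cores.
val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("play-spark-dev")   // placeholder

val sc = new SparkContext(conf)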