RE: Play framework

2014-10-16 Thread Mohammed Guller
In our case, Play libraries are not required to run Spark jobs. Hence they are available only on the master, and Play runs as a regular Scala application. I can't ...

Re: Play framework

2014-10-16 Thread Ramaraju Indukuri
The remaining dependencies (Spark libraries) are available for the context from the sparkhome. I have installed Spark such that all the slaves have the same sparkhome ...

RE: Play framework

2014-10-16 Thread Mohammed Guller
The remaining dependencies (Spark libraries) are available for the context from the sparkhome. I have installed Spark such that all the slaves have the same sparkhome. The code looks like this: val conf = new ...
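
The code in this message is cut off in the archive. What follows is only a sketch of how such a configuration might look, assuming the jar produced by the Play build is shipped to the executors via setJars and the shared install is referenced with setSparkHome; the master URL, paths, and app name are placeholders, not details from the thread.

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder values; the real URL, paths, and app name are not in the archived message.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077")                    // standalone cluster URL
      .setAppName("play-spark-app")
      .setSparkHome("/opt/spark")                               // same sparkhome on every slave
      .setJars(Seq("target/scala-2.10/play-app_2.10-1.0.jar"))  // jar built from the Play project

    val sc = new SparkContext(conf)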

Re: Play framework

2014-10-16 Thread US Office Admin
Thanks, Suren and Raju. Raju, if I remember correctly, the Play package command just creates a jar for your app. That jar file will not include ...

RE: Play framework

2014-10-16 Thread Mohammed Guller
Hi, below is the link for a simple Play + SparkSQL example: http://blog.knoldus.com/2014/07/14/play-with-spark-building-apache-spark-with-play-framework-part-3

Re: Play framework

2014-10-16 Thread Manu Suryavansh
We integrated Spark into Play and use SparkSQL extensively on an EC2 Spark cluster on Hadoop HDFS 1.2.1 and Tachyon 0.4. Step 1: Create a Play Scala app ...
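
The steps themselves are truncated in the archive. As a rough illustration only, a Play 2.x Scala controller that holds a SparkContext and runs a SparkSQL query might look like the sketch below; the controller name, file path, and query are assumptions, not details from this thread.

    package controllers

    import play.api.mvc.{Action, Controller}
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object SparkDemoController extends Controller {

      // One SparkContext per JVM, created lazily on first use.
      lazy val sc = new SparkContext(
        new SparkConf().setMaster("local[*]").setAppName("play-sparksql-demo"))
      lazy val sqlContext = new SQLContext(sc)

      def rowCount = Action {
        // Hypothetical Parquet file; replace with a real dataset (e.g. on HDFS or Tachyon).
        val count = sqlContext.parquetFile("/tmp/events.parquet").count()
        Ok(s"Row count: $count")
      }
    }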

RE: Play framework

2014-10-16 Thread Mohammed Guller
Mohammed, Jumping in for Daniel, ...

Re: Play framework

2014-10-16 Thread US Office Admin
Mohammed, Jumping in for Daniel, we actually address the configuration issue by pulling values from environment variables ...
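
The rest of the message is truncated, but the approach described (driving the Spark configuration from environment variables) can be sketched roughly as follows; the variable names here are made up for illustration.

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical environment variable names; use whatever your deployment defines.
    val master  = sys.env.getOrElse("SPARK_MASTER_URL", "local[*]")
    val appName = sys.env.getOrElse("SPARK_APP_NAME", "play-app")

    val conf = new SparkConf()
      .setMaster(master)
      .setAppName(appName)

    val sc = new SparkContext(conf)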

Re: Play framework

2014-10-16 Thread Surendranauth Hiraman
... I suspect that the settings in the build.scala file for my Play project are incorrect. Mohammed

Re: Play framework

2014-10-16 Thread Debasish Das
... cannot get rid of Akka-related exceptions. I suspect that the settings in the build.scala file for my Play project are incorrect. Mohammed
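
The thread as archived does not show how (or whether) the Akka conflict was resolved. For context, Play and Spark each pulled in their own Akka at the time, and one workaround people experimented with was excluding Spark's forked Akka artifacts in the build. The sbt fragment below is only a sketch of that idea, with illustrative versions; it is not the settings from this thread, and the same dependencies can equally be declared in a project/Build.scala.

    // build.sbt sketch (versions are illustrative, circa late 2014)
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      ("org.apache.spark" %% "spark-core" % "1.1.0")
        .excludeAll(ExclusionRule(organization = "org.spark-project.akka")),
      "org.apache.spark" %% "spark-sql" % "1.1.0"
    )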

RE: Play framework

2014-10-16 Thread Mohammed Guller
We execute Spark jobs from a Play application but we don't use spark-submit. I don't know if you really want to use spark-submit, but if not, you can just create a SparkContext programmatically in your app. In development I typically run Spark locally ...

Re: Play framework

2014-10-16 Thread Daniel Siegmann
We execute Spark jobs from a Play application but we don't use spark-submit. I don't know if you really want to use spark-submit, but if not you can just create a SparkContext programmatically in your app. In development I typically run Spark locally. Creating the Spark context is pretty trivial:
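
The snippet cuts off right where the code begins. A minimal sketch of creating the context programmatically, running Spark locally for development as described (the app name is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    // Local mode for development; switch setMaster to a cluster URL for a real deployment.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("play-spark-app")   // placeholder name

    val sc = new SparkContext(conf)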