Hey AJ,
I haven't tried to run it on a cluster yet, only in local mode.
I'll try to get something running on a cluster soon and keep you posted.
Nicolas Garneau
> On May 4, 2014, at 6:23 PM, Ajay Nair wrote:
Now I got it to work .. well almost. However I needed to copy the project/
folder to the spark-standalone folder as the package build was failing
because it could not find build properties. After the copy the build was
successful. However when I run it I get errors but it still gives me the
output.
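For reference, sbt looks for a project/ folder next to the build file, which is likely why copying it fixed the build. A sketch of the expected layout (file names other than simple.sbt and project/build.properties are assumptions):

```text
spark-standalone/
├── simple.sbt
├── project/
│   └── build.properties     # pins the sbt version, e.g. sbt.version=0.13.x
└── src/
    └── main/
        └── scala/
            └── SimpleApp.scala
```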
Hey AJ,
If you plan to launch your job on a cluster, consider using the spark-submit
command.
Running this from Spark's home directory prints help on how to use it:
$ ./bin/spark-submit
I haven't tried it yet but considering this post, it will be the preferred way
to launch jobs:
http
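To sketch what a spark-submit invocation typically looks like (the class name, master URL, and jar path below are assumptions; adjust them to your own build):

```shell
# Minimal sketch of submitting a packaged app to a standalone cluster.
./bin/spark-submit \
  --class SimpleApp \
  --master spark://<master-host>:7077 \
  target/scala-2.10/simple-project_2.10-1.0.jar
```

Running `./bin/spark-submit` with no arguments prints the full list of options.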
Thank you. I am trying this now
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459p6472.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
Hey AJ,
As I can see, your path when running sbt is:
> $root/spark/spark-examples: sbt/sbt package
You should be within the app's folder that contains the simple.sbt, which is
spark-standalone/:
> $root/spark/spark-examples/spark-standalone: sbt/sbt package
> $root/spark/spark-examples/spark-st
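Putting the two steps above together, the fix is just to change directory before packaging (paths taken from the thread; the sbt/sbt launcher location is an assumption):

```shell
# Sketch: run the package task from inside the app folder, not the parent.
cd /root/spark/spark-examples/spark-standalone
sbt/sbt package    # or a system-wide `sbt package` if sbt is installed
```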
Quick question: where should I place your folder? Inside the Spark directory?
My Spark directory is /root/spark.
So currently I tried pulling your github code in /root/spark/spark-examples
and modified my home spark directory in the scala code.
I copied the sbt folder within the spark-examples fo
Thank you. Let me try this quickly !
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459p6463.html
Sorry, the link went wrong. I meant here:
https://github.com/ngarneau/spark-standalone
On 2014-05-03, at 13:23, Nicolas Garneau wrote:
> Hey AJ,
>
> I created a little sample app using Spark's quick start.
> Have a look here.
> Assuming you used scala, using sbt is good for running your ap
Thank you for the reply. Have you posted a link from which I can follow the
steps?
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459p6462.html
Hey AJ,
I created a little sample app using Spark's quick start.
Have a look here.
Assuming you used Scala, sbt is a good way to run your application in
standalone mode.
The configuration file which is "simple.sbt" in my repo, holds all the
dependencies needed to build your app.
Hope t
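A build file along these lines is what the quick start of that era used (versions and the Akka resolver are assumptions; check them against your Spark release):

```scala
// simple.sbt — minimal sketch of an sbt build for a standalone Spark app
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```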
Hi AJ,
You might find this helpful -
http://blog.cloudera.com/blog/2014/04/how-to-run-a-simple-apache-spark-app-in-cdh-5/
-Sandy
On Sat, May 3, 2014 at 8:42 AM, Ajay Nair wrote:
> Hi,
>
> I have written a code that works just about fine in the spark shell on EC2.
> The ec2 script helped me co